Compare commits

...

43 Commits

Author SHA1 Message Date
Paul Popus
ca626288ab empty commit for new hash 2025-07-30 11:34:40 +01:00
Paul Popus
f87db1cf59 add r2 and d1 to publishList in script 2025-07-28 20:26:48 +01:00
Sasha
fb196069d5 delete :memory 2025-07-28 21:56:21 +03:00
Sasha
759bb9899e change version 2025-07-28 20:34:11 +03:00
Sasha
72fbcc3567 Merge branch 'main' of github.com:payloadcms/payload into feat/add-d1-adapter 2025-07-28 20:16:38 +03:00
Sasha
92010510b0 storage r2 2025-07-28 19:49:17 +03:00
Alessio Gravili
5c94d2dc71 feat: support next.js 15.4.4 (#13280)
- bumps next.js from 15.3.2 to 15.4.4 in monorepo and templates. It's
important to run our tests against the latest Next.js version to
guarantee full compatibility.
- bumps playwright because of peer dependency conflict with next 15.4.4
- bumps react types because why not

https://nextjs.org/blog/next-15-4

As part of this upgrade, the functionality added by
https://github.com/payloadcms/payload/pull/11658 broke. This PR fixes it
by creating a wrapper around `React.isValidElement` that works for
Next.js 15.4.

---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
  - https://app.asana.com/0/0/1210803039809808
2025-07-28 16:23:43 +00:00
Jarrod Flesch
b1aac19668 chore(next): cleanup unused code (#13292)
Looks like a merge resolution kept unused code. The same condition is
added a couple lines below this removal.
2025-07-28 13:43:51 +00:00
Sean Zubrickas
d093bb1f00 fix: refactors toast error rendering (#13252)
Fixes #13191

- Render a single html element for single error messages
- Preserve ul structure for multiple errors
- Updates tests to check for both cases
2025-07-28 05:59:25 -07:00
Alessio Gravili
2e9ba10fb5 docs: remove obsolete scheduler property (#13278)
That property does not exist and was used in a previous, outdated
implementation of auto scheduling
2025-07-25 16:25:47 -07:00
Alessio Gravili
8518141a5e fix(drizzle): respect join.type config (#13258)
Respects `join.type` instead of hardcoding `leftJoin`
2025-07-25 15:46:20 -07:00
Alessio Gravili
6d6c9ebc56 perf(drizzle): 2x faster db.deleteMany (#13255)
Previously, `db.deleteMany` on postgres resulted in 2 roundtrips to the
database (find + delete with ids). This PR passes the where query
directly to the `deleteWhere` function, resulting in only one roundtrip
to the database (delete with where).

If the where query queries other tables (=> joins required), this falls
back to find + delete with ids. However, this is also more optimized
than before, as we now pass `select: { id: true }` to the findMany
query.
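
A rough sketch of the two code paths, using hypothetical helper signatures rather than the actual drizzle adapter internals:

```ts
// Sketch only: `deleteWhere`, `findMany`, and `whereNeedsJoins` are assumed helpers.
type Where = Record<string, unknown>

async function deleteManySketch(args: {
  deleteWhere: (where: Where) => Promise<void>
  findMany: (opts: { select: { id: true }; where: Where }) => Promise<{ id: number | string }[]>
  where: Where
  whereNeedsJoins: (where: Where) => boolean
}): Promise<void> {
  const { deleteWhere, findMany, where, whereNeedsJoins } = args

  if (!whereNeedsJoins(where)) {
    // Fast path: one roundtrip, the where clause is pushed down into the DELETE itself
    await deleteWhere(where)
    return
  }

  // Fallback when the where clause touches other tables: resolve matching ids first,
  // selecting only `id` to keep the find cheap, then delete by id
  const docs = await findMany({ select: { id: true }, where })
  await deleteWhere({ id: { in: docs.map((doc) => doc.id) } })
}
```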

---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
  - https://app.asana.com/0/0/1210871676349299
2025-07-25 15:46:09 -07:00
German Jablonski
7cd4a8a602 fix(richtext-lexical): unify indent between different converters and make paragraphs and lists match without CSS (#13274)
Previously, the Lexical editor was using px, and the JSX converter was
using rem. #12848 fixed the inconsistency by changing the editor to rem,
but it should have been the other way around, changing the JSX converter
to px.

You can see the latest explanation about why it should be 40px
[here](https://github.com/payloadcms/payload/issues/13130#issuecomment-3058348085).
In short, that's the default indentation all browsers use for lists.

This time I'm making sure to leave clear comments everywhere and a test
to avoid another regression.

Here is an image of what the e2e test looks like:

<img width="321" height="678" alt="image"
src="https://github.com/user-attachments/assets/8880c7cb-a954-4487-8377-aee17c06754c"
/>

The first part is the Lexical editor, the second is the JSX converter.

As you can see, the checkbox in JSX looks a little odd because it uses
an input checkbox (as opposed to a pseudo-element in the Lexical
editor). I thought about adding an inline style to move it slightly to
the left, but I found that browsers don't have a standard size for the
checkbox; it varies by browser and device.
That requires a little more thought; I'll address that in a future PR.

Fixes #13130
2025-07-25 22:58:49 +01:00
Jarrod Flesch
bc802846c5 fix: serve svg+xml as svg (#13277)
Based on https://github.com/payloadcms/payload/pull/13276

Fixes https://github.com/payloadcms/payload/issues/7624

If an uploaded image has the `.svg` extension and its mimeType is read as
`application/xml`, adjust the mimeType to `image/svg+xml`.
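
A minimal sketch of that check (an illustrative helper, not the actual upload-handling code):

```ts
// Hypothetical helper mirroring the behavior described above
function resolveSvgMimeType(filename: string, detectedMimeType: string): string {
  // Some SVG files are sniffed as generic XML; serve them as SVG instead
  if (filename.toLowerCase().endsWith('.svg') && detectedMimeType === 'application/xml') {
    return 'image/svg+xml'
  }
  return detectedMimeType
}
```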

---------

Co-authored-by: Philipp Schneider <47689073+philipp-tailor@users.noreply.github.com>
2025-07-25 21:00:51 +00:00
Jarrod Flesch
e8f6cb5ed1 fix: svg+xml file detection (#13276)
Adds logic for svg+xml file type detection.

---------

Co-authored-by: Philipp Schneider <47689073+philipp-tailor@users.noreply.github.com>
2025-07-25 18:33:53 +00:00
Elliot DeNolf
23bd67515c templates: bump for v3.49.0 (#13273)
🤖 Automated bump of templates for v3.49.0

Triggered by user: @denolfe

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2025-07-25 13:39:09 -04:00
Jarrod Flesch
e29d1d98d4 fix(plugin-multi-tenant): prefer assigned tenants for selector population (#13213)
When populating the selector it should populate it with assigned tenants
before fetching all tenants that a user has access to.

You may have "public" tenants and while a user may have _access_ to the
tenant, the selector should show the ones they are assigned to. Users
with full access are the ones that should be able to see the public ones
for editing.
2025-07-25 10:10:26 -04:00
Elliot DeNolf
4ac428d250 chore(release): v3.49.0 [skip ci] 2025-07-25 09:27:41 -04:00
Sasha
75385de01f fix: filtering by polymorphic relationships inside other fields (#13265)
Previously, filtering by a polymorphic relationship nested inside an array,
group (unless the group's `name` is `version`), or tab field caused
`QueryError: The following path cannot be queried:`.
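
For example, a query along these lines (collection and field names are hypothetical) previously threw that error and now works:

```ts
// `sections` is an array field containing a polymorphic relationship field named `relatedTo`
const result = await payload.find({
  collection: 'posts',
  where: {
    // query the nested polymorphic relationship by its `value`
    'sections.relatedTo.value': { equals: 'some-page-id' },
  },
})
```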
2025-07-25 09:10:21 -04:00
Patrik
f63dc2a10c feat: adds trash support (soft deletes) (#12656)
### What?

This PR introduces complete trash (soft-delete) support. When a
collection is configured with `trash: true`, documents can now be
soft-deleted and restored via both the API and the admin panel.

```
import type { CollectionConfig } from 'payload'

const Posts: CollectionConfig = {
  slug: 'posts',
  trash: true, // <-- New collection config prop @default false
  fields: [
    {
      name: 'title',
      type: 'text',
    },
    // other fields...
  ],
}
```

### Why

Soft deletes allow developers and admins to safely remove documents
without losing data immediately. This enables workflows like reversible
deletions, trash views, and auditing—while preserving compatibility with
drafts, autosave, and version history.

### How?

#### Backend

- Adds new `trash: true` config option to collections.
- When enabled:
  - A `deletedAt` timestamp is conditionally injected into the schema.
- Soft deletion is performed by setting `deletedAt` instead of removing
the document from the database.
- Extends all relevant API operations (`find`, `findByID`, `update`,
`delete`, `versions`, etc.) to support a new `trash` param:
  - `trash: false` → excludes trashed documents (default)
  - `trash: true` → includes both trashed and non-trashed documents
- To query **only trashed** documents: use `trash: true` with a `where`
clause like `{ deletedAt: { exists: true } }`
- Enforces delete access control before allowing a soft delete via
update or updateByID.
- Disables version restoring on trashed documents (must be restored
first).

#### Admin Panel

- Adds a dedicated **Trash view**: `/collections/:collectionSlug/trash`
- Default delete action now soft-deletes documents when `trash: true` is
set.
- **Delete confirmation modal** includes a checkbox to permanently
delete instead.
- Trashed documents:
  - Display a UI banner to clearly distinguish the trashed edit view from the
non-trashed edit view
  - Render in a read-only edit view
  - Still allow access to **Preview**, **API**, and **Versions** tabs
- Updated Status component:
  - Displays “Previously published” or “Previously a draft” for trashed
documents.
  - Disables status-changing actions when documents are in trash.
- Adds new **Restore** bulk action to clear the `deletedAt` timestamp.
- New `Restore` and `Permanently Delete` buttons for
single-trashed-document restore and permanent deletion.
- **Restore confirmation modal** includes a checkbox to restore as
`published`, defaults to `draft`.
- Adds **Empty Trash** and **Delete permanently** bulk actions.
  
#### Notes

- This feature is completely opt-in. Collections without `trash: true`
behave exactly as before.



https://github.com/user-attachments/assets/00b83f8a-0442-441e-a89e-d5dc1f49dd37
2025-07-25 09:08:22 -04:00
Sasha
3f09e27bdd execute method 2025-07-25 15:06:16 +03:00
German Jablonski
4a712b3483 fix(ui): preserve localized blocks and arrays when using CopyToLocale (#13216)
## Problem:
In PR #11887, a bug fix for `copyToLocale` was introduced to address
issues with copying content between locales in Postgres. However, an
incorrect algorithm was used, which removed all "id" properties from
documents being copied. This led to bug #12536, where `copyToLocale`
would mistakenly delete the document in the source language, affecting
not only Postgres but any database.

## Cause and Solution:

When copying documents with localized arrays or blocks, Postgres throws
errors if there are two blocks with the same ID. This is why PR #11887
removed all IDs from the document to avoid conflicts. However, this
removal was too broad and caused issues in cases where it was
unnecessary.


The correct solution should remove the IDs only in nested fields whose
ancestors are localized. The reasoning is as follows:
- When an array/block is **not localized** (`localized: false`), if it
contains localized fields, these fields share the same ID across
different locales.
- When an array/block **is localized** (`localized: true`), its
descendant fields cannot share the same ID across different locales if
Postgres is being used. This wouldn't be an issue if the table
containing localized blocks had a composite primary key of `locale +
id`. However, since the primary key is just `id`, we need to assign a
new ID for these fields.

This PR properly removes IDs **only for nested fields** whose ancestors
are localized.

Fixes #12536
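
A rough sketch of that rule, with hypothetical types and a hypothetical helper rather than the actual traverseFields implementation:

```ts
type FieldNode = { name: string; fields?: FieldNode[]; localized?: boolean }

function stripNestedIDs(data: any, fields: FieldNode[], ancestorLocalized = false): void {
  for (const field of fields) {
    const value = data?.[field.name]
    if (value == null) continue

    const localized = ancestorLocalized || Boolean(field.localized)

    if (Array.isArray(value)) {
      for (const row of value) {
        // Rows under a localized array/block get a fresh ID (the Postgres primary key
        // is `id`, not `locale + id`); rows under non-localized ancestors keep their
        // IDs so the copy merges into the existing items across locales.
        if (localized) {
          delete row.id
        }
        if (field.fields) {
          stripNestedIDs(row, field.fields, localized)
        }
      }
    } else if (typeof value === 'object' && field.fields) {
      stripNestedIDs(value, field.fields, localized)
    }
  }
}
```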

## Example:
### Before Fix:
```js
// Original document (en)
array: [{
  id: "123",
  text: { en: "English text" }
}]

// After copying to 'es' locale, a new ID was created instead of updating the existing item
array: [{
  id: "456",  // 🐛 New ID created!
  text: { es: "Spanish text" } // 🐛 'en' locale is missing
}]
```
### After fix:
```js
// After fix
array: [{
  id: "123",  //  Same ID maintained
  text: {
    en: "English text",
    es: "Spanish text"  //  Properly merged with existing item
  }
}]
```


## Additional fixes:

### TraverseFields

In the process of designing an appropriate solution, I detected a couple
of bugs in traverseFields that are also addressed in this PR.

### Fixed MongoDB Empty Array Handling

During testing, I discovered that MongoDB and PostgreSQL behave
differently when querying documents that don't exist in a specific
locale:
- PostgreSQL: Returns the document with data from the fallback locale
- MongoDB: Returns the document with empty arrays for localized fields

This difference caused `copyToLocale` to fail in MongoDB because the
merge algorithm only checked for `null` or `undefined` values, but not
empty arrays. When MongoDB returned `content: []` for a non-existent
locale, the algorithm would attempt to iterate over the empty array
instead of using the source locale's data.

### Move test e2e to int

The test introduced in #11887 didn't catch the bug because our e2e suite
doesn't run on Postgres. I migrated the test to an integration test that
does run on Postgres and MongoDB.
2025-07-24 20:37:13 +01:00
Jarrod Flesch
fa7d209cc9 fix(ui): incorrect blocks label sizing (#13264)
Blocks container labels should match the Array and Tab labels. Uses same
styling approach as Array labels.

### Before
<img width="229" height="260" alt="CleanShot 2025-07-24 at 12 26 38"
src="https://github.com/user-attachments/assets/9c4eb7c5-3638-4b47-805b-1206f195f5eb"
/>

### After
<img width="245" height="259" alt="CleanShot 2025-07-24 at 12 27 00"
src="https://github.com/user-attachments/assets/c04933b4-226f-403b-9913-24ba00857aab"
/>
2025-07-24 19:34:29 +00:00
Jacob Fletcher
bccf6ab16f feat: group by (#13138)
Supports grouping documents by specific fields within the list view.

For example, imagine having a "posts" collection with a "categories"
field. To report on each specific category, you'd traditionally filter
for each category, one at a time. This can be quite inefficient,
especially with large datasets.

Now, you can interact with all categories simultaneously, grouped by
distinct values.

Here is a simple demonstration:


https://github.com/user-attachments/assets/0dcd19d2-e983-47e6-9ea2-cfdd2424d8b5

Enable on any collection by setting the `admin.groupBy` property:

```ts
import type { CollectionConfig } from 'payload'

const MyCollection: CollectionConfig = {
  // ...
  admin: {
    groupBy: true
  }
}
```

This is currently marked as beta to gather feedback while we reach full
stability, and to leave room for API changes and other modifications.
Use at your own risk.

Note: when using `groupBy`, bulk editing is done group-by-group. In the
future we may support cross-group bulk editing.

Dependent on #13102 (merged).

---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
  - https://app.asana.com/0/0/1210774523852467

---------

Co-authored-by: Paul Popus <paul@payloadcms.com>
2025-07-24 14:00:52 -04:00
Dan Ribbens
14322a71bb docs(plugin-import-export): document plugin-import-export (#13243)
Add documentation for @payloadcms/plugin-import-export.
2025-07-24 17:03:21 +00:00
Patrik
7e81d30808 fix(ui): ensure document unlocks when logging out from edit view of a locked document (#13142)
### What?

Refactors the `LeaveWithoutSaving` modal to be generic and delegates
document unlock logic back to the `DefaultEditView` component via a
callback.

### Why?

Previously, `unlockDocument` was triggered in a cleanup `useEffect` in
the edit view. When logging out from the edit view, the unlock request
would often fail due to the session ending — leaving the document in a
locked state.

### How?

- Introduced `onConfirm` and `onPrevent` props for `LeaveWithoutSaving`.
- Moved all document lock/unlock logic into `DefaultEditView`’s
`handleLeaveConfirm`.
- Captures the next navigation target via `onPrevent` and evaluates
whether to unlock based on:
  - Locking being enabled.
  - Current user owning the lock.
- Navigation not targeting internal admin views (`/preview`, `/api`,
`/versions`).
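
A rough sketch of the wiring (the `onConfirm`/`onPrevent` names come from this PR, but every signature below is an assumption, not the actual Payload UI code):

```ts
type LeaveHandlers = {
  onConfirm: () => Promise<void>
  onPrevent: (nextHref: string) => void
}

function buildLeaveHandlers(args: {
  docId: string
  isLockingEnabled: boolean
  lockOwnedByCurrentUser: boolean
  unlockDocument: (id: string) => Promise<void>
}): LeaveHandlers {
  const { docId, isLockingEnabled, lockOwnedByCurrentUser, unlockDocument } = args
  let pendingHref: null | string = null

  return {
    // Capture where the user is headed when the modal intercepts navigation
    onPrevent: (nextHref) => {
      pendingHref = nextHref
    },
    // Once the user confirms leaving, decide whether the document should be unlocked
    onConfirm: async () => {
      const isInternalView = ['/api', '/preview', '/versions'].some((segment) =>
        pendingHref?.includes(segment),
      )
      if (isLockingEnabled && lockOwnedByCurrentUser && !isInternalView) {
        await unlockDocument(docId)
      }
    },
  }
}
```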

---------

Co-authored-by: Jarrod Flesch <jarrodmflesch@gmail.com>
2025-07-24 09:18:49 -07:00
Sasha
a83ed5ebb5 fix(db-postgres): search is broken when useAsTitle is not specified (#13232)
Fixes https://github.com/payloadcms/payload/issues/13171
2025-07-24 18:42:17 +03:00
Patrik
8f85da8931 fix(plugin-import-export): json preview and downloads preserve nesting and exclude disabled fields (#13210)
### What?

Improves both the JSON preview and export functionality in the
import-export plugin:
- Preserves proper nesting of object and array fields (e.g., groups,
tabs, arrays)
- Excludes any fields explicitly marked as `disabled` via
`custom.plugin-import-export`
- Ensures downloaded files use proper JSON formatting when `format` is
`json` (no CSV-style flattening)

### Why?

Previously:
- The JSON preview flattened all fields to a single level and included
disabled fields.
- Exported files with `format: json` were still CSV-style data encoded
as `.json`, rather than real JSON.

### How?

- Refactored `/preview-data` JSON handling to preserve original document
shape.
- Applied `removeDisabledFields` to clean nested fields using
dot-notation paths.
- Updated `createExport` to skip `flattenObject` for JSON formats, using
a nested JSON filter instead.
- Fixed streaming and buffered export paths to output valid JSON arrays
when `format` is `json`.
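
A minimal sketch of removing disabled fields by dot-notation path (the `removeDisabledFields` name comes from this PR; the implementation below is only illustrative):

```ts
function removeDisabledFields(doc: Record<string, any>, disabledPaths: string[]): void {
  for (const path of disabledPaths) {
    const segments = path.split('.')
    const last = segments.pop()!
    // Walk the nested structure to the parent of the disabled field
    let target: any = doc
    for (const segment of segments) {
      target = target?.[segment]
      if (target == null) break
    }
    if (target && typeof target === 'object') {
      delete target[last]
    }
  }
}
```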
2025-07-24 11:36:46 -04:00
Sasha
65b110e4e7 fix type 2025-07-23 19:21:04 +03:00
Sasha
3d57b06f83 Merge branch 'main' of github.com:payloadcms/payload into feat/add-d1-adapter 2025-07-23 19:05:44 +03:00
Sasha
8595b575f5 fix type 2025-07-23 19:04:23 +03:00
Sasha
9ce07c75c3 fix type 2025-07-23 18:59:59 +03:00
Sasha
be4f11cd15 Merge branch 'main' of github.com:payloadcms/payload into feat/add-d1-adapter 2025-07-23 18:59:51 +03:00
Sasha
c3af32e133 fix postgres build 2 2025-05-26 19:23:41 +03:00
Sasha
fd850e734b fix postgres build 2025-05-26 18:50:25 +03:00
Sasha
5de2f52aa0 fix vercel postgres build 2025-05-26 18:49:24 +03:00
Sasha
733594b9c2 fix package jason 2025-05-26 17:54:27 +03:00
Sasha
9bd5f6f5f8 try testing 2025-05-26 17:53:47 +03:00
Sasha
da4270f299 sqlOnly migrations 2025-05-26 17:16:15 +03:00
Sasha
52f9dcae82 finish d1 package and fix errors 2025-05-26 17:11:12 +03:00
Sasha
0829cfb712 fix imports 2025-05-25 00:58:32 +03:00
Sasha
f51c972ac1 add exports from drizzle 2025-05-24 16:25:27 +03:00
Sasha
6652608c10 move sqlite logic to the drizzle package 2025-05-24 16:17:28 +03:00
458 changed files with 19739 additions and 2175 deletions

View File

@@ -284,6 +284,7 @@ jobs:
- fields__collections__Text
- fields__collections__UI
- fields__collections__Upload
- group-by
- folders
- hooks
- lexical__collections__Lexical__e2e__main
@@ -303,6 +304,7 @@ jobs:
- plugin-nested-docs
- plugin-seo
- sort
- trash
- versions
- uploads
env:
@@ -419,6 +421,7 @@ jobs:
- fields__collections__Text
- fields__collections__UI
- fields__collections__Upload
- group-by
- folders
- hooks
- lexical__collections__Lexical__e2e__main
@@ -438,6 +441,7 @@ jobs:
- plugin-nested-docs
- plugin-seo
- sort
- trash
- versions
- uploads
env:

7
.vscode/launch.json vendored
View File

@@ -139,6 +139,13 @@
"request": "launch",
"type": "node-terminal"
},
{
"command": "pnpm tsx --no-deprecation test/dev.ts trash",
"cwd": "${workspaceFolder}",
"name": "Run Dev Trash",
"request": "launch",
"type": "node-terminal"
},
{
"command": "pnpm tsx --no-deprecation test/dev.ts uploads",
"cwd": "${workspaceFolder}",

View File

@@ -77,7 +77,7 @@ All auto-generated files will contain the following comments at the top of each
## Admin Options
All options for the Admin Panel are defined in your [Payload Config](../configuration/overview) under the `admin` property:
All root-level options for the Admin Panel are defined in your [Payload Config](../configuration/overview) under the `admin` property:
```ts
import { buildConfig } from 'payload'

View File

@@ -60,32 +60,33 @@ export const Posts: CollectionConfig = {
The following options are available:
| Option | Description |
| -------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `admin` | The configuration options for the Admin Panel. [More details](#admin-options). |
| `access` | Provide Access Control functions to define exactly who should be able to do what with Documents in this Collection. [More details](../access-control/collections). |
| `auth` | Specify options if you would like this Collection to feature authentication. [More details](../authentication/overview). |
| `custom` | Extension point for adding custom data (e.g. for plugins) |
| `disableDuplicate` | When true, do not show the "Duplicate" button while editing documents within this Collection and prevent `duplicate` from all APIs. |
| `defaultSort` | Pass a top-level field to sort by default in the Collection List View. Prefix the name of the field with a minus symbol ("-") to sort in descending order. Multiple fields can be specified by using a string array. |
| `dbName` | Custom table or Collection name depending on the Database Adapter. Auto-generated from slug if not defined. |
| `endpoints` | Add custom routes to the REST API. Set to `false` to disable routes. [More details](../rest-api/overview#custom-endpoints). |
| `fields` \* | Array of field types that will determine the structure and functionality of the data stored within this Collection. [More details](../fields/overview). |
| `graphQL` | Manage GraphQL-related properties for this collection. [More](#graphql) |
| `hooks` | Entry point for Hooks. [More details](../hooks/overview#collection-hooks). |
| `orderable` | If true, enables custom ordering for the collection, and documents can be reordered via drag and drop. Uses [fractional indexing](https://observablehq.com/@dgreensp/implementing-fractional-indexing) for efficient reordering. |
| `labels` | Singular and plural labels for use in identifying this Collection throughout Payload. Auto-generated from slug if not defined. |
| `enableQueryPresets` | Enable query presets for this Collection. [More details](../query-presets/overview). |
| `lockDocuments` | Enables or disables document locking. By default, document locking is enabled. Set to an object to configure, or set to `false` to disable locking. [More details](../admin/locked-documents). |
| `slug` \* | Unique, URL-friendly string that will act as an identifier for this Collection. |
| `timestamps` | Set to false to disable documents' automatically generated `createdAt` and `updatedAt` timestamps. |
| `typescript` | An object with property `interface` as the text used in schema generation. Auto-generated from slug if not defined. |
| `upload` | Specify options if you would like this Collection to support file uploads. For more, consult the [Uploads](../upload/overview) documentation. |
| `versions` | Set to true to enable default options, or configure with object properties. [More details](../versions/overview#collection-config). |
| `defaultPopulate` | Specify which fields to select when this Collection is populated from another document. [More Details](../queries/select#defaultpopulate-collection-config-property). |
| `indexes` | Define compound indexes for this collection. This can be used to either speed up querying/sorting by 2 or more fields at the same time or to ensure uniqueness between several fields. [More details](../database/indexes#compound-indexes). |
| `forceSelect` | Specify which fields should be selected always, regardless of the `select` query which can be useful that the field exists for access control / hooks |
| `disableBulkEdit` | Disable the bulk edit operation for the collection in the admin panel and the REST API |
| Option | Description |
| -------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `admin` | The configuration options for the Admin Panel. [More details](#admin-options). |
| `access` | Provide Access Control functions to define exactly who should be able to do what with Documents in this Collection. [More details](../access-control/collections). |
| `auth` | Specify options if you would like this Collection to feature authentication. [More details](../authentication/overview). |
| `custom` | Extension point for adding custom data (e.g. for plugins) |
| `disableDuplicate` | When true, do not show the "Duplicate" button while editing documents within this Collection and prevent `duplicate` from all APIs. |
| `defaultSort` | Pass a top-level field to sort by default in the Collection List View. Prefix the name of the field with a minus symbol ("-") to sort in descending order. Multiple fields can be specified by using a string array. |
| `dbName` | Custom table or Collection name depending on the Database Adapter. Auto-generated from slug if not defined. |
| `endpoints` | Add custom routes to the REST API. Set to `false` to disable routes. [More details](../rest-api/overview#custom-endpoints). |
| `fields` \* | Array of field types that will determine the structure and functionality of the data stored within this Collection. [More details](../fields/overview). |
| `graphQL` | Manage GraphQL-related properties for this collection. [More](#graphql) |
| `hooks` | Entry point for Hooks. [More details](../hooks/overview#collection-hooks). |
| `orderable` | If true, enables custom ordering for the collection, and documents can be reordered via drag and drop. Uses [fractional indexing](https://observablehq.com/@dgreensp/implementing-fractional-indexing) for efficient reordering. |
| `labels` | Singular and plural labels for use in identifying this Collection throughout Payload. Auto-generated from slug if not defined. |
| `enableQueryPresets` | Enable query presets for this Collection. [More details](../query-presets/overview). |
| `lockDocuments` | Enables or disables document locking. By default, document locking is enabled. Set to an object to configure, or set to `false` to disable locking. [More details](../admin/locked-documents). |
| `slug` \* | Unique, URL-friendly string that will act as an identifier for this Collection. |
| `timestamps` | Set to false to disable documents' automatically generated `createdAt` and `updatedAt` timestamps. |
| `trash` | A boolean to enable soft deletes for this collection. Defaults to `false`. [More details](../trash/overview). |
| `typescript` | An object with property `interface` as the text used in schema generation. Auto-generated from slug if not defined. |
| `upload` | Specify options if you would like this Collection to support file uploads. For more, consult the [Uploads](../upload/overview) documentation. |
| `versions` | Set to true to enable default options, or configure with object properties. [More details](../versions/overview#collection-config). |
| `defaultPopulate` | Specify which fields to select when this Collection is populated from another document. [More Details](../queries/select#defaultpopulate-collection-config-property). |
| `indexes` | Define compound indexes for this collection. This can be used to either speed up querying/sorting by 2 or more fields at the same time or to ensure uniqueness between several fields. |
| `forceSelect`        | Specify which fields should always be selected, regardless of the `select` query. Useful to ensure a field is available for access control / hooks.                                                                               |
| `disableBulkEdit` | Disable the bulk edit operation for the collection in the admin panel and the REST API |
_\* An asterisk denotes that a property is required._
@@ -130,6 +131,7 @@ The following options are available:
| `description` | Text to display below the Collection label in the List View to give editors more information. Alternatively, you can use the `admin.components.Description` to render a React component. [More details](#custom-components). |
| `defaultColumns` | Array of field names that correspond to which columns to show by default in this Collection's List View. |
| `disableCopyToLocale` | Disables the "Copy to Locale" button while editing documents within this Collection. Only applicable when localization is enabled. |
| `groupBy` | Beta. Enable grouping by a field in the list view. |
| `hideAPIURL` | Hides the "API URL" meta field while editing documents within this Collection. |
| `enableRichTextLink` | The [Rich Text](../fields/rich-text) field features a `Link` element which allows for users to automatically reference related documents within their rich text. Set to `true` by default. |
| `enableRichTextRelationship` | The [Rich Text](../fields/rich-text) field features a `Relationship` element which allows for users to automatically reference related documents within their rich text. Set to `true` by default. |

View File

@@ -77,7 +77,6 @@ This configuration only queues the Job - it does not execute it immediately. To
```ts
export default buildConfig({
jobs: {
scheduler: 'cron',
autoRun: [
{
cron: '* * * * *', // Runs every minute

View File

@@ -51,6 +51,7 @@ export default async function Page() {
collection: 'pages',
id: '123',
draft: true,
trash: true, // add this if trash is enabled in your collection and want to preview trashed documents
})
return (

View File

@@ -1,7 +1,7 @@
---
title: Form Builder Plugin
label: Form Builder
order: 40
order: 30
desc: Easily build and manage forms from the Admin Panel. Send dynamic, personalized emails and even accept and process payments.
keywords: plugins, plugin, form, forms, form builder
---

View File

@@ -0,0 +1,155 @@
---
title: Import Export Plugin
label: Import Export
order: 40
desc: Add Import and export functionality to create CSV and JSON data exports
keywords: plugins, plugin, import, export, csv, JSON, data, ETL, download
---
![https://www.npmjs.com/package/@payloadcms/plugin-import-export](https://img.shields.io/npm/v/@payloadcms/plugin-import-export)
<Banner type="warning">
**Note**: This plugin is in **beta** as some aspects of it may change on any
minor releases. It is under development and currently only supports exporting
of collection data.
</Banner>
This plugin adds features that give admin users the ability to export collection data, either as a direct download or as a document saved to an uploads collection, and to import it back into a project.
## Core Features
- Export data as CSV or JSON format via the admin UI
- Download the export directly through the browser
- Create a file upload of the export data
- Use the jobs queue for large exports
- (Coming soon) Import collection data
## Installation
Install the plugin using any JavaScript package manager like [pnpm](https://pnpm.io), [npm](https://npmjs.com), or [Yarn](https://yarnpkg.com):
```bash
pnpm add @payloadcms/plugin-import-export
```
## Basic Usage
In the `plugins` array of your [Payload Config](https://payloadcms.com/docs/configuration/overview), call the plugin with [options](#options):
```ts
import { buildConfig } from 'payload'
import { importExportPlugin } from '@payloadcms/plugin-import-export'
const config = buildConfig({
collections: [Pages, Media],
plugins: [
importExportPlugin({
collections: ['users', 'pages'],
// see below for a list of available options
}),
],
})
export default config
```
## Options
| Property | Type | Description |
| -------------------------- | -------- | ------------------------------------------------------------------------------------------------------------------------------------ |
| `collections` | string[] | Collections to include Import/Export controls in. Defaults to all collections. |
| `debug` | boolean | If true, enables debug logging. |
| `disableDownload` | boolean | If true, disables the download button in the export preview UI. |
| `disableJobsQueue` | boolean | If true, forces the export to run synchronously. |
| `disableSave` | boolean | If true, disables the save button in the export preview UI. |
| `format` | string | Forces a specific export format (`csv` or `json`), hides the format dropdown, and prevents the user from choosing the export format. |
| `overrideExportCollection` | function | Function to override the default export collection; takes the default export collection and allows you to modify and return it. |
## Field Options
In addition to the above plugin configuration options, you can granularly set the following field level options using the `custom['plugin-import-export']` properties in any of your collections.
| Property | Type | Description |
| ---------- | -------- | ----------------------------------------------------------------------------------------------------------------------------- |
| `disabled` | boolean | When `true` the field is completely excluded from the import-export plugin. |
| `toCSV` | function | Custom function used to modify the outgoing csv data by manipulating the data, siblingData or by returning the desired value. |
### Customizing the output of CSV data
To control what a field exports, add a custom `toCSV` function. It lets you modify the outgoing CSV data by manipulating the row data or by returning the desired value.
The `toCSV` function receives a single argument object with the following properties:
| Property | Type | Description |
| ------------ | ------- | ----------------------------------------------------------------- |
| `columnName` | string | The CSV column name given to the field. |
| `doc` | object | The top level document |
| `row` | object | The object data that can be manipulated to assign data to the CSV |
| `siblingDoc` | object | The document data at the level where it belongs |
| `value` | unknown | The data for the field. |
Example function:
```ts
const pages: CollectionConfig = {
slug: 'pages',
fields: [
{
name: 'author',
type: 'relationship',
relationTo: 'users',
custom: {
'plugin-import-export': {
toCSV: ({ value, columnName, row }) => {
// add both `author_id` and the `author_email` to the csv export
if (
value &&
typeof value === 'object' &&
'id' in value &&
'email' in value
) {
row[`${columnName}_id`] = (value as { id: number | string }).id
row[`${columnName}_email`] = (value as { email: string }).email
}
},
},
},
},
],
}
```
## Exporting Data
The plugin supports four ways of exporting documents; the first two are available in the admin UI from a collection's list view:
1. Direct download - Sends a `POST` to `/api/exports/download` and streams the response as a file download
2. File storage - Saves the export to the `exports` collection, which is an uploads-enabled collection
3. Local API - A create call to the uploads collection: `payload.create({ slug: 'uploads', ...parameters })`
4. Jobs Queue - `payload.jobs.queue({ task: 'createCollectionExport', input: parameters })`
By default, a user can use the Export drawer either to create a persisted export by choosing `Save`, or to stream a downloadable file directly, without persisting it, by using the `Download` button. Either option can be disabled to provide the export experience you want for your use case.
The UI for creating exports provides options so that users can be selective about which documents to include and also which columns or fields to include.
If you enable this plugin on collections containing data that some authenticated users should not have access to, add access control to the export collection configuration using the `overrideExportCollection` function.
<Banner type="warning">
**Note**: Users who have read access to the upload collection may be able to
download data that is normally not readable due to [access
control](../access-control/overview).
</Banner>
The following parameters are used by the export function to handle requests:
| Property | Type | Description |
| ---------------- | -------- | ----------------------------------------------------------------------------------------------------------------- |
| `format` | text | Either `csv` or `json` to determine the shape of data exported |
| `limit` | number | The max number of documents to return |
| `sort` | select | The field to use for ordering documents |
| `locale` | string | The locale code to query documents or `all` |
| `draft` | string | Either `yes` or `no` to return documents with their newest drafts for drafts enabled collections |
| `fields` | string[] | Which collection fields are used to create the export, defaults to all |
| `collectionSlug` | string | The slug to query against |
| `where` | object | The WhereObject used to query documents to export. This is set by making selections or filters from the list view |
| `filename` | text | What to call the export being created |
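For instance, a programmatic export via the Jobs Queue call shown above might look like this (parameter values are hypothetical and mirror the table above):
```ts
const input = {
  collectionSlug: 'pages',
  draft: 'no',
  fields: ['id', 'title', 'updatedAt'],
  filename: 'pages-export',
  format: 'csv',
  limit: 1000,
  locale: 'all',
  sort: '-createdAt',
  where: { _status: { equals: 'published' } },
}

await payload.jobs.queue({ task: 'createCollectionExport', input })
```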

View File

@@ -1,7 +1,7 @@
---
title: Multi-Tenant Plugin
label: Multi-Tenant
order: 40
order: 50
desc: Scaffolds multi-tenancy for your Payload application
keywords: plugins, multi-tenant, multi-tenancy, plugin, payload, cms, seo, indexing, search, search engine
---
@@ -229,15 +229,15 @@ const config = buildConfig({
{
slug: 'tenants',
admin: {
useAsTitle: 'name'
useAsTitle: 'name',
},
fields: [
// remember, you own these fields
// these are merely suggestions/examples
{
name: 'name',
type: 'text',
required: true,
name: 'name',
type: 'text',
required: true,
},
{
name: 'slug',
@@ -248,7 +248,7 @@ const config = buildConfig({
name: 'domain',
type: 'text',
required: true,
}
},
],
},
],
@@ -258,7 +258,7 @@ const config = buildConfig({
pages: {},
navigation: {
isGlobal: true,
}
},
},
}),
],

View File

@@ -1,7 +1,7 @@
---
title: Nested Docs Plugin
label: Nested Docs
order: 40
order: 60
desc: Nested documents in a parent, child, and sibling relationship.
keywords: plugins, nested, documents, parent, child, sibling, relationship
---

View File

@@ -55,6 +55,7 @@ Payload maintains a set of Official Plugins that solve for some of the common us
- [Sentry](./sentry)
- [SEO](./seo)
- [Stripe](./stripe)
- [Import/Export](./import-export)
You can also [build your own plugin](./build-your-own) to easily extend Payload's functionality in some other way. Once your plugin is ready, consider [sharing it with the community](#community-plugins).

View File

@@ -1,7 +1,7 @@
---
title: Redirects Plugin
label: Redirects
order: 40
order: 70
desc: Automatically create redirects for your Payload application
keywords: plugins, redirects, redirect, plugin, payload, cms, seo, indexing, search, search engine
---

View File

@@ -1,7 +1,7 @@
---
title: Search Plugin
label: Search
order: 40
order: 80
desc: Generates records of your documents that are extremely fast to search on.
keywords: plugins, search, search plugin, search engine, search index, search results, search bar, search box, search field, search form, search input
---

View File

@@ -1,7 +1,7 @@
---
title: Sentry Plugin
label: Sentry
order: 40
order: 90
desc: Integrate Sentry error tracking into your Payload application
keywords: plugins, sentry, error, tracking, monitoring, logging, bug, reporting, performance
---

View File

@@ -2,7 +2,7 @@
description: Manage SEO metadata from your Payload admin
keywords: plugins, seo, meta, search, engine, ranking, google
label: SEO
order: 30
order: 100
title: SEO Plugin
---

View File

@@ -1,7 +1,7 @@
---
title: Stripe Plugin
label: Stripe
order: 40
order: 110
desc: Easily accept payments with Stripe
keywords: plugins, stripe, payments, ecommerce
---

200
docs/trash/overview.mdx Normal file
View File

@@ -0,0 +1,200 @@
---
title: Trash
label: Overview
order: 10
desc: Enable soft deletes for your collections to mark documents as deleted without permanently removing them.
keywords: trash, soft delete, deletedAt, recovery, restore
---
Trash (also known as soft delete) allows documents to be marked as deleted without being permanently removed. When enabled on a collection, deleted documents will receive a `deletedAt` timestamp, making it possible to restore them later, view them in a dedicated Trash view, or permanently delete them.
Soft delete is a safer way to manage content lifecycle, giving editors a chance to review and recover documents that may have been deleted by mistake.
<Banner type="warning">
**Note:** The Trash feature is currently in beta and may be subject to change
in minor version updates.
</Banner>
## Collection Configuration
To enable soft deleting for a collection, set the `trash` property to `true`:
```ts
import type { CollectionConfig } from 'payload'
export const Posts: CollectionConfig = {
slug: 'posts',
trash: true,
fields: [
{
name: 'title',
type: 'text',
},
// other fields...
],
}
```
When enabled, Payload automatically injects a `deletedAt` field into the collection's schema. This timestamp is set when a document is soft-deleted, and cleared when the document is restored.
## Admin Panel Behavior
Once `trash` is enabled, the Admin Panel provides a dedicated Trash view for each collection:
- A new route is added at `/collections/:collectionSlug/trash`
- The `Trash` view shows all documents that have a `deletedAt` timestamp
From the Trash view, you can:
- Use bulk actions to manage trashed documents:
- **Restore** to clear the `deletedAt` timestamp and return documents to their original state
- **Delete** to permanently remove selected documents
- **Empty Trash** to select and permanently delete all trashed documents at once
- Enter each document's **edit view**, just like in the main list view. While in the edit view of a trashed document:
- All fields are in a **read-only** state
- Standard document actions (e.g., Save, Publish, Restore Version) are hidden and disabled.
- The available actions are **Restore** and **Permanently Delete**.
- Access to the **API**, **Versions**, and **Preview** views is preserved.
When deleting a document from the main collection List View, Payload will soft-delete the document by default. A checkbox in the delete confirmation modal allows users to skip the trash and permanently delete instead.
## API Support
Soft deletes are fully supported across all Payload APIs: **Local**, **REST**, and **GraphQL**.
The following operations respect and support the `trash` functionality:
- `find`
- `findByID`
- `update`
- `updateByID`
- `delete`
- `deleteByID`
- `findVersions`
- `findVersionByID`
### Understanding `trash` Behavior
Passing `trash: true` to these operations will **include soft-deleted documents** in the query results.
To return _only_ soft-deleted documents, you must combine `trash: true` with a `where` clause that checks if `deletedAt` exists.
### Examples
#### Local API
Return all documents including trashed:
```ts
const result = await payload.find({
collection: 'posts',
trash: true,
})
```
Return only trashed documents:
```ts
const result = await payload.find({
collection: 'posts',
trash: true,
where: {
deletedAt: {
exists: true,
},
},
})
```
Return only non-trashed documents:
```ts
const result = await payload.find({
collection: 'posts',
trash: false,
})
```
#### REST
Return **all** documents including trashed:
```http
GET /api/posts?trash=true
```
Return **only trashed** documents:
```http
GET /api/posts?trash=true&where[deletedAt][exists]=true
```
Return only non-trashed documents:
```http
GET /api/posts?trash=false
```
#### GraphQL
Return all documents including trashed:
```graphql
query {
Posts(trash: true) {
docs {
id
deletedAt
}
}
}
```
Return only trashed documents:
```graphql
query {
Posts(
trash: true
where: { deletedAt: { exists: true } }
) {
docs {
id
deletedAt
}
}
}
```
Return only non-trashed documents:
```graphql
query {
Posts(trash: false) {
docs {
id
deletedAt
}
}
}
```
## Access Control
All trash-related actions (delete, permanent delete) respect the `delete` access control defined in your collection config.
This means:
- If a user is denied delete access, they cannot soft delete or permanently delete documents
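For example, a collection that restricts delete access to admins (a minimal sketch; the `roles` check is a hypothetical example) applies the same restriction to soft deletes and permanent deletes:
```ts
import type { CollectionConfig } from 'payload'

export const Posts: CollectionConfig = {
  slug: 'posts',
  trash: true,
  access: {
    // Gates soft deletes, permanent deletes, and the bulk trash actions alike
    delete: ({ req: { user } }) => Boolean(user?.roles?.includes('admin')),
  },
  fields: [
    {
      name: 'title',
      type: 'text',
    },
  ],
}
```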
## Versions and Trash
When a document is soft-deleted:
- It can no longer have a version **restored** until it is first restored from trash
- Attempting to restore a version while the document is in trash will result in an error
- This ensures consistency between the current document state and its version history
However, versions are still fully **visible and accessible** from the **edit view** of a trashed document. You can view the full version history, but must restore the document itself before restoring any individual version.

View File

@@ -1,6 +1,6 @@
{
"name": "payload-monorepo",
"version": "3.48.0",
"version": "3.49.0",
"private": true,
"type": "module",
"workspaces": [
@@ -19,6 +19,7 @@
"build:core": "turbo build --filter \"!@payloadcms/plugin-*\" --filter \"!@payloadcms/storage-*\" --filter \"!blank\" --filter \"!website\"",
"build:core:force": "pnpm clean:build && pnpm build:core --no-cache --force",
"build:create-payload-app": "turbo build --filter create-payload-app",
"build:db-d1-sqlite": "turbo build --filter \"@payloadcms/db-d1-sqlite\"",
"build:db-mongodb": "turbo build --filter \"@payloadcms/db-mongodb\"",
"build:db-postgres": "turbo build --filter \"@payloadcms/db-postgres\"",
"build:db-sqlite": "turbo build --filter \"@payloadcms/db-sqlite\"",
@@ -132,12 +133,12 @@
"devDependencies": {
"@jest/globals": "29.7.0",
"@libsql/client": "0.14.0",
"@next/bundle-analyzer": "15.3.2",
"@next/bundle-analyzer": "15.4.4",
"@payloadcms/db-postgres": "workspace:*",
"@payloadcms/eslint-config": "workspace:*",
"@payloadcms/eslint-plugin": "workspace:*",
"@payloadcms/live-preview-react": "workspace:*",
"@playwright/test": "1.50.0",
"@playwright/test": "1.54.1",
"@sentry/nextjs": "^8.33.1",
"@sentry/node": "^8.33.1",
"@swc-node/register": "1.10.10",
@@ -147,8 +148,8 @@
"@types/jest": "29.5.12",
"@types/minimist": "1.2.5",
"@types/node": "22.15.30",
"@types/react": "19.1.0",
"@types/react-dom": "19.1.2",
"@types/react": "19.1.8",
"@types/react-dom": "19.1.6",
"@types/shelljs": "0.8.15",
"chalk": "^4.1.2",
"comment-json": "^4.2.3",
@@ -168,12 +169,12 @@
"lint-staged": "15.2.7",
"minimist": "1.2.8",
"mongodb-memory-server": "10.1.4",
"next": "15.3.2",
"next": "15.4.4",
"open": "^10.1.0",
"p-limit": "^5.0.0",
"pg": "8.16.3",
"playwright": "1.50.0",
"playwright-core": "1.50.0",
"playwright": "1.54.1",
"playwright-core": "1.54.1",
"prettier": "3.5.3",
"react": "19.1.0",
"react-dom": "19.1.0",

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/admin-bar",
"version": "3.48.0",
"version": "3.49.0",
"description": "An admin bar for React apps using Payload",
"homepage": "https://payloadcms.com",
"repository": {
@@ -42,8 +42,8 @@
},
"devDependencies": {
"@payloadcms/eslint-config": "workspace:*",
"@types/react": "19.1.0",
"@types/react-dom": "19.1.2",
"@types/react": "19.1.8",
"@types/react-dom": "19.1.6",
"payload": "workspace:*"
},
"peerDependencies": {

View File

@@ -1,6 +1,6 @@
{
"name": "create-payload-app",
"version": "3.48.0",
"version": "3.49.0",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",

1
packages/db-d1-sqlite/.gitignore vendored Normal file
View File

@@ -0,0 +1 @@
/migrations

View File

@@ -0,0 +1,10 @@
.tmp
**/.git
**/.hg
**/.pnp.*
**/.svn
**/.yarn/**
**/build
**/dist/**
**/node_modules
**/temp

View File

@@ -0,0 +1,15 @@
{
"$schema": "https://json.schemastore.org/swcrc",
"sourceMaps": true,
"jsc": {
"target": "esnext",
"parser": {
"syntax": "typescript",
"tsx": true,
"dts": true
}
},
"module": {
"type": "es6"
}
}

View File

@@ -0,0 +1,22 @@
MIT License
Copyright (c) 2018-2025 Payload CMS, Inc. <info@payloadcms.com>
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

View File

@@ -0,0 +1,30 @@
# Payload SQLite Adapter
Official SQLite adapter for [Payload](https://payloadcms.com).
- [Main Repository](https://github.com/payloadcms/payload)
- [Payload Docs](https://payloadcms.com/docs)
## Installation
```bash
npm install @payloadcms/db-sqlite
```
## Usage
```ts
import { buildConfig } from 'payload/config'
import { sqliteAdapter } from '@payloadcms/db-sqlite'
export default buildConfig({
db: sqliteAdapter({
client: {
url: process.env.DATABASE_URI,
},
}),
// ...rest of config
})
```
More detailed usage can be found in the [Payload Docs](https://payloadcms.com/docs/configuration/overview).

View File

@@ -0,0 +1,38 @@
import * as esbuild from 'esbuild'
import fs from 'fs'
import path from 'path'
import { fileURLToPath } from 'url'
const filename = fileURLToPath(import.meta.url)
const dirname = path.dirname(filename)
import { commonjs } from '@hyrious/esbuild-plugin-commonjs'
async function build() {
const resultServer = await esbuild.build({
entryPoints: ['src/index.ts'],
bundle: true,
platform: 'node',
format: 'esm',
outfile: 'dist/index.js',
splitting: false,
external: [
'*.scss',
'*.css',
'drizzle-kit',
'libsql',
'pg',
'@payloadcms/translations',
'@payloadcms/drizzle',
'payload',
'payload/*',
],
minify: true,
metafile: true,
tsconfig: path.resolve(dirname, './tsconfig.json'),
plugins: [commonjs()],
sourcemap: true,
})
console.log('db-sqlite bundled successfully')
fs.writeFileSync('meta_server.json', JSON.stringify(resultServer.metafile))
}
await build()

View File

@@ -0,0 +1,145 @@
{
"name": "@payloadcms/db-d1-sqlite",
"version": "3.49.0",
"description": "The officially supported D1 SQLite database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",
"url": "https://github.com/payloadcms/payload.git",
"directory": "packages/db-d1-sqlite"
},
"license": "MIT",
"author": "Payload <dev@payloadcms.com> (https://payloadcms.com)",
"maintainers": [
{
"name": "Payload",
"email": "info@payloadcms.com",
"url": "https://payloadcms.com"
}
],
"type": "module",
"exports": {
".": {
"import": "./src/index.ts",
"require": "./src/index.ts",
"types": "./src/index.ts"
},
"./types": {
"import": "./src/exports/types-deprecated.ts",
"require": "./src/exports/types-deprecated.ts",
"types": "./src/exports/types-deprecated.ts"
},
"./migration-utils": {
"import": "./src/exports/migration-utils.ts",
"require": "./src/exports/migration-utils.ts",
"types": "./src/exports/migration-utils.ts"
},
"./drizzle": {
"import": "./src/drizzle-proxy/index.ts",
"types": "./src/drizzle-proxy/index.ts",
"default": "./src/drizzle-proxy/index.ts"
},
"./drizzle/sqlite-core": {
"import": "./src/drizzle-proxy/sqlite-core.ts",
"types": "./src/drizzle-proxy/sqlite-core.ts",
"default": "./src/drizzle-proxy/sqlite-core.ts"
},
"./drizzle/d1": {
"import": "./src/drizzle-proxy/d1.ts",
"types": "./src/drizzle-proxy/d1.ts",
"default": "./src/drizzle-proxy/d1.ts"
},
"./drizzle/relations": {
"import": "./src/drizzle-proxy/relations.ts",
"types": "./src/drizzle-proxy/relations.ts",
"default": "./src/drizzle-proxy/relations.ts"
},
"./drizzle/miniflare": {
"import": "./src/drizzle-proxy/miniflare.ts",
"types": "./src/drizzle-proxy/miniflare.ts",
"default": "./src/drizzle-proxy/miniflare.ts"
}
},
"main": "./src/index.ts",
"types": "./src/index.ts",
"files": [
"dist",
"mock.js"
],
"scripts": {
"build": "pnpm build:swc && pnpm build:types",
"build:swc": "swc ./src -d ./dist --config-file .swcrc --strip-leading-paths",
"build:types": "tsc --emitDeclarationOnly --outDir dist",
"clean": "rimraf -g {dist,*.tsbuildinfo}",
"lint": "eslint .",
"lint:fix": "eslint . --fix",
"prepack": "pnpm clean && pnpm turbo build",
"prepublishOnly": "pnpm clean && pnpm turbo build"
},
"dependencies": {
"@miniflare/d1": "2.14.4",
"@payloadcms/drizzle": "workspace:*",
"console-table-printer": "2.12.1",
"drizzle-kit": "0.28.0",
"drizzle-orm": "0.36.1",
"prompts": "2.4.2",
"to-snake-case": "1.0.0",
"uuid": "9.0.0"
},
"devDependencies": {
"@payloadcms/eslint-config": "workspace:*",
"@types/pg": "8.10.2",
"@types/to-snake-case": "1.0.0",
"@types/uuid": "10.0.0",
"payload": "workspace:*"
},
"peerDependencies": {
"payload": "workspace:*"
},
"publishConfig": {
"exports": {
".": {
"import": "./dist/index.js",
"require": "./dist/index.js",
"types": "./dist/index.d.ts"
},
"./types": {
"import": "./dist/exports/types-deprecated.js",
"require": "./dist/exports/types-deprecated.js",
"types": "./dist/exports/types-deprecated.d.ts"
},
"./migration-utils": {
"import": "./dist/exports/migration-utils.js",
"require": "./dist/exports/migration-utils.js",
"types": "./dist/exports/migration-utils.d.ts"
},
"./drizzle": {
"import": "./dist/drizzle-proxy/index.js",
"types": "./dist/drizzle-proxy/index.d.ts",
"default": "./dist/drizzle-proxy/index.js"
},
"./drizzle/sqlite-core": {
"import": "./dist/drizzle-proxy/sqlite-core.js",
"types": "./dist/drizzle-proxy/sqlite-core.d.ts",
"default": "./dist/drizzle-proxy/sqlite-core.js"
},
"./drizzle/d1": {
"import": "./dist/drizzle-proxy/d1.js",
"types": "./dist/drizzle-proxy/d1.d.ts",
"default": "./dist/drizzle-proxy/d1.js"
},
"./drizzle/relations": {
"import": "./dist/drizzle-proxy/relations.js",
"types": "./dist/drizzle-proxy/relations.d.ts",
"default": "./dist/drizzle-proxy/relations.js"
},
"./drizzle/miniflare": {
"import": "./dist/drizzle-proxy/miniflare.js",
"types": "./dist/drizzle-proxy/miniflare.d.ts",
"default": "./dist/drizzle-proxy/miniflare.js"
}
},
"main": "./dist/index.js",
"types": "./dist/index.d.ts"
}
}

View File

@@ -0,0 +1,62 @@
import type { DrizzleAdapter } from '@payloadcms/drizzle/types'
import type { Connect, Migration } from 'payload'
import { D1Database } from '@miniflare/d1'
import { pushDevSchema } from '@payloadcms/drizzle'
import { drizzle } from 'drizzle-orm/d1'
import type { SQLiteD1Adapter } from './types.js'
export const connect: Connect = async function connect(
this: SQLiteD1Adapter,
options = {
hotReload: false,
},
) {
const { hotReload } = options
this.schema = {
...this.tables,
...this.relations,
}
try {
const logger = this.logger || false
this.drizzle = drizzle(new D1Database(this.binding), { logger })
this.client = this.drizzle.$client as any
if (!hotReload) {
if (process.env.PAYLOAD_DROP_DATABASE === 'true') {
this.payload.logger.info(`---- DROPPING TABLES ----`)
await this.dropDatabase({ adapter: this })
this.payload.logger.info('---- DROPPED TABLES ----')
}
}
} catch (err) {
const message = err instanceof Error ? err.message : String(err)
this.payload.logger.error({ err, msg: `Error: cannot connect to SQLite: ${message}` })
if (typeof this.rejectInitializing === 'function') {
this.rejectInitializing()
}
console.error(err)
process.exit(1)
}
// Only push schema if not in production
if (
process.env.NODE_ENV !== 'production' &&
process.env.PAYLOAD_MIGRATING !== 'true' &&
this.push !== false
) {
await pushDevSchema(this as unknown as DrizzleAdapter)
}
if (typeof this.resolveInitializing === 'function') {
this.resolveInitializing()
}
if (process.env.NODE_ENV === 'production' && this.prodMigrations) {
await this.migrate({ migrations: this.prodMigrations as Migration[] })
}
}

View File

@@ -0,0 +1 @@
export * from 'drizzle-orm/d1'

View File

@@ -0,0 +1 @@
export * from 'drizzle-orm'

View File

@@ -0,0 +1 @@
export * from '@miniflare/d1'

View File

@@ -0,0 +1 @@
export * from 'drizzle-orm/relations'

View File

@@ -0,0 +1 @@
export * from 'drizzle-orm/sqlite-core'

View File

@@ -0,0 +1,67 @@
import type { Execute } from '@payloadcms/drizzle'
import type { SQLiteRaw } from 'drizzle-orm/sqlite-core/query-builders/raw'
import { sql } from 'drizzle-orm'
interface D1Meta {
changed_db: boolean
changes: number
duration: number
last_row_id: number
rows_read: number
rows_written: number
/**
* True if-and-only-if the database instance that executed the query was the primary.
*/
served_by_primary?: boolean
/**
* The region of the database instance that executed the query.
*/
served_by_region?: string
size_after: number
timings?: {
/**
* The duration of the SQL query execution by the database instance. It doesn't include any network time.
*/
sql_duration_ms: number
}
}
interface D1Response {
error?: never
meta: D1Meta & Record<string, unknown>
success: true
}
type D1Result<T = unknown> = {
results: T[]
} & D1Response
export const execute: Execute<any> = function execute({ db, drizzle, raw, sql: statement }) {
const executeFrom: any = (db ?? drizzle)!
const mapToLibSql = (query: SQLiteRaw<D1Result<unknown>>): any => {
const execute = query.execute
query.execute = async () => {
const result: D1Result = await execute()
const resultLibSQL = {
columns: undefined,
columnTypes: undefined,
lastInsertRowid: BigInt(result.meta.last_row_id),
rows: result.results as any[],
rowsAffected: result.meta.rows_written,
}
return Object.assign(result, resultLibSQL)
}
return query
}
if (raw) {
const result = mapToLibSql(executeFrom.run(sql.raw(raw)))
return result
} else {
const result = mapToLibSql(executeFrom.run(statement))
return result
}
}

View File

@@ -0,0 +1,79 @@
import type {
Args as _Args,
CountDistinct as _CountDistinct,
DeleteWhere as _DeleteWhere,
DropDatabase as _DropDatabase,
Execute as _Execute,
GeneratedDatabaseSchema as _GeneratedDatabaseSchema,
GenericColumns as _GenericColumns,
GenericRelation as _GenericRelation,
GenericTable as _GenericTable,
IDType as _IDType,
Insert as _Insert,
MigrateDownArgs as _MigrateDownArgs,
MigrateUpArgs as _MigrateUpArgs,
SQLiteD1Adapter as _SQLiteAdapter,
SQLiteSchemaHook as _SQLiteSchemaHook,
} from '../types.js'
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type SQLiteAdapter = _SQLiteAdapter
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type Args = _Args
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type CountDistinct = _CountDistinct
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type DeleteWhere = _DeleteWhere
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type DropDatabase = _DropDatabase
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type Execute<T> = _Execute<T>
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type GeneratedDatabaseSchema = _GeneratedDatabaseSchema
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type GenericColumns = _GenericColumns
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type GenericRelation = _GenericRelation
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type GenericTable = _GenericTable
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type IDType = _IDType
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type Insert = _Insert
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type MigrateDownArgs = _MigrateDownArgs
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type MigrateUpArgs = _MigrateUpArgs
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type SQLiteSchemaHook = _SQLiteSchemaHook

View File

@@ -0,0 +1,224 @@
import type { Operators } from '@payloadcms/drizzle'
import type { DatabaseAdapterObj, Payload } from 'payload'
import {
beginTransaction,
buildCreateMigration,
commitTransaction,
count,
countGlobalVersions,
countVersions,
create,
createGlobal,
createGlobalVersion,
createSchemaGenerator,
createVersion,
deleteMany,
deleteOne,
deleteVersions,
destroy,
find,
findGlobal,
findGlobalVersions,
findMigrationDir,
findOne,
findVersions,
migrate,
migrateDown,
migrateFresh,
migrateRefresh,
migrateReset,
migrateStatus,
operatorMap,
queryDrafts,
rollbackTransaction,
updateGlobal,
updateGlobalVersion,
updateJobs,
updateMany,
updateOne,
updateVersion,
} from '@payloadcms/drizzle'
import {
columnToCodeConverter,
convertPathToJSONTraversal,
countDistinct,
createJSONQuery,
defaultDrizzleSnapshot,
deleteWhere,
dropDatabase,
init,
insert,
requireDrizzleKit,
} from '@payloadcms/drizzle/sqlite'
import { like, notLike } from 'drizzle-orm'
import { createDatabaseAdapter, defaultBeginTransaction } from 'payload'
import { fileURLToPath } from 'url'
import type { Args, SQLiteD1Adapter } from './types.js'
import { connect } from './connect.js'
import { execute } from './execute.js'
const filename = fileURLToPath(import.meta.url)
export function sqliteD1Adapter(args: Args): DatabaseAdapterObj<SQLiteD1Adapter> {
const sqliteIDType = args.idType || 'number'
const payloadIDType = sqliteIDType === 'uuid' ? 'text' : 'number'
const allowIDOnCreate = args.allowIDOnCreate ?? false
function adapter({ payload }: { payload: Payload }) {
const migrationDir = findMigrationDir(args.migrationDir)
let resolveInitializing: () => void = () => {}
let rejectInitializing: () => void = () => {}
const initializing = new Promise<void>((res, rej) => {
resolveInitializing = res
rejectInitializing = rej
})
// SQLite's `like` operator is already case-insensitive, so we overwrite the DrizzleAdapter operators to avoid using ilike
const operators = {
...operatorMap,
contains: like,
like,
not_like: notLike,
} as unknown as Operators
return createDatabaseAdapter<SQLiteD1Adapter>({
name: 'sqlite',
afterSchemaInit: args.afterSchemaInit ?? [],
allowIDOnCreate,
autoIncrement: args.autoIncrement ?? false,
beforeSchemaInit: args.beforeSchemaInit ?? [],
binding: args.binding,
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
client: undefined,
defaultDrizzleSnapshot,
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
drizzle: undefined,
features: {
json: true,
},
fieldConstraints: {},
generateSchema: createSchemaGenerator({
columnToCodeConverter,
corePackageSuffix: 'sqlite-core',
defaultOutputFile: args.generateSchemaOutputFile,
tableImport: 'sqliteTable',
}),
idType: sqliteIDType,
initializing,
localesSuffix: args.localesSuffix || '_locales',
logger: args.logger,
operators,
prodMigrations: args.prodMigrations,
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
push: args.push,
rawRelations: {},
rawTables: {},
relations: {},
relationshipsSuffix: args.relationshipsSuffix || '_rels',
schema: {},
schemaName: args.schemaName,
sessions: {},
tableNameMap: new Map<string, string>(),
tables: {},
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
execute,
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
transactionOptions: args.transactionOptions || undefined,
updateJobs,
updateMany,
versionsSuffix: args.versionsSuffix || '_v',
// DatabaseAdapter
beginTransaction: args.transactionOptions ? beginTransaction : defaultBeginTransaction(),
commitTransaction,
connect,
convertPathToJSONTraversal,
count,
countDistinct,
countGlobalVersions,
countVersions,
create,
createGlobal,
createGlobalVersion,
createJSONQuery,
createMigration: buildCreateMigration({
executeMethod: 'run',
filename,
sanitizeStatements({ sqlExecute, statements }) {
return statements
.map((statement) => `${sqlExecute}${statement?.replaceAll('`', '\\`')}\`)`)
.join('\n')
},
sqlOnly: true,
}),
createVersion,
defaultIDType: payloadIDType,
deleteMany,
deleteOne,
deleteVersions,
deleteWhere,
destroy,
dropDatabase,
find,
findGlobal,
findGlobalVersions,
findOne,
findVersions,
indexes: new Set<string>(),
init,
insert,
migrate,
migrateDown,
migrateFresh,
migrateRefresh,
migrateReset,
migrateStatus,
migrationDir,
packageName: '@payloadcms/db-d1-sqlite',
payload,
queryDrafts,
rejectInitializing,
requireDrizzleKit,
resolveInitializing,
rollbackTransaction,
updateGlobal,
updateGlobalVersion,
updateOne,
updateVersion,
upsert: updateOne,
})
}
return {
name: 'd1-sqlite',
allowIDOnCreate,
defaultIDType: payloadIDType,
init: adapter,
}
}
/**
* @todo deprecate /types subpath export in 4.0
*/
export type {
Args as SQLiteAdapterArgs,
CountDistinct,
DeleteWhere,
DropDatabase,
Execute,
GeneratedDatabaseSchema,
GenericColumns,
GenericRelation,
GenericTable,
IDType,
Insert,
MigrateDownArgs,
MigrateUpArgs,
SQLiteD1Adapter as SQLiteAdapter,
SQLiteSchemaHook,
} from './types.js'
export { sql } from 'drizzle-orm'

View File

@@ -0,0 +1,202 @@
import type { Client, ResultSet } from '@libsql/client'
import type { D1Database, DatabaseBinding } from '@miniflare/d1'
import type { extendDrizzleTable } from '@payloadcms/drizzle'
import type { BaseSQLiteAdapter, BaseSQLiteArgs } from '@payloadcms/drizzle/sqlite'
import type { BuildQueryJoinAliases, DrizzleAdapter } from '@payloadcms/drizzle/types'
import type { DrizzleConfig, Relation, Relations, SQL } from 'drizzle-orm'
import type { DrizzleD1Database } from 'drizzle-orm/d1'
import type { LibSQLDatabase } from 'drizzle-orm/libsql'
import type {
AnySQLiteColumn,
SQLiteInsertOnConflictDoUpdateConfig,
SQLiteTableWithColumns,
SQLiteTransactionConfig,
} from 'drizzle-orm/sqlite-core'
import type { SQLiteRaw } from 'drizzle-orm/sqlite-core/query-builders/raw'
import type { Payload, PayloadRequest } from 'payload'
type SQLiteSchema = {
relations: Record<string, GenericRelation>
tables: Record<string, SQLiteTableWithColumns<any>>
}
type SQLiteSchemaHookArgs = {
extendTable: typeof extendDrizzleTable
schema: SQLiteSchema
}
export type SQLiteSchemaHook = (args: SQLiteSchemaHookArgs) => Promise<SQLiteSchema> | SQLiteSchema
export type Args = {
binding: DatabaseBinding
} & BaseSQLiteArgs
export type GenericColumns = {
[x: string]: AnySQLiteColumn
}
export type GenericTable = SQLiteTableWithColumns<{
columns: GenericColumns
dialect: string
name: string
schema: string
}>
export type GenericRelation = Relations<string, Record<string, Relation<string>>>
export type CountDistinct = (args: {
db: LibSQLDatabase
joins: BuildQueryJoinAliases
tableName: string
where: SQL
}) => Promise<number>
export type DeleteWhere = (args: {
db: LibSQLDatabase
tableName: string
where: SQL
}) => Promise<void>
export type DropDatabase = (args: { adapter: SQLiteD1Adapter }) => Promise<void>
export type Execute<T> = (args: {
db?: LibSQLDatabase
drizzle?: LibSQLDatabase
raw?: string
sql?: SQL<unknown>
}) => SQLiteRaw<Promise<T>> | SQLiteRaw<ResultSet>
export type Insert = (args: {
db: LibSQLDatabase
onConflictDoUpdate?: SQLiteInsertOnConflictDoUpdateConfig<any>
tableName: string
values: Record<string, unknown> | Record<string, unknown>[]
}) => Promise<Record<string, unknown>[]>
// Explicitly omit drizzle property for complete override in SQLiteAdapter, required in ts 5.5
type SQLiteDrizzleAdapter = Omit<
DrizzleAdapter,
| 'countDistinct'
| 'deleteWhere'
| 'drizzle'
| 'dropDatabase'
| 'execute'
| 'idType'
| 'insert'
| 'operators'
| 'relations'
>
export interface GeneratedDatabaseSchema {
schemaUntyped: Record<string, unknown>
}
type ResolveSchemaType<T> = 'schema' extends keyof T
? T['schema']
: GeneratedDatabaseSchema['schemaUntyped']
type Drizzle = { $client: D1Database } & DrizzleD1Database<Record<string, any>>
export type SQLiteD1Adapter = {
binding: Args['binding']
client: D1Database
drizzle: Drizzle
} & BaseSQLiteAdapter &
SQLiteDrizzleAdapter
export type IDType = 'integer' | 'numeric' | 'text'
export type MigrateUpArgs = {
/**
* The SQLite Drizzle instance that you can use to execute SQL directly within the current transaction.
* @example
* ```ts
* import { type MigrateUpArgs, sql } from '@payloadcms/db-sqlite'
*
* export async function up({ db, payload, req }: MigrateUpArgs): Promise<void> {
* const { rows: posts } = await db.run(sql`SELECT * FROM posts`)
* }
* ```
*/
db: Drizzle
/**
* The Payload instance that you can use to execute Local API methods
* To use the current transaction you must pass `req` to arguments
* @example
* ```ts
* import { type MigrateUpArgs } from '@payloadcms/db-sqlite'
*
* export async function up({ db, payload, req }: MigrateUpArgs): Promise<void> {
* const posts = await payload.find({ collection: 'posts', req })
* }
* ```
*/
payload: Payload
/**
* The `PayloadRequest` object that contains the current transaction
*/
req: PayloadRequest
}
export type MigrateDownArgs = {
/**
* The SQLite Drizzle instance that you can use to execute SQL directly within the current transaction.
* @example
* ```ts
* import { type MigrateDownArgs, sql } from '@payloadcms/db-sqlite'
*
* export async function down({ db, payload, req }: MigrateDownArgs): Promise<void> {
* const { rows: posts } = await db.run(sql`SELECT * FROM posts`)
* }
* ```
*/
db: Drizzle
/**
* The Payload instance that you can use to execute Local API methods
* To use the current transaction you must pass `req` to arguments
* @example
* ```ts
* import { type MigrateDownArgs } from '@payloadcms/db-sqlite'
*
* export async function down({ db, payload, req }: MigrateDownArgs): Promise<void> {
* const posts = await payload.find({ collection: 'posts', req })
* }
* ```
*/
payload: Payload
/**
* The `PayloadRequest` object that contains the current transaction
*/
req: PayloadRequest
}
declare module 'payload' {
export interface DatabaseAdapter
extends Omit<Args, 'idType' | 'logger' | 'migrationDir' | 'pool'>,
DrizzleAdapter {
beginTransaction: (options?: SQLiteTransactionConfig) => Promise<null | number | string>
drizzle: Drizzle
/**
* An object keyed on each table, mapping each constraint name to its dot-notation field name
* Used for returning properly formed errors from unique fields
*/
fieldConstraints: Record<string, Record<string, string>>
idType: Args['idType']
initializing: Promise<void>
localesSuffix?: string
logger: DrizzleConfig['logger']
prodMigrations?: {
down: (args: MigrateDownArgs) => Promise<void>
name: string
up: (args: MigrateUpArgs) => Promise<void>
}[]
push: boolean
rejectInitializing: () => void
relationshipsSuffix?: string
resolveInitializing: () => void
schema: Record<string, GenericRelation | GenericTable>
tableNameMap: Map<string, string>
transactionOptions: SQLiteTransactionConfig
versionsSuffix?: string
}
}

View File

@@ -0,0 +1,14 @@
{
"extends": "../../tsconfig.base.json",
"references": [
{
"path": "../payload"
},
{
"path": "../translations"
},
{
"path": "../drizzle"
}
]
}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-mongodb",
"version": "3.48.0",
"version": "3.49.0",
"description": "The officially supported MongoDB database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-postgres",
"version": "3.48.0",
"version": "3.49.0",
"description": "The officially supported Postgres database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -180,8 +180,6 @@ export function postgresAdapter(args: Args): DatabaseAdapterObj<PostgresAdapter>
find,
findGlobal,
findGlobalVersions,
updateJobs,
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
findOne,
findVersions,
indexes: new Set<string>(),
@@ -199,6 +197,7 @@ export function postgresAdapter(args: Args): DatabaseAdapterObj<PostgresAdapter>
queryDrafts,
rawRelations: {},
rawTables: {},
updateJobs,
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
rejectInitializing,
requireDrizzleKit,

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-sqlite",
"version": "3.48.0",
"version": "3.49.0",
"description": "The officially supported SQLite database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -41,24 +41,26 @@ import {
updateVersion,
upsert,
} from '@payloadcms/drizzle'
import {
columnToCodeConverter,
convertPathToJSONTraversal,
countDistinct,
createJSONQuery,
defaultDrizzleSnapshot,
deleteWhere,
dropDatabase,
execute,
init,
insert,
requireDrizzleKit,
} from '@payloadcms/drizzle/sqlite'
import { like, notLike } from 'drizzle-orm'
import { createDatabaseAdapter, defaultBeginTransaction } from 'payload'
import { fileURLToPath } from 'url'
import type { Args, SQLiteAdapter } from './types.js'
import { columnToCodeConverter } from './columnToCodeConverter.js'
import { connect } from './connect.js'
import { countDistinct } from './countDistinct.js'
import { convertPathToJSONTraversal } from './createJSONQuery/convertPathToJSONTraversal.js'
import { createJSONQuery } from './createJSONQuery/index.js'
import { defaultDrizzleSnapshot } from './defaultSnapshot.js'
import { deleteWhere } from './deleteWhere.js'
import { dropDatabase } from './dropDatabase.js'
import { execute } from './execute.js'
import { init } from './init.js'
import { insert } from './insert.js'
import { requireDrizzleKit } from './requireDrizzleKit.js'
const filename = fileURLToPath(import.meta.url)
@@ -69,8 +71,8 @@ export function sqliteAdapter(args: Args): DatabaseAdapterObj<SQLiteAdapter> {
function adapter({ payload }: { payload: Payload }) {
const migrationDir = findMigrationDir(args.migrationDir)
let resolveInitializing
let rejectInitializing
let resolveInitializing: () => void = () => {}
let rejectInitializing: () => void = () => {}
const initializing = new Promise<void>((res, rej) => {
resolveInitializing = res
@@ -131,7 +133,6 @@ export function sqliteAdapter(args: Args): DatabaseAdapterObj<SQLiteAdapter> {
updateJobs,
updateMany,
versionsSuffix: args.versionsSuffix || '_v',
// DatabaseAdapter
beginTransaction: args.transactionOptions ? beginTransaction : defaultBeginTransaction(),
commitTransaction,
@@ -166,7 +167,6 @@ export function sqliteAdapter(args: Args): DatabaseAdapterObj<SQLiteAdapter> {
find,
findGlobal,
findGlobalVersions,
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
findOne,
findVersions,
indexes: new Set<string>(),
@@ -182,10 +182,8 @@ export function sqliteAdapter(args: Args): DatabaseAdapterObj<SQLiteAdapter> {
packageName: '@payloadcms/db-sqlite',
payload,
queryDrafts,
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
rejectInitializing,
requireDrizzleKit,
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
resolveInitializing,
rollbackTransaction,
updateGlobal,

View File

@@ -1,5 +1,6 @@
import type { Client, Config, ResultSet } from '@libsql/client'
import type { extendDrizzleTable, Operators } from '@payloadcms/drizzle'
import type { BaseSQLiteAdapter, BaseSQLiteArgs } from '@payloadcms/drizzle/sqlite'
import type { BuildQueryJoinAliases, DrizzleAdapter } from '@payloadcms/drizzle/types'
import type { DrizzleConfig, Relation, Relations, SQL } from 'drizzle-orm'
import type { LibSQLDatabase } from 'drizzle-orm/libsql'
@@ -56,23 +57,7 @@ export type Args = {
*/
blocksAsJSON?: boolean
client: Config
/** Generated schema from payload generate:db-schema file path */
generateSchemaOutputFile?: string
idType?: 'number' | 'uuid'
localesSuffix?: string
logger?: DrizzleConfig['logger']
migrationDir?: string
prodMigrations?: {
down: (args: MigrateDownArgs) => Promise<void>
name: string
up: (args: MigrateUpArgs) => Promise<void>
}[]
push?: boolean
relationshipsSuffix?: string
schemaName?: string
transactionOptions?: false | SQLiteTransactionConfig
versionsSuffix?: string
}
} & BaseSQLiteArgs
export type GenericColumns = {
[x: string]: AnySQLiteColumn
@@ -142,45 +127,11 @@ type ResolveSchemaType<T> = 'schema' extends keyof T
type Drizzle = { $client: Client } & LibSQLDatabase<ResolveSchemaType<GeneratedDatabaseSchema>>
export type SQLiteAdapter = {
afterSchemaInit: SQLiteSchemaHook[]
autoIncrement: boolean
beforeSchemaInit: SQLiteSchemaHook[]
client: Client
clientConfig: Args['client']
countDistinct: CountDistinct
defaultDrizzleSnapshot: any
deleteWhere: DeleteWhere
drizzle: Drizzle
dropDatabase: DropDatabase
execute: Execute<unknown>
/**
* An object keyed on each table, with a key value pair where the constraint name is the key, followed by the dot-notation field name
* Used for returning properly formed errors from unique fields
*/
fieldConstraints: Record<string, Record<string, string>>
idType: Args['idType']
initializing: Promise<void>
insert: Insert
localesSuffix?: string
logger: DrizzleConfig['logger']
operators: Operators
prodMigrations?: {
down: (args: MigrateDownArgs) => Promise<void>
name: string
up: (args: MigrateUpArgs) => Promise<void>
}[]
push: boolean
rejectInitializing: () => void
relations: Record<string, GenericRelation>
relationshipsSuffix?: string
resolveInitializing: () => void
schema: Record<string, GenericRelation | GenericTable>
schemaName?: Args['schemaName']
tableNameMap: Map<string, string>
tables: Record<string, GenericTable>
transactionOptions: SQLiteTransactionConfig
versionsSuffix?: string
} & SQLiteDrizzleAdapter
} & BaseSQLiteAdapter &
SQLiteDrizzleAdapter
export type IDType = 'integer' | 'numeric' | 'text'

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-vercel-postgres",
"version": "3.48.0",
"version": "3.49.0",
"description": "Vercel Postgres adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -178,8 +178,6 @@ export function vercelPostgresAdapter(args: Args = {}): DatabaseAdapterObj<Verce
findDistinct,
findGlobal,
findGlobalVersions,
readReplicaOptions: args.readReplicas,
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
findOne,
findVersions,
init,
@@ -194,6 +192,7 @@ export function vercelPostgresAdapter(args: Args = {}): DatabaseAdapterObj<Verce
packageName: '@payloadcms/db-vercel-postgres',
payload,
queryDrafts,
readReplicaOptions: args.readReplicas,
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
rejectInitializing,
requireDrizzleKit,

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/drizzle",
"version": "3.48.0",
"version": "3.49.0",
"description": "A library of shared functions used by different payload database adapters",
"homepage": "https://payloadcms.com",
"repository": {
@@ -30,6 +30,11 @@
"types": "./src/exports/postgres.ts",
"default": "./src/exports/postgres.ts"
},
"./sqlite": {
"import": "./src/exports/sqlite.ts",
"types": "./src/exports/sqlite.ts",
"default": "./src/exports/sqlite.ts"
},
"./types": {
"import": "./src/exports/types-deprecated.ts",
"types": "./src/exports/types-deprecated.ts",
@@ -82,6 +87,11 @@
"types": "./dist/exports/postgres.d.ts",
"default": "./dist/exports/postgres.js"
},
"./sqlite": {
"import": "./dist/exports/sqlite.js",
"types": "./dist/exports/sqlite.d.ts",
"default": "./dist/exports/sqlite.js"
},
"./types": {
"import": "./dist/exports/types-deprecated.js",
"types": "./dist/exports/types-deprecated.d.ts",

View File

@@ -23,7 +23,7 @@ export async function createGlobalVersion<T extends TypeWithID>(
updatedAt,
versionData,
}: CreateGlobalVersionArgs,
) {
): Promise<TypeWithVersion<T>> {
const db = await getTransaction(this, req)
const global = this.payload.globals.config.find(({ slug }) => slug === globalSlug)

View File

@@ -24,7 +24,7 @@ export async function createVersion<T extends TypeWithID>(
updatedAt,
versionData,
}: CreateVersionArgs<T>,
) {
): Promise<TypeWithVersion<T>> {
const db = await getTransaction(this, req)
const collection = this.payload.collections[collectionSlug].config
const defaultTableName = toSnakeCase(collection.slug)

View File

@@ -6,41 +6,58 @@ import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { findMany } from './find/findMany.js'
import { buildQuery } from './queries/buildQuery.js'
import { getTransaction } from './utilities/getTransaction.js'
export const deleteMany: DeleteMany = async function deleteMany(
this: DrizzleAdapter,
{ collection, req, where },
{ collection, req, where: whereArg },
) {
const db = await getTransaction(this, req)
const collectionConfig = this.payload.collections[collection].config
const tableName = this.tableNameMap.get(toSnakeCase(collectionConfig.slug))
const result = await findMany({
const table = this.tables[tableName]
const { joins, where } = buildQuery({
adapter: this,
fields: collectionConfig.flattenedFields,
joins: false,
limit: 0,
locale: req?.locale,
page: 1,
pagination: false,
req,
tableName,
where,
where: whereArg,
})
const ids = []
let whereToUse = where
result.docs.forEach((data) => {
ids.push(data.id)
})
if (ids.length > 0) {
await this.deleteWhere({
db,
if (joins?.length) {
// Joins (a where clause referencing other tables) are difficult to support in deleteMany, so we fall back to two separate queries.
// We could support this with a single query (via a subquery) in the future, but that is hard to do generically.
const result = await findMany({
adapter: this,
fields: collectionConfig.flattenedFields,
joins: false,
limit: 0,
locale: req?.locale,
page: 1,
pagination: false,
req,
select: {
id: true,
},
tableName,
where: inArray(this.tables[tableName].id, ids),
where: whereArg,
})
whereToUse = inArray(
table.id,
result.docs.map((doc) => doc.id),
)
}
await this.deleteWhere({
db,
tableName,
where: whereToUse,
})
}
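
A short illustration (assumed names, with `payload` and `req` in scope) of what the branching above means at the adapter API level: a `where` that only touches the collection's own columns becomes a single DELETE, while a `where` that needs joins falls back to selecting ids first.

```ts
// Single roundtrip: `title` lives on the posts table itself, so buildQuery produces no joins.
await payload.db.deleteMany({
  collection: 'posts',
  req,
  where: { title: { equals: 'draft' } },
})

// Two roundtrips: querying across a relationship forces joins, so ids are selected first
// (with `select: { id: true }`) and then deleted via inArray.
await payload.db.deleteMany({
  collection: 'posts',
  req,
  where: { 'author.name': { equals: 'Alice' } }, // assumes a relationship field named `author`
})
```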

View File

@@ -0,0 +1,12 @@
export { columnToCodeConverter } from '../sqlite/columnToCodeConverter.js'
export { countDistinct } from '../sqlite/countDistinct.js'
export { convertPathToJSONTraversal } from '../sqlite/createJSONQuery/convertPathToJSONTraversal.js'
export { createJSONQuery } from '../sqlite/createJSONQuery/index.js'
export { defaultDrizzleSnapshot } from '../sqlite/defaultSnapshot.js'
export { deleteWhere } from '../sqlite/deleteWhere.js'
export { dropDatabase } from '../sqlite/dropDatabase.js'
export { execute } from '../sqlite/execute.js'
export { init } from '../sqlite/init.js'
export { insert } from '../sqlite/insert.js'
export { requireDrizzleKit } from '../sqlite/requireDrizzleKit.js'
export * from '../sqlite/types.js'

View File

@@ -9,7 +9,7 @@ import { findMany } from './find/findMany.js'
export async function findOne<T extends TypeWithID>(
this: DrizzleAdapter,
{ collection, draftsEnabled, joins, locale, req, select, where }: FindOneArgs,
): Promise<T> {
): Promise<null | T> {
const collectionConfig: SanitizedCollectionConfig = this.payload.collections[collection].config
const tableName = this.tableNameMap.get(toSnakeCase(collectionConfig.slug))

View File

@@ -30,8 +30,8 @@ export const countDistinct: CountDistinct = async function countDistinct(
.limit(1)
.$dynamic()
joins.forEach(({ condition, table }) => {
query = query.leftJoin(table as PgTableWithColumns<any>, condition)
joins.forEach(({ type, condition, table }) => {
query = query[type ?? 'leftJoin'](table as PgTableWithColumns<any>, condition)
})
// When we have any joins, we need to count each individual ID only once.

View File

@@ -219,7 +219,10 @@ export function parseParams({
if (
operator === 'like' &&
(field.type === 'number' || table[columnName].columnType === 'PgUUID')
(field.type === 'number' ||
field.type === 'relationship' ||
field.type === 'upload' ||
table[columnName].columnType === 'PgUUID')
) {
operator = 'equals'
}

View File

@@ -112,9 +112,14 @@ export const sanitizeQueryValue = ({
if (field.type === 'date' && operator !== 'exists') {
if (typeof val === 'string') {
formattedValue = new Date(val).toISOString()
if (Number.isNaN(Date.parse(formattedValue))) {
return { operator, value: undefined }
if (val === 'null' || val === '') {
formattedValue = null
} else {
const date = new Date(val)
if (Number.isNaN(date.getTime())) {
return { operator, value: undefined }
}
formattedValue = date.toISOString()
}
} else if (typeof val === 'number') {
formattedValue = new Date(val).toISOString()

View File

@@ -56,8 +56,8 @@ export const selectDistinct = ({
query = query.where(where)
}
joins.forEach(({ condition, table }) => {
query = query.leftJoin(table, condition)
joins.forEach(({ type, condition, table }) => {
query = query[type ?? 'leftJoin'](table, condition)
})
return queryModifier({

View File

@@ -1,4 +1,4 @@
import type { ColumnToCodeConverter } from '@payloadcms/drizzle/types'
import type { ColumnToCodeConverter } from '../types.js'
export const columnToCodeConverter: ColumnToCodeConverter = ({
adapter,

View File

@@ -2,10 +2,10 @@ import type { SQLiteSelect } from 'drizzle-orm/sqlite-core'
import { count, sql } from 'drizzle-orm'
import type { CountDistinct, SQLiteAdapter } from './types.js'
import type { BaseSQLiteAdapter, CountDistinct } from './types.js'
export const countDistinct: CountDistinct = async function countDistinct(
this: SQLiteAdapter,
this: BaseSQLiteAdapter,
{ column, db, joins, tableName, where },
) {
// When we don't have any joins - use a simple COUNT(*) query.
@@ -29,8 +29,8 @@ export const countDistinct: CountDistinct = async function countDistinct(
.limit(1)
.$dynamic()
joins.forEach(({ condition, table }) => {
query = query.leftJoin(table, condition)
joins.forEach(({ type, condition, table }) => {
query = query[type ?? 'leftJoin'](table, condition)
})
// When we have any joins, we need to count each individual ID only once.

View File

@@ -1,4 +1,4 @@
import type { CreateJSONQueryArgs } from '@payloadcms/drizzle/types'
import type { CreateJSONQueryArgs } from '../../types.js'
type FromArrayArgs = {
isRoot?: true
@@ -74,7 +74,7 @@ export const createJSONQuery = ({
treatAsArray,
value,
}: CreateJSONQueryArgs): string => {
if (treatAsArray?.includes(pathSegments[1]!) && table) {
if (treatAsArray?.includes(pathSegments[1]) && table) {
return fromArray({
operator,
pathSegments,

View File

@@ -1,9 +1,9 @@
import type { DeleteWhere, SQLiteAdapter } from './types.js'
import type { BaseSQLiteAdapter, DeleteWhere } from './types.js'
export const deleteWhere: DeleteWhere = async function (
// Here 'this' is not a parameter. See:
// https://www.typescriptlang.org/docs/handbook/2/classes.html#this-parameters
this: SQLiteAdapter,
this: BaseSQLiteAdapter,
{ db, tableName, where },
) {
const table = this.tables[tableName]

View File

@@ -1,15 +1,15 @@
import type { Row } from '@libsql/client'
import type { DropDatabase, SQLiteAdapter } from './types.js'
import type { BaseSQLiteAdapter, DropDatabase } from './types.js'
const getTables = (adapter: SQLiteAdapter) => {
const getTables = (adapter: BaseSQLiteAdapter) => {
return adapter.client.execute(`SELECT name
FROM sqlite_master
WHERE type = 'table'
AND name NOT LIKE 'sqlite_%';`)
}
const dropTables = (adapter: SQLiteAdapter, rows: Row[]) => {
const dropTables = (adapter: BaseSQLiteAdapter, rows: Row[]) => {
const multi = `
PRAGMA foreign_keys = OFF;\n
${rows.map(({ name }) => `DROP TABLE IF EXISTS ${name as string}`).join(';\n ')};\n

View File

@@ -3,13 +3,13 @@ import { sql } from 'drizzle-orm'
import type { Execute } from './types.js'
export const execute: Execute<any> = function execute({ db, drizzle, raw, sql: statement }) {
const executeFrom = (db ?? drizzle)!
const executeFrom = (db ?? drizzle)
if (raw) {
const result = executeFrom.run(sql.raw(raw))
return result
} else {
const result = executeFrom.run(statement!)
const result = executeFrom.run(statement)
return result
}
}

View File

@@ -1,14 +1,15 @@
import type { DrizzleAdapter } from '@payloadcms/drizzle/types'
import type { Init } from 'payload'
import { buildDrizzleRelations, buildRawSchema, executeSchemaHooks } from '@payloadcms/drizzle'
import type { SQLiteAdapter } from './types.js'
import type { DrizzleAdapter } from '../types.js'
import type { BaseSQLiteAdapter } from './types.js'
import { buildDrizzleRelations } from '../schema/buildDrizzleRelations.js'
import { buildRawSchema } from '../schema/buildRawSchema.js'
import { executeSchemaHooks } from '../utilities/executeSchemaHooks.js'
import { buildDrizzleTable } from './schema/buildDrizzleTable.js'
import { setColumnID } from './schema/setColumnID.js'
export const init: Init = async function init(this: SQLiteAdapter) {
export const init: Init = async function init(this: BaseSQLiteAdapter) {
let locales: string[] | undefined
this.rawRelations = {}
@@ -28,7 +29,7 @@ export const init: Init = async function init(this: SQLiteAdapter) {
await executeSchemaHooks({ type: 'beforeSchemaInit', adapter: this })
for (const tableName in this.rawTables) {
buildDrizzleTable({ adapter, locales: locales!, rawTable: this.rawTables[tableName]! })
buildDrizzleTable({ adapter, locales, rawTable: this.rawTables[tableName] })
}
buildDrizzleRelations({

View File

@@ -1,9 +1,9 @@
import type { Insert, SQLiteAdapter } from './types.js'
import type { BaseSQLiteAdapter, Insert } from './types.js'
export const insert: Insert = async function (
// Here 'this' is not a parameter. See:
// https://www.typescriptlang.org/docs/handbook/2/classes.html#this-parameters
this: SQLiteAdapter,
this: BaseSQLiteAdapter,
{ db, onConflictDoUpdate, tableName, values },
): Promise<Record<string, unknown>[]> {
const table = this.tables[tableName]

View File

@@ -1,7 +1,7 @@
import type { RequireDrizzleKit } from '@payloadcms/drizzle/types'
import { createRequire } from 'module'
import type { RequireDrizzleKit } from '../types.js'
const require = createRequire(import.meta.url)
export const requireDrizzleKit: RequireDrizzleKit = () => {

View File

@@ -1,4 +1,3 @@
import type { BuildDrizzleTable, RawColumn } from '@payloadcms/drizzle/types'
import type { ForeignKeyBuilder, IndexBuilder } from 'drizzle-orm/sqlite-core'
import { sql } from 'drizzle-orm'
@@ -13,6 +12,8 @@ import {
} from 'drizzle-orm/sqlite-core'
import { v4 as uuidv4 } from 'uuid'
import type { BuildDrizzleTable, RawColumn } from '../../types.js'
const rawColumnBuilderMap: Partial<Record<RawColumn['type'], any>> = {
integer,
numeric,

View File

@@ -1,6 +1,5 @@
import type { SetColumnID } from '@payloadcms/drizzle/types'
import type { SQLiteAdapter } from '../types.js'
import type { SetColumnID } from '../../types.js'
import type { BaseSQLiteAdapter } from '../types.js'
export const setColumnID: SetColumnID = ({ adapter, columns, fields }) => {
const idField = fields.find((field) => field.name === 'id')
@@ -38,7 +37,7 @@ export const setColumnID: SetColumnID = ({ adapter, columns, fields }) => {
columns.id = {
name: 'id',
type: 'integer',
autoIncrement: (adapter as unknown as SQLiteAdapter).autoIncrement,
autoIncrement: (adapter as unknown as BaseSQLiteAdapter).autoIncrement,
primaryKey: true,
}

View File

@@ -0,0 +1,244 @@
import type { Client, ResultSet } from '@libsql/client'
import type { DrizzleConfig, Relation, Relations, SQL } from 'drizzle-orm'
import type { DrizzleD1Database } from 'drizzle-orm/d1'
import type { LibSQLDatabase } from 'drizzle-orm/libsql'
import type {
AnySQLiteColumn,
SQLiteColumn,
SQLiteInsertOnConflictDoUpdateConfig,
SQLiteTableWithColumns,
SQLiteTransactionConfig,
} from 'drizzle-orm/sqlite-core'
import type { SQLiteRaw } from 'drizzle-orm/sqlite-core/query-builders/raw'
import type { Payload, PayloadRequest } from 'payload'
import type { Operators } from '../queries/operatorMap.js'
import type { BuildQueryJoinAliases, DrizzleAdapter } from '../types.js'
import type { extendDrizzleTable } from '../utilities/extendDrizzleTable.js'
type SQLiteSchema = {
relations: Record<string, GenericRelation>
tables: Record<string, SQLiteTableWithColumns<any>>
}
type SQLiteSchemaHookArgs = {
extendTable: typeof extendDrizzleTable
schema: SQLiteSchema
}
export type SQLiteSchemaHook = (args: SQLiteSchemaHookArgs) => Promise<SQLiteSchema> | SQLiteSchema
export type BaseSQLiteArgs = {
/**
* Transform the schema after it's built.
* You can use it to customize the schema with features that aren't supported by Payload.
* Examples may include: composite indices, generated columns, vectors
*/
afterSchemaInit?: SQLiteSchemaHook[]
/**
* Enable this flag if you want to pass your own ID in the `create` operation data, for example:
* ```ts
* // doc created with id 1
* const doc = await payload.create({ collection: 'posts', data: {id: 1, title: "my title"}})
* ```
*/
allowIDOnCreate?: boolean
/**
* Enable [AUTOINCREMENT](https://www.sqlite.org/autoinc.html) for Primary Keys.
* This ensures that the same ID cannot be reused from previously deleted rows.
*/
autoIncrement?: boolean
/**
* Transform the schema before it's built.
* You can use it to preserve an existing database schema; if there are any collisions, Payload will override them.
* To generate Drizzle schema from the database, see [Drizzle Kit introspection](https://orm.drizzle.team/kit-docs/commands#introspect--pull)
*/
beforeSchemaInit?: SQLiteSchemaHook[]
/** Generated schema from payload generate:db-schema file path */
generateSchemaOutputFile?: string
idType?: 'number' | 'uuid'
localesSuffix?: string
logger?: DrizzleConfig['logger']
migrationDir?: string
prodMigrations?: {
down: (args: MigrateDownArgs) => Promise<void>
name: string
up: (args: MigrateUpArgs) => Promise<void>
}[]
push?: boolean
relationshipsSuffix?: string
schemaName?: string
transactionOptions?: false | SQLiteTransactionConfig
versionsSuffix?: string
}
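
A hedged example of the `afterSchemaInit` hook described above (table and column names are assumptions, and the `extendTable` shape mirrors the documented schema hooks): adding an extra column and a composite index to a generated table.

```ts
import type { SQLiteSchemaHook } from '@payloadcms/db-sqlite'
import { index, integer } from 'drizzle-orm/sqlite-core'

const addCompositeIndex: SQLiteSchemaHook = ({ extendTable, schema }) => {
  extendTable({
    table: schema.tables.places, // assumed table generated from a `places` collection
    columns: {
      extraIntegerColumn: integer('extra_integer_column'),
    },
    extraConfig: (table) => ({
      countryCityIdx: index('country_city_idx').on(table.country, table.city),
    }),
  })
  return schema
}

// Usage (assumption): sqliteAdapter({ client: { url: 'file:./payload.db' }, afterSchemaInit: [addCompositeIndex] })
```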
export type GenericColumns = {
[x: string]: AnySQLiteColumn
}
export type GenericTable = SQLiteTableWithColumns<{
columns: GenericColumns
dialect: string
name: string
schema: string
}>
export type GenericRelation = Relations<string, Record<string, Relation<string>>>
export type CountDistinct = (args: {
column?: SQLiteColumn<any>
db: LibSQLDatabase
joins: BuildQueryJoinAliases
tableName: string
where: SQL
}) => Promise<number>
export type DeleteWhere = (args: {
db: LibSQLDatabase
tableName: string
where: SQL
}) => Promise<void>
export type DropDatabase = (args: { adapter: BaseSQLiteAdapter }) => Promise<void>
export type Execute<T> = (args: {
db?: DrizzleD1Database | LibSQLDatabase
drizzle?: DrizzleD1Database | LibSQLDatabase
raw?: string
sql?: SQL<unknown>
}) => SQLiteRaw<Promise<T>> | SQLiteRaw<ResultSet>
export type Insert = (args: {
db: LibSQLDatabase
onConflictDoUpdate?: SQLiteInsertOnConflictDoUpdateConfig<any>
tableName: string
values: Record<string, unknown> | Record<string, unknown>[]
}) => Promise<Record<string, unknown>[]>
// Explicitly omit drizzle property for complete override in SQLiteAdapter, required in ts 5.5
type SQLiteDrizzleAdapter = Omit<
DrizzleAdapter,
| 'countDistinct'
| 'deleteWhere'
| 'drizzle'
| 'dropDatabase'
| 'execute'
| 'idType'
| 'insert'
| 'operators'
| 'relations'
>
export interface GeneratedDatabaseSchema {
schemaUntyped: Record<string, unknown>
}
type ResolveSchemaType<T> = 'schema' extends keyof T
? T['schema']
: GeneratedDatabaseSchema['schemaUntyped']
type Drizzle = { $client: Client } & LibSQLDatabase<ResolveSchemaType<GeneratedDatabaseSchema>>
export type BaseSQLiteAdapter = {
afterSchemaInit: SQLiteSchemaHook[]
autoIncrement: boolean
beforeSchemaInit: SQLiteSchemaHook[]
client: Client
countDistinct: CountDistinct
defaultDrizzleSnapshot: any
deleteWhere: DeleteWhere
dropDatabase: DropDatabase
execute: Execute<unknown>
/**
* An object keyed on each table, mapping each constraint name to its dot-notation field name
* Used for returning properly formed errors from unique fields
*/
fieldConstraints: Record<string, Record<string, string>>
idType: BaseSQLiteArgs['idType']
initializing: Promise<void>
insert: Insert
localesSuffix?: string
logger: DrizzleConfig['logger']
operators: Operators
prodMigrations?: {
down: (args: MigrateDownArgs) => Promise<void>
name: string
up: (args: MigrateUpArgs) => Promise<void>
}[]
push: boolean
rejectInitializing: () => void
relations: Record<string, GenericRelation>
relationshipsSuffix?: string
resolveInitializing: () => void
schema: Record<string, GenericRelation | GenericTable>
schemaName?: BaseSQLiteArgs['schemaName']
tableNameMap: Map<string, string>
tables: Record<string, GenericTable>
transactionOptions: SQLiteTransactionConfig
versionsSuffix?: string
} & SQLiteDrizzleAdapter
export type IDType = 'integer' | 'numeric' | 'text'
export type MigrateUpArgs = {
/**
* The SQLite Drizzle instance that you can use to execute SQL directly within the current transaction.
* @example
* ```ts
* import { type MigrateUpArgs, sql } from '@payloadcms/db-sqlite'
*
* export async function up({ db, payload, req }: MigrateUpArgs): Promise<void> {
* const { rows: posts } = await db.run(sql`SELECT * FROM posts`)
* }
* ```
*/
db: Drizzle
/**
* The Payload instance that you can use to execute Local API methods
* To use the current transaction you must pass `req` to arguments
* @example
* ```ts
* import { type MigrateUpArgs } from '@payloadcms/db-sqlite'
*
* export async function up({ db, payload, req }: MigrateUpArgs): Promise<void> {
* const posts = await payload.find({ collection: 'posts', req })
* }
* ```
*/
payload: Payload
/**
* The `PayloadRequest` object that contains the current transaction
*/
req: PayloadRequest
}
export type MigrateDownArgs = {
/**
* The SQLite Drizzle instance that you can use to execute SQL directly within the current transaction.
* @example
* ```ts
* import { type MigrateDownArgs, sql } from '@payloadcms/db-sqlite'
*
* export async function down({ db, payload, req }: MigrateDownArgs): Promise<void> {
* const { rows: posts } = await db.run(sql`SELECT * FROM posts`)
* }
* ```
*/
db: Drizzle
/**
* The Payload instance that you can use to execute Local API methods
* To use the current transaction you must pass `req` to arguments
* @example
* ```ts
* import { type MigrateDownArgs } from '@payloadcms/db-sqlite'
*
* export async function down({ db, payload, req }: MigrateDownArgs): Promise<void> {
* const posts = await payload.find({ collection: 'posts', req })
* }
* ```
*/
payload: Payload
/**
* The `PayloadRequest` object that contains the current transaction
*/
req: PayloadRequest
}

View File

@@ -26,7 +26,7 @@ export async function updateGlobalVersion<T extends TypeWithID>(
versionData,
where: whereArg,
}: UpdateGlobalVersionArgs<T>,
) {
): Promise<TypeWithVersion<T>> {
const db = await getTransaction(this, req)
const globalConfig: SanitizedGlobalConfig = this.payload.globals.config.find(
({ slug }) => slug === global,

View File

@@ -26,7 +26,7 @@ export async function updateVersion<T extends TypeWithID>(
versionData,
where: whereArg,
}: UpdateVersionArgs<T>,
) {
): Promise<TypeWithVersion<T>> {
const db = await getTransaction(this, req)
const collectionConfig: SanitizedCollectionConfig = this.payload.collections[collection].config
const whereToUse = whereArg || { id: { equals: id } }

View File

@@ -14,10 +14,12 @@ export const buildCreateMigration = ({
executeMethod,
filename,
sanitizeStatements,
sqlOnly,
}: {
executeMethod: string
filename: string
sanitizeStatements: (args: { sqlExecute: string; statements: string[] }) => string
sqlOnly?: boolean
}): CreateMigration => {
const dirname = path.dirname(filename)
return async function createMigration(
@@ -78,7 +80,9 @@ export const buildCreateMigration = ({
}
const sqlStatementsUp = await generateMigration(drizzleJsonBefore, drizzleJsonAfter)
const sqlStatementsDown = await generateMigration(drizzleJsonAfter, drizzleJsonBefore)
const sqlStatementsDown = sqlOnly
? []
: await generateMigration(drizzleJsonAfter, drizzleJsonBefore)
const sqlExecute = `await db.${executeMethod}(` + 'sql`'
if (sqlStatementsUp?.length) {
@@ -116,19 +120,22 @@ export const buildCreateMigration = ({
fs.writeFileSync(`${filePath}.json`, JSON.stringify(drizzleJsonAfter, null, 2))
}
const data = sqlOnly
? upSQL
: getMigrationTemplate({
downSQL: downSQL || ` // Migration code`,
imports,
packageName: payload.db.packageName,
upSQL: upSQL || ` // Migration code`,
})
const fullPath = sqlOnly ? `${filePath}.sql` : `${filePath}.ts`
// write migration
fs.writeFileSync(
`${filePath}.ts`,
getMigrationTemplate({
downSQL: downSQL || ` // Migration code`,
imports,
packageName: payload.db.packageName,
upSQL: upSQL || ` // Migration code`,
}),
)
fs.writeFileSync(fullPath, data)
writeMigrationIndex({ migrationsDir: payload.db.migrationDir })
payload.logger.info({ msg: `Migration created at ${filePath}.ts` })
payload.logger.info({ msg: `Migration created at ${fullPath}` })
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/email-nodemailer",
"version": "3.48.0",
"version": "3.49.0",
"description": "Payload Nodemailer Email Adapter",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/email-resend",
"version": "3.48.0",
"version": "3.49.0",
"description": "Payload Resend Email Adapter",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -151,6 +151,7 @@ export const rootEslintConfig = [
'../db-postgres/relationships-v2-v3.mjs',
'../db-postgres/scripts/renamePredefinedMigrations.ts',
'../db-sqlite/bundle.js',
'../db-d1-sqlite/bundle.js',
'../db-vercel-postgres/relationships-v2-v3.mjs',
'../db-vercel-postgres/scripts/renamePredefinedMigrations.ts',
'../plugin-cloud-storage/azure.d.ts',

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/graphql",
"version": "3.48.0",
"version": "3.49.0",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",

View File

@@ -11,6 +11,7 @@ export type Resolver<TSlug extends CollectionSlug> = (
fallbackLocale?: string
id: number | string
locale?: string
trash?: boolean
},
context: {
req: PayloadRequest
@@ -49,6 +50,7 @@ export function getDeleteResolver<TSlug extends CollectionSlug>(
collection,
depth: 0,
req: isolateObjectProperty(req, 'transactionID'),
trash: args.trash,
}
const result = await deleteByIDOperation(options)

View File

@@ -15,6 +15,7 @@ export type Resolver = (
page?: number
pagination?: boolean
sort?: string
trash?: boolean
where?: Where
},
context: {
@@ -57,6 +58,7 @@ export function findResolver(collection: Collection): Resolver {
pagination: args.pagination,
req,
sort: args.sort,
trash: args.trash,
where: args.where,
}

View File

@@ -11,6 +11,7 @@ export type Resolver<TData> = (
fallbackLocale?: string
id: string
locale?: string
trash?: boolean
},
context: {
req: PayloadRequest
@@ -50,6 +51,7 @@ export function findByIDResolver<TSlug extends CollectionSlug>(
depth: 0,
draft: args.draft,
req: isolateObjectProperty(req, 'transactionID'),
trash: args.trash,
}
const result = await findByIDOperation(options)

View File

@@ -10,6 +10,7 @@ export type Resolver<T extends TypeWithID = any> = (
fallbackLocale?: string
id: number | string
locale?: string
trash?: boolean
},
context: {
req: PayloadRequest
@@ -33,6 +34,7 @@ export function findVersionByIDResolver(collection: Collection): Resolver {
collection,
depth: 0,
req: isolateObjectProperty(req, 'transactionID'),
trash: args.trash,
}
const result = await findVersionByIDOperation(options)

View File

@@ -14,6 +14,7 @@ export type Resolver = (
page?: number
pagination?: boolean
sort?: string
trash?: boolean
where: Where
},
context: {
@@ -54,6 +55,7 @@ export function findVersionsResolver(collection: Collection): Resolver {
pagination: args.pagination,
req: isolateObjectProperty(req, 'transactionID'),
sort: args.sort,
trash: args.trash,
where: args.where,
}

View File

@@ -13,6 +13,7 @@ export type Resolver<TSlug extends CollectionSlug> = (
fallbackLocale?: string
id: number | string
locale?: string
trash?: boolean
},
context: {
req: PayloadRequest
@@ -54,6 +55,7 @@ export function updateResolver<TSlug extends CollectionSlug>(
depth: 0,
draft: args.draft,
req: isolateObjectProperty(req, 'transactionID'),
trash: args.trash,
}
const result = await updateByIDOperation<TSlug>(options)

View File

@@ -205,6 +205,7 @@ export function initCollections({ config, graphqlResult }: InitCollectionsGraphQ
locale: { type: graphqlResult.types.localeInputType },
}
: {}),
trash: { type: GraphQLBoolean },
},
resolve: findByIDResolver(collection),
}
@@ -224,6 +225,7 @@ export function initCollections({ config, graphqlResult }: InitCollectionsGraphQ
page: { type: GraphQLInt },
pagination: { type: GraphQLBoolean },
sort: { type: GraphQLString },
trash: { type: GraphQLBoolean },
},
resolve: findResolver(collection),
}
@@ -292,6 +294,7 @@ export function initCollections({ config, graphqlResult }: InitCollectionsGraphQ
locale: { type: graphqlResult.types.localeInputType },
}
: {}),
trash: { type: GraphQLBoolean },
},
resolve: updateResolver(collection),
}
@@ -300,6 +303,7 @@ export function initCollections({ config, graphqlResult }: InitCollectionsGraphQ
type: collection.graphQL.type,
args: {
id: { type: new GraphQLNonNull(idType) },
trash: { type: GraphQLBoolean },
},
resolve: getDeleteResolver(collection),
}
@@ -329,12 +333,12 @@ export function initCollections({ config, graphqlResult }: InitCollectionsGraphQ
{
name: 'createdAt',
type: 'date',
label: 'Created At',
label: ({ t }) => t('general:createdAt'),
},
{
name: 'updatedAt',
type: 'date',
label: 'Updated At',
label: ({ t }) => t('general:updatedAt'),
},
]
@@ -359,6 +363,7 @@ export function initCollections({ config, graphqlResult }: InitCollectionsGraphQ
locale: { type: graphqlResult.types.localeInputType },
}
: {}),
trash: { type: GraphQLBoolean },
},
resolve: findVersionByIDResolver(collection),
}
@@ -385,6 +390,7 @@ export function initCollections({ config, graphqlResult }: InitCollectionsGraphQ
page: { type: GraphQLInt },
pagination: { type: GraphQLBoolean },
sort: { type: GraphQLString },
trash: { type: GraphQLBoolean },
},
resolve: findVersionsResolver(collection),
}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview-react",
"version": "3.48.0",
"version": "3.49.0",
"description": "The official React SDK for Payload Live Preview",
"homepage": "https://payloadcms.com",
"repository": {
@@ -46,8 +46,8 @@
},
"devDependencies": {
"@payloadcms/eslint-config": "workspace:*",
"@types/react": "19.1.0",
"@types/react-dom": "19.1.2",
"@types/react": "19.1.8",
"@types/react-dom": "19.1.6",
"payload": "workspace:*"
},
"peerDependencies": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview-vue",
"version": "3.48.0",
"version": "3.49.0",
"description": "The official Vue SDK for Payload Live Preview",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview",
"version": "3.48.0",
"version": "3.49.0",
"description": "The official live preview JavaScript SDK for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/next",
"version": "3.48.0",
"version": "3.49.0",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",
@@ -117,11 +117,11 @@
"@babel/preset-env": "7.27.2",
"@babel/preset-react": "7.27.1",
"@babel/preset-typescript": "7.27.1",
"@next/eslint-plugin-next": "15.3.2",
"@next/eslint-plugin-next": "15.4.4",
"@payloadcms/eslint-config": "workspace:*",
"@types/busboy": "1.5.4",
"@types/react": "19.1.0",
"@types/react-dom": "19.1.2",
"@types/react": "19.1.8",
"@types/react-dom": "19.1.6",
"@types/uuid": "10.0.0",
"babel-plugin-react-compiler": "19.1.0-rc.2",
"esbuild": "0.25.5",

View File

@@ -38,9 +38,12 @@ export const DocumentTabLink: React.FC<{
path: `/${isCollection ? 'collections' : 'globals'}/${entitySlug}`,
})
if (isCollection && segmentThree) {
// doc ID
docPath += `/${segmentThree}`
if (isCollection) {
if (segmentThree === 'trash' && segmentFour) {
docPath += `/trash/${segmentFour}`
} else if (segmentThree) {
docPath += `/${segmentThree}`
}
}
const href = `${docPath}${hrefFromProps}`

View File

@@ -5,8 +5,6 @@ import type {
SanitizedGlobalConfig,
} from 'payload'
import { fieldAffectsData } from 'payload/shared'
import { getRouteWithoutAdmin, isAdminRoute } from './shared.js'
type Args = {
@@ -35,7 +33,7 @@ export function getRouteInfo({
if (isAdminRoute({ adminRoute, config, route })) {
const routeWithoutAdmin = getRouteWithoutAdmin({ adminRoute, route })
const routeSegments = routeWithoutAdmin.split('/').filter(Boolean)
const [entityType, entitySlug, createOrID] = routeSegments
const [entityType, entitySlug, segment3, segment4] = routeSegments
const collectionSlug = entityType === 'collections' ? entitySlug : undefined
const globalSlug = entityType === 'globals' ? entitySlug : undefined
@@ -58,12 +56,17 @@ export function getRouteInfo({
}
}
const docID =
collectionSlug && createOrID !== 'create'
? idType === 'number'
? Number(createOrID)
: createOrID
: undefined
let docID: number | string | undefined
if (collectionSlug) {
if (segment3 === 'trash' && segment4) {
// /collections/:slug/trash/:id
docID = idType === 'number' ? Number(segment4) : segment4
} else if (segment3 && segment3 !== 'create') {
// /collections/:slug/:id
docID = idType === 'number' ? Number(segment3) : segment3
}
}
return {
collectionConfig,

View File

@@ -15,16 +15,18 @@ import {
useTranslation,
} from '@payloadcms/ui'
import { useSearchParams } from 'next/navigation.js'
import * as React from 'react'
import './index.scss'
import * as React from 'react'
import { LocaleSelector } from './LocaleSelector/index.js'
import { RenderJSON } from './RenderJSON/index.js'
const baseClass = 'query-inspector'
export const APIViewClient: React.FC = () => {
const { id, collectionSlug, globalSlug, initialData } = useDocumentInfo()
const { id, collectionSlug, globalSlug, initialData, isTrashed } = useDocumentInfo()
const searchParams = useSearchParams()
const { i18n, t } = useTranslation()
@@ -69,10 +71,13 @@ export const APIViewClient: React.FC = () => {
const [authenticated, setAuthenticated] = React.useState<boolean>(true)
const [fullscreen, setFullscreen] = React.useState<boolean>(false)
const trashParam = typeof initialData?.deletedAt === 'string'
const params = new URLSearchParams({
depth,
draft: String(draft),
locale,
trash: trashParam ? 'true' : 'false',
}).toString()
const fetchURL = `${serverURL}${apiRoute}${docEndpoint}?${params}`
@@ -114,6 +119,7 @@ export const APIViewClient: React.FC = () => {
globalLabel={globalConfig?.label}
globalSlug={globalSlug}
id={id}
isTrashed={isTrashed}
pluralLabel={collectionConfig ? collectionConfig?.labels?.plural : undefined}
useAsTitle={collectionConfig ? collectionConfig?.admin?.useAsTitle : undefined}
view="API"

View File

@@ -0,0 +1,43 @@
import type { AdminViewServerProps, ListQuery } from 'payload'
import type React from 'react'
import { notFound } from 'next/navigation.js'
import { renderListView } from '../List/index.js'
type RenderTrashViewArgs = {
customCellProps?: Record<string, any>
disableBulkDelete?: boolean
disableBulkEdit?: boolean
disableQueryPresets?: boolean
drawerSlug?: string
enableRowSelections: boolean
overrideEntityVisibility?: boolean
query: ListQuery
redirectAfterDelete?: boolean
redirectAfterDuplicate?: boolean
redirectAfterRestore?: boolean
} & AdminViewServerProps
export const TrashView: React.FC<
{ query?: any } & Omit<RenderTrashViewArgs, 'enableRowSelections'>
> = async (args) => {
try {
const { List: TrashList } = await renderListView({
...args,
enableRowSelections: true,
query: {
...(args.query || {}),
trash: true, // force trash view
},
viewType: 'trash',
})
return TrashList
} catch (error) {
if (error.message === 'not-found') {
notFound()
}
console.error(error) // eslint-disable-line no-console
}
}

View File

@@ -0,0 +1,35 @@
import type { Metadata } from 'next'
import type { SanitizedCollectionConfig } from 'payload'
import { getTranslation } from '@payloadcms/translations'
import type { GenerateViewMetadata } from '../Root/index.js'
import { generateMetadata } from '../../utilities/meta.js'
export const generateCollectionTrashMetadata = async (
args: {
collectionConfig: SanitizedCollectionConfig
} & Parameters<GenerateViewMetadata>[0],
): Promise<Metadata> => {
const { collectionConfig, config, i18n } = args
let title: string = ''
const description: string = ''
const keywords: string = ''
if (collectionConfig) {
title = getTranslation(collectionConfig.labels.plural, i18n)
}
title = `${title ? `${title} ` : title}${i18n.t('general:trash')}`
return generateMetadata({
...(config.admin.meta || {}),
description,
keywords,
serverURL: config.serverURL,
title,
...(collectionConfig?.admin?.meta || {}),
})
}

View File

@@ -85,7 +85,14 @@ export const CreateFirstUserClient: React.FC<{
return (
<Form
action={`${serverURL}${apiRoute}/${userSlug}/first-register`}
initialState={initialState}
initialState={{
...initialState,
'confirm-password': {
...initialState['confirm-password'],
valid: initialState['confirm-password']['valid'] || false,
value: initialState['confirm-password']['value'] || '',
},
}}
method="POST"
onChange={[onChange]}
onSuccess={handleFirstRegister}

View File

@@ -15,6 +15,7 @@ type Args = {
locale?: Locale
payload: Payload
req?: PayloadRequest
segments?: string[]
user?: TypedUser
}
@@ -25,12 +26,15 @@ export const getDocumentData = async ({
locale,
payload,
req,
segments,
user,
}: Args): Promise<null | Record<string, unknown> | TypeWithID> => {
const id = sanitizeID(idArg)
let resolvedData: Record<string, unknown> | TypeWithID = null
const { transactionID, ...rest } = req
const isTrashedDoc = segments?.[2] === 'trash' && typeof segments?.[3] === 'string' // id exists at segment 3
try {
if (collectionSlug && id) {
resolvedData = await payload.findByID({
@@ -44,6 +48,7 @@ export const getDocumentData = async ({
req: {
...rest,
},
trash: isTrashedDoc ? true : false,
user,
})
}

View File

@@ -113,7 +113,13 @@ export const getDocumentView = ({
// --> /collections/:collectionSlug/:id/api
// --> /collections/:collectionSlug/:id/versions
// --> /collections/:collectionSlug/:id/<custom-segment>
// --> /collections/:collectionSlug/trash/:id
case 4: {
// --> /collections/:collectionSlug/trash/:id
if (segment3 === 'trash' && segment4) {
View = getCustomViewByKey(views, 'default') || DefaultEditView
break
}
switch (segment4) {
// --> /collections/:collectionSlug/:id/api
case 'api': {
@@ -167,18 +173,86 @@ export const getDocumentView = ({
break
}
// --> /collections/:collectionSlug/trash/:id/api
// --> /collections/:collectionSlug/trash/:id/versions
// --> /collections/:collectionSlug/trash/:id/<custom-segment>
// --> /collections/:collectionSlug/:id/versions/:version
// --> /collections/:collectionSlug/:id/<custom-segment>/<custom-segment>
default: {
// --> /collections/:collectionSlug/:id/versions/:version
if (segment4 === 'versions') {
case 5: {
// --> /collections/:slug/trash/:id/api
if (segment3 === 'trash') {
switch (segment5) {
case 'api': {
if (collectionConfig?.admin?.hideAPIURL !== true) {
View = getCustomViewByKey(views, 'api') || DefaultAPIView
}
break
}
// --> /collections/:slug/trash/:id/versions
case 'versions': {
if (docPermissions?.readVersions) {
View = getCustomViewByKey(views, 'versions') || DefaultVersionsView
} else {
View = UnauthorizedViewWithGutter
}
break
}
default: {
View = getCustomViewByKey(views, 'default') || DefaultEditView
break
}
}
// --> /collections/:collectionSlug/:id/versions/:version
} else if (segment4 === 'versions') {
if (docPermissions?.readVersions) {
View = getCustomViewByKey(views, 'version') || DefaultVersionView
} else {
View = UnauthorizedViewWithGutter
}
} else {
// --> /collections/:collectionSlug/:id/<custom-segment>/<custom-segment>
// --> /collections/:collectionSlug/:id/<custom>/<custom>
const baseRoute = [
adminRoute !== '/' && adminRoute,
collectionEntity,
collectionSlug,
segment3,
]
.filter(Boolean)
.join('/')
const currentRoute = [baseRoute, segment4, segment5, ...remainingSegments]
.filter(Boolean)
.join('/')
const { Component: CustomViewComponent, viewKey: customViewKey } = getCustomViewByRoute({
baseRoute,
currentRoute,
views,
})
if (customViewKey) {
viewKey = customViewKey
View = CustomViewComponent
}
}
break
}
// --> /collections/:collectionSlug/trash/:id/versions/:version
// --> /collections/:collectionSlug/:id/<custom>/<custom>/<custom...>
default: {
// --> /collections/:collectionSlug/trash/:id/versions/:version
const isTrashedVersionView = segment3 === 'trash' && segment5 === 'versions'
if (isTrashedVersionView) {
if (docPermissions?.readVersions) {
View = getCustomViewByKey(views, 'version') || DefaultVersionView
} else {
View = UnauthorizedViewWithGutter
}
} else {
// --> /collections/:collectionSlug/:id/<custom>/<custom>/<custom...>
const baseRoute = [
adminRoute !== '/' && adminRoute,
collectionEntity,
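
The hunk above is cut off by the diff view, but the trash branches it adds are complete enough to summarize: routing keys off the number of URL segments and whether segment 3 is `trash`. A condensed sketch of that decision, returning the default view keys resolved above; the permission gating (`hideAPIURL`, `readVersions`) and custom-view lookup are omitted, and the function itself is illustrative, not part of the PR:

```ts
type TrashViewKey = 'api' | 'default' | 'version' | 'versions'

const resolveTrashViewKey = (segments: string[]): TrashViewKey | undefined => {
  const [, , segment3, , segment5, segment6] = segments
  if (segment3 !== 'trash') {
    return undefined
  }
  // --> /collections/:collectionSlug/trash/:id
  if (segments.length === 4) {
    return 'default'
  }
  // --> /collections/:collectionSlug/trash/:id/api | /versions | /<custom-segment>
  if (segments.length === 5) {
    if (segment5 === 'api') return 'api'
    if (segment5 === 'versions') return 'versions'
    return 'default'
  }
  // --> /collections/:collectionSlug/trash/:id/versions/:versionID
  if (segments.length === 6 && segment5 === 'versions' && typeof segment6 === 'string') {
    return 'version'
  }
  return undefined
}
```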

View File

@@ -15,6 +15,7 @@ export type GenerateEditViewMetadata = (
args: {
collectionConfig?: null | SanitizedCollectionConfig
globalConfig?: null | SanitizedGlobalConfig
isReadOnly?: boolean
view?: keyof EditConfig
} & Parameters<GenerateViewMetadata>[0],
) => Promise<Metadata>
@@ -42,6 +43,11 @@ export const getMetaBySegment: GenerateEditViewMetadata = async ({
fn = generateEditViewMetadata
}
// `/collections/:collection/trash/:id`
if (segments.length === 4 && segments[2] === 'trash') {
fn = (args) => generateEditViewMetadata({ ...args, isReadOnly: true })
}
// `/:collection/:id/:view`
if (params.segments.length === 4) {
switch (params.segments[3]) {
@@ -69,6 +75,25 @@ export const getMetaBySegment: GenerateEditViewMetadata = async ({
break
}
}
// `/collections/:collection/trash/:id/:view`
if (segments.length === 5 && segments[2] === 'trash') {
switch (segments[4]) {
case 'api':
fn = generateAPIViewMetadata
break
case 'versions':
fn = generateVersionsViewMetadata
break
default:
break
}
}
// `/collections/:collection/trash/:id/versions/:versionID`
if (segments.length === 6 && segments[2] === 'trash' && segments[4] === 'versions') {
fn = generateVersionViewMetadata
}
}
if (isGlobal) {
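
Beyond the route matching, the only new behavior in this hunk is that a trashed document's edit view reuses `generateEditViewMetadata` but forces it into read-only mode. A minimal sketch of that wrapper, with the argument type loosened for brevity:

```ts
type EditMetaArgs = { isReadOnly?: boolean } & Record<string, unknown>

// Wrap an existing metadata generator so it always reports read-only mode,
// mirroring `fn = (args) => generateEditViewMetadata({ ...args, isReadOnly: true })` above.
const asReadOnly =
  (fn: (args: EditMetaArgs) => Promise<unknown>) =>
  (args: EditMetaArgs): Promise<unknown> =>
    fn({ ...args, isReadOnly: true })
```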

View File

@@ -65,6 +65,7 @@ export const renderDocument = async ({
redirectAfterCreate,
redirectAfterDelete,
redirectAfterDuplicate,
redirectAfterRestore,
searchParams,
versions,
viewType,
@@ -74,6 +75,7 @@ export const renderDocument = async ({
readonly redirectAfterCreate?: boolean
readonly redirectAfterDelete?: boolean
readonly redirectAfterDuplicate?: boolean
readonly redirectAfterRestore?: boolean
versions?: RenderDocumentVersionsProperties
} & AdminViewServerProps): Promise<{
data: Data
@@ -116,6 +118,7 @@ export const renderDocument = async ({
locale,
payload,
req,
segments,
user,
}))
@@ -134,6 +137,8 @@ export const renderDocument = async ({
}
}
const isTrashedDoc = typeof doc?.deletedAt === 'string'
const [
docPreferences,
{ docPermissions, hasPublishPermission, hasSavePermission },
@@ -202,6 +207,7 @@ export const renderDocument = async ({
globalSlug,
locale: locale?.code,
operation,
readOnly: isTrashedDoc,
renderAllFields: true,
req,
schemaPath: collectionSlug || globalSlug,
@@ -389,12 +395,14 @@ export const renderDocument = async ({
initialState={formState}
isEditing={isEditing}
isLocked={isLocked}
isTrashed={isTrashedDoc}
key={locale?.code}
lastUpdateTime={lastUpdateTime}
mostRecentVersionIsAutosaved={mostRecentVersionIsAutosaved}
redirectAfterCreate={redirectAfterCreate}
redirectAfterDelete={redirectAfterDelete}
redirectAfterDuplicate={redirectAfterDuplicate}
redirectAfterRestore={redirectAfterRestore}
unpublishedVersionCount={unpublishedVersionCount}
versionCount={versionCount}
>
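
On the server render, a document counts as trashed when it carries a `deletedAt` timestamp, and that single flag is what marks the form state read-only and switches the edit view into trash mode (`isTrashed`, `redirectAfterRestore`). The derivation in isolation, with the doc shape reduced to the relevant field:

```ts
type MaybeTrashedDoc = { deletedAt?: null | string } & Record<string, unknown>

// Mirrors `const isTrashedDoc = typeof doc?.deletedAt === 'string'` above.
const isTrashed = (doc?: MaybeTrashedDoc): boolean => typeof doc?.deletedAt === 'string'

isTrashed({ id: 1, deletedAt: '2025-07-28T12:00:00.000Z' }) // true
isTrashed({ id: 1, deletedAt: null })                        // false
```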

View File

@@ -16,6 +16,7 @@ export const generateEditViewMetadata: GenerateEditViewMetadata = async ({
globalConfig,
i18n,
isEditing,
isReadOnly = false,
view = 'default',
}): Promise<Metadata> => {
const { t } = i18n
@@ -26,11 +27,17 @@ export const generateEditViewMetadata: GenerateEditViewMetadata = async ({
? getTranslation(globalConfig.label, i18n)
: ''
const verb = isReadOnly
? t('general:viewing')
: isEditing
? t('general:editing')
: t('general:creating')
const metaToUse: MetaConfig = {
...(config.admin.meta || {}),
description: `${isEditing ? t('general:editing') : t('general:creating')} - ${entityLabel}`,
description: `${verb} - ${entityLabel}`,
keywords: `${entityLabel}, Payload, CMS`,
title: `${isEditing ? t('general:editing') : t('general:creating')} - ${entityLabel}`,
title: `${verb} - ${entityLabel}`,
}
const ogToUse: MetaConfig['openGraph'] = {
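
The two `description`/`title` lines above show the old interpolation and its replacement: the verb chain is the whole change, so a read-only (trashed) document renders "Viewing" while the existing editing/creating split still applies. As a standalone expression, with plain strings standing in for the `t('general:*')` calls:

```ts
// Plain-string stand-ins for t('general:viewing') / t('general:editing') / t('general:creating').
const pickVerb = (isReadOnly: boolean, isEditing: boolean): string =>
  isReadOnly ? 'Viewing' : isEditing ? 'Editing' : 'Creating'

pickVerb(true, true)   // "Viewing"
pickVerb(false, true)  // "Editing"
pickVerb(false, false) // "Creating"
```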

View File

@@ -0,0 +1,199 @@
import type {
ClientConfig,
Column,
ListQuery,
PaginatedDocs,
PayloadRequest,
SanitizedCollectionConfig,
Where,
} from 'payload'
import { renderTable } from '@payloadcms/ui/rsc'
import { formatDate } from '@payloadcms/ui/shared'
import { flattenAllFields } from 'payload'
export const handleGroupBy = async ({
clientConfig,
collectionConfig,
collectionSlug,
columns,
customCellProps,
drawerSlug,
enableRowSelections,
query,
req,
user,
where: whereWithMergedSearch,
}: {
clientConfig: ClientConfig
collectionConfig: SanitizedCollectionConfig
collectionSlug: string
columns: any[]
customCellProps?: Record<string, any>
drawerSlug?: string
enableRowSelections?: boolean
query?: ListQuery
req: PayloadRequest
user: any
where: Where
}): Promise<{
columnState: Column[]
data: PaginatedDocs
Table: null | React.ReactNode | React.ReactNode[]
}> => {
let Table: React.ReactNode | React.ReactNode[] = null
let columnState: Column[]
const dataByGroup: Record<string, PaginatedDocs> = {}
const clientCollectionConfig = clientConfig.collections.find((c) => c.slug === collectionSlug)
// NOTE: is there a faster/better way to do this?
const flattenedFields = flattenAllFields({ fields: collectionConfig.fields })
const groupByFieldPath = query.groupBy.replace(/^-/, '')
const groupByField = flattenedFields.find((f) => f.name === groupByFieldPath)
const relationshipConfig =
groupByField?.type === 'relationship'
? clientConfig.collections.find((c) => c.slug === groupByField.relationTo)
: undefined
let populate
if (groupByField?.type === 'relationship' && groupByField.relationTo) {
const relationTo =
typeof groupByField.relationTo === 'string'
? [groupByField.relationTo]
: groupByField.relationTo
if (Array.isArray(relationTo)) {
relationTo.forEach((rel) => {
if (!populate) {
populate = {}
}
populate[rel] = { [relationshipConfig?.admin.useAsTitle || 'id']: true }
})
}
}
const distinct = await req.payload.findDistinct({
collection: collectionSlug,
depth: 1,
field: groupByFieldPath,
limit: query?.limit ? Number(query.limit) : undefined,
locale: req.locale,
overrideAccess: false,
page: query?.page ? Number(query.page) : undefined,
populate,
req,
sort: query?.groupBy,
where: whereWithMergedSearch,
})
const data = {
...distinct,
docs: distinct.values?.map(() => ({})) || [],
values: undefined,
}
await Promise.all(
distinct.values.map(async (distinctValue, i) => {
const potentiallyPopulatedRelationship = distinctValue[groupByFieldPath]
const valueOrRelationshipID =
groupByField?.type === 'relationship' &&
potentiallyPopulatedRelationship &&
typeof potentiallyPopulatedRelationship === 'object' &&
'id' in potentiallyPopulatedRelationship
? potentiallyPopulatedRelationship.id
: potentiallyPopulatedRelationship
const groupData = await req.payload.find({
collection: collectionSlug,
depth: 0,
draft: true,
fallbackLocale: false,
includeLockStatus: true,
limit: query?.queryByGroup?.[valueOrRelationshipID]?.limit
? Number(query.queryByGroup[valueOrRelationshipID].limit)
: undefined,
locale: req.locale,
overrideAccess: false,
page: query?.queryByGroup?.[valueOrRelationshipID]?.page
? Number(query.queryByGroup[valueOrRelationshipID].page)
: undefined,
req,
// Note: if we wanted to enable table-by-table sorting, we could use this:
// sort: query?.queryByGroup?.[valueOrRelationshipID]?.sort,
sort: query?.sort,
user,
where: {
...(whereWithMergedSearch || {}),
[groupByFieldPath]: {
equals: valueOrRelationshipID,
},
},
})
let heading = valueOrRelationshipID || req.i18n.t('general:noValue')
if (
groupByField?.type === 'relationship' &&
typeof potentiallyPopulatedRelationship === 'object'
) {
heading =
potentiallyPopulatedRelationship[relationshipConfig.admin.useAsTitle || 'id'] ||
valueOrRelationshipID
}
if (groupByField?.type === 'date') {
heading = formatDate({
date: String(heading),
i18n: req.i18n,
pattern: clientConfig.admin.dateFormat,
})
}
if (groupData.docs && groupData.docs.length > 0) {
const { columnState: newColumnState, Table: NewTable } = renderTable({
clientCollectionConfig,
collectionConfig,
columns,
customCellProps,
data: groupData,
drawerSlug,
enableRowSelections,
groupByFieldPath,
groupByValue: valueOrRelationshipID,
heading,
i18n: req.i18n,
key: `table-${valueOrRelationshipID}`,
orderableFieldName: collectionConfig.orderable === true ? '_order' : undefined,
payload: req.payload,
query,
useAsTitle: collectionConfig.admin.useAsTitle,
})
// Only need to set `columnState` once, using the first table's column state
// This will avoid needing to generate column state explicitly for root context that wraps all tables
if (!columnState) {
columnState = newColumnState
}
if (!Table) {
Table = []
}
dataByGroup[valueOrRelationshipID] = groupData
;(Table as Array<React.ReactNode>)[i] = NewTable
}
}),
)
return {
columnState,
data,
Table,
}
}
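
One detail worth calling out from this new file: when the group-by field is a relationship, a `populate` map is built so `findDistinct` can return documents titled by the related collection's `useAsTitle` field rather than bare IDs, and each distinct value then drives its own `find` plus `renderTable` call. A reduced sketch of that map construction (the helper name and loose types are illustrative):

```ts
// For relationTo: 'authors' and useAsTitle: 'name', this produces
// { authors: { name: true } }; for an array relationTo, one entry per collection.
const buildGroupByPopulate = (
  relationTo: string | string[],
  useAsTitle: string = 'id',
): Record<string, Record<string, boolean>> => {
  const populate: Record<string, Record<string, boolean>> = {}
  const slugs = typeof relationTo === 'string' ? [relationTo] : relationTo
  for (const slug of slugs) {
    populate[slug] = { [useAsTitle]: true }
  }
  return populate
}

buildGroupByPopulate('authors', 'name') // { authors: { name: true } }
```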

Some files were not shown because too many files have changed in this diff.