Supports grouping documents by specific fields within the list view.
For example, imagine having a "posts" collection with a "categories"
field. To report on each specific category, you'd traditionally filter
for each category, one at a time. This can be quite inefficient,
especially with large datasets.
Now, you can interact with all categories simultaneously, grouped by
distinct values.
Here is a simple demonstration:
https://github.com/user-attachments/assets/0dcd19d2-e983-47e6-9ea2-cfdd2424d8b5
Enable on any collection by setting the `admin.groupBy` property:
```ts
import type { CollectionConfig } from 'payload'

const MyCollection: CollectionConfig = {
  // ...
  admin: {
    groupBy: true,
  },
}
```
This is currently marked as beta to gather feedback while we reach full
stability, and to leave room for API changes and other modifications.
Use at your own risk.
Note: when using `groupBy`, bulk editing is done group-by-group. In the
future we may support cross-group bulk editing.
Dependent on #13102 (merged).
---------
Co-authored-by: Paul Popus <paul@payloadcms.com>
### What?
Refactors the `LeaveWithoutSaving` modal to be generic and delegates
document unlock logic back to the `DefaultEditView` component via a
callback.
### Why?
Previously, `unlockDocument` was triggered in a cleanup `useEffect` in
the edit view. When logging out from the edit view, the unlock request
would often fail due to the session ending — leaving the document in a
locked state.
### How?
- Introduced `onConfirm` and `onPrevent` props for `LeaveWithoutSaving`.
- Moved all document lock/unlock logic into `DefaultEditView`’s
`handleLeaveConfirm`.
- Captures the next navigation target via `onPrevent` and evaluates whether to unlock based on:
  - Locking being enabled.
  - Current user owning the lock.
  - Navigation not targeting internal admin views (`/preview`, `/api`, `/versions`).
---------
Co-authored-by: Jarrod Flesch <jarrodmflesch@gmail.com>
### What?
Improves both the JSON preview and export functionality in the
import-export plugin:
- Preserves proper nesting of object and array fields (e.g., groups,
tabs, arrays)
- Excludes any fields explicitly marked as `disabled` via
`custom.plugin-import-export`
- Ensures downloaded files use proper JSON formatting when `format` is
`json` (no CSV-style flattening)
### Why?
Previously:
- The JSON preview flattened all fields to a single level and included
disabled fields.
- Exported files with `format: json` were still CSV-style data encoded
as `.json`, rather than real JSON.
### How?
- Refactored `/preview-data` JSON handling to preserve original document
shape.
- Applied `removeDisabledFields` to clean nested fields using
dot-notation paths.
- Updated `createExport` to skip `flattenObject` for JSON formats, using
a nested JSON filter instead.
- Fixed streaming and buffered export paths to output valid JSON arrays
when `format` is `json`.
~~Sometimes, drizzle is adding the same join to the joins array twice
(`addJoinTable`), despite the table being the same. This is due to a bug
in `getNameFromDrizzleTable` where it would sometimes return a UUID
instead of the table name.~~
~~This PR changes it to read from the drizzle:BaseName symbol instead,
which is correctly returning the table name in my testing. It falls back
to `getTableName`, which uses drizzle:Name.~~
That approach fails the tests for some reason. Instead, this PR now uses the `getTableName` utility rather than searching for the symbol manually.
https://github.com/payloadcms/payload/pull/13186 actually made the select API _more powerful_, as it can reduce the number of DB calls down to one, even for complex collections with blocks.
This PR adds a test that verifies this.
Prevents base list filters from being injected into the URL.
This is a problem with the multi-tenant plugin, for example, where
changing the tenant adds a `baseListFilter` to the query, but should
never be exposed to the end user.
Introduced in #13200.
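For reference, a minimal sketch of the kind of base list filter involved (the collection, the `tenant` field, and the constant value are all illustrative, not the multi-tenant plugin's actual code):

```ts
import type { CollectionConfig } from 'payload'

export const Pages: CollectionConfig = {
  slug: 'pages',
  admin: {
    // Constrains the list view server-side; per this fix, it should never
    // leak into the URL as a query param.
    baseListFilter: () => ({
      tenant: { equals: 'tenant-123' }, // hypothetical tenant constraint
    }),
  },
  fields: [],
}
```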
Currently, an optimized DB update (simple data => no
delete-and-create-row) does the following:
1. sql UPDATE
2. sql SELECT
This PR reduces this further to one single DB call for simple
collections:
1. sql UPDATE with RETURNING
This only works for simple collections that do not have any fields that
need to be fetched from other tables. If a collection has fields like
relationship or blocks, we'll need that separate SELECT call to join in
the other tables.
In 4.0, we can remove all "complex" fields from the jobs collection and replace them with a JSON field to make use of this optimization.
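For illustration, a minimal drizzle-orm sketch of the single-call shape (assumes an existing `db` instance and a hypothetical `posts` table; this is not the adapter's actual code):

```ts
import { eq } from 'drizzle-orm'

// Before: UPDATE followed by a separate SELECT to read the row back.
// After: one round trip — UPDATE ... RETURNING yields the updated row directly.
const [updated] = await db
  .update(posts)
  .set({ title: 'Hello' })
  .where(eq(posts.id, 1))
  .returning()
```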
### What?
Replaces all `payload.logger.info` calls with `payload.logger.debug` in
the `createExport` function.
### Why?
`info` logs are too verbose for routine exports. Using `debug` keeps the detailed logs available when debug logging is enabled, without cluttering standard output.
### How?
- Updated all logger calls in `createExport` to use `debug` instead of
`info`.
Currently, with DocumentDB instead of a friendly error like "Value must
be unique" we see a generic "Something went wrong" message.
This PR fixes that by adding a fallback to parse the message instead of
using `error.keyValue` which doesn't exist for responses from
DocumentDB.
### What?
A comma is missing in the example code, which results in invalid JSON.
### Why?
I stumbled upon it, while setting up a Tenant-based Payload for the
first time.
### How?
Adding a comma results in valid JSON.
Fixes #
Added a comma. ;)
Fixes #13113
### How?
Does not rely on JS falsiness; instead, it explicitly checks for null and undefined.
I'm not actually certain this is the approach we want to take. Some people might interpret "required" as not null, not undefined, and min length > 1 in the case of strings. If they do, this change to the behavior in the not-required case will break their expectations.
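To illustrate the distinction, a minimal sketch (not the actual validator code):

```ts
// Relying on falsiness treats '' and 0 as "missing":
const isMissingLoose = (value: unknown) => !value

// Explicitly checking only for null and undefined:
const isMissingStrict = (value: unknown) => value === null || value === undefined

isMissingLoose('') // true  — empty string counts as missing
isMissingStrict('') // false — empty string passes
```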
### What?
This PR ensures that when a document is created using the `Publish in
__` button, it is saved to the correct locale.
### Why?
During document creation, the buttons `Publish` or `Publish in [locale]`
have the same effect. As a result, we overlooked the case where a user
may specifically click `Publish in [locale]` for the first save. In this
scenario, the create operation does not respect the
`publishSpecificLocale` value, so the document was always saved in the
default locale regardless of the intended one.
### How?
Passes the `publishSpecificLocale` value to the create operation,
ensuring the document and its version are saved to the correct locale.
**Fixes:** #13117
Relational databases were broken with folders because it was querying
on:
```ts
{
  folderType: {
    equals: [],
  },
}
```
This does not work, since select `hasMany` fields store their values in a separate table.
Prettier doesn't seem to cover that, and horizontal scrolling in the browser is even more annoying than in the IDE.
Regex used in the search engine: `^[ \t]*\* `
Some search params within the list view do not properly sync to user
preferences, and vice versa.
For example, when selecting a query preset, the `?preset=123` param is
injected into the URL and saved to preferences, but when reloading the
page without the param, that preset is not reactivated as expected.
### Problem
The reason this wasn't working before is that omitting this param would
also reset prefs. It was designed this way in order to support
client-side resets, e.g. clicking the query presets "x" button.
This pattern would never work, however, because this means that every
time the user navigates to the list view directly, their preference is
cleared, as no param would exist in the query.
Note: this is not an issue with _all_ params, as not all are handled in
the same way.
### Solution
The fix is to use empty values instead, e.g. `?preset=`. When the server
receives this, it knows to clear the pref. If it doesn't exist at all,
it knows to load from prefs. And if it has a value, it saves to prefs.
On the client, we sanitize those empty values back out so they don't
appear in the URL in the end.
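A minimal sketch of that resolution (illustrative only, not Payload's actual internals):

```ts
// Resolves the effective preset from the incoming search params and stored preferences.
function resolvePreset(
  searchParams: URLSearchParams,
  preferences: { preset?: string },
): string | undefined {
  const param = searchParams.get('preset')
  if (param === null) return preferences.preset // param absent: load from prefs
  if (param === '') return undefined // empty value: clear the pref
  return param // has a value: use it and save it to prefs
}
```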
This PR also refactors much of the list query context and its respective
provider to be significantly more predictable and easier to work with,
namely:
- The `ListQuery` type now fully aligns with what Payload APIs expect,
e.g. `page` is a number, not a string
- The provider now receives a single `query` prop which matches the
underlying context 1:1
- Propagating the query from the server to the URL is significantly more
predictable
- Any new props that may be supported in the future will automatically
work
- No more reconciling `columns` and `listPreferences.columns`, it's just `query.columns`
### What?
Fixes a bug where `afterChange` hooks would attempt to access values for
fields that are `read: false` but `create: true`, resulting in
`undefined` values and unexpected behavior.
### Why?
In scenarios where access control allows field creation (`create: true`)
but disallows reading it (`read: false`), hooks like `afterChange` would
still attempt to operate on `undefined` values from `siblingDoc` or
`previousDoc`, potentially causing errors or skipped logic.
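For context, a hedged sketch of the field shape that triggers this scenario (the field name is illustrative):

```ts
import type { Field } from 'payload'

// The field can be set on create, but can never be read back,
// so hooks receive `undefined` for it in `previousDoc`/`siblingDoc`.
const activationCode: Field = {
  name: 'activationCode',
  type: 'text',
  access: {
    create: () => true,
    read: () => false,
  },
}
```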
### How?
Adds safe optional chaining and fallback object initialization in
`promise.ts` for:
- `previousDoc[field.name]`
- `siblingDoc[field.name]`
- Group, Array, and Block field traversals
This ensures that these values are treated as empty objects or arrays
where appropriate to prevent runtime errors during traversal or hook
execution.
Fixes https://github.com/payloadcms/payload/issues/12660
---------
Co-authored-by: Niall Bambury <niall.bambury@cuckoo.co>
Adds a new `schedule` property to workflow and task configs that can be
used to have Payload automatically _queue_ jobs following a certain
_schedule_.
Docs:
https://payloadcms.com/docs/dynamic/jobs-queue/schedules?branch=feat/schedule-jobs
## API Example
```ts
export default buildConfig({
  // ...
  jobs: {
    // ...
    // Or `cron` if you're not using serverless. If `manual` is used, the user needs to
    // run /api/payload-jobs/handleSchedules or payload.jobs.handleSchedules at regular intervals.
    scheduler: 'manual',
    tasks: [
      {
        schedule: [
          {
            cron: '* * * * * *',
            queue: 'autorunSecond',
            // Hooks are optional
            hooks: {
              // Not an array, as providing and calling `defaultBeforeSchedule` would be
              // more error-prone if this was an array
              beforeSchedule: async (args) => {
                // Handles verifying that there are no jobs already scheduled or processing.
                // You can override this behavior by not calling defaultBeforeSchedule, e.g. if
                // you wanted to allow a maximum of 3 scheduled jobs in the queue instead of 1,
                // or add any additional conditions
                const result = await args.defaultBeforeSchedule(args)
                return {
                  ...result,
                  input: {
                    message: 'This task runs every second',
                  },
                }
              },
              afterSchedule: async (args) => {
                await args.defaultAfterSchedule(args) // Handles updating the payload-jobs-stats global
                args.req.payload.logger.info(
                  'EverySecond task scheduled: ' +
                    (args.status === 'success' ? args.job.id : 'skipped or failed to schedule'),
                )
              },
            },
          },
        ],
        slug: 'EverySecond',
        inputSchema: [
          {
            name: 'message',
            type: 'text',
            required: true,
          },
        ],
        handler: ({ input, req }) => {
          req.payload.logger.info(input.message)
          return {
            output: {},
          }
        },
      },
    ],
  },
})
```
### What?
Exit the process after running jobs.
### Why?
When running the `payload jobs:run` bin script with a postgres database
the process hangs forever.
### How?
Execute `process.exit(0)` after running all jobs.
### What?
Fixes the `custom.plugin-import-export.disabled` flag to correctly
disable fields in all nested structures including:
- Groups
- Arrays
- Tabs
- Blocks
Previously, only top-level fields or direct children were respected.
This update ensures nested paths (e.g. `group.array.field1`,
`blocks.hero.title`, etc.) are matched and filtered from exports.
### Why?
To allow users to disable entire field groups or deeply nested fields in structured layouts.
### How?
- Updated regex logic in both `createExport` and Preview components to recursively support:
  - Indexed array fields (e.g. `array_0_field1`)
  - Block fields with slugs (e.g. `blocks_0_hero_title`)
  - Nested field accessors with correct part-by-part expansion
As discussed in [this
RFC](https://github.com/payloadcms/payload/discussions/12729), this PR
supports collection-scoped folders. You can scope folders to multiple
collection types or just one.
This makes it possible to have folders on a per-collection basis instead of always sharing them across every collection. You can combine this feature with `browseByFolder: false` to completely isolate a collection from other collections.
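A hedged sketch of what that combination could look like on a collection config (the exact option shape is an assumption, not confirmed by this PR):

```ts
import type { CollectionConfig } from 'payload'

export const Media: CollectionConfig = {
  slug: 'media',
  // Enable folders for this collection only, and keep it out of the shared
  // browse-by-folder view so its folders stay fully isolated.
  folders: {
    browseByFolder: false,
  },
  fields: [],
}
```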
Things left to do:
- [x] ~~Create a custom react component for the selecting of
collectionSlugs to filter out available options based on the current
folders parameters~~
https://github.com/user-attachments/assets/14cb1f09-8d70-4cb9-b1e2-09da89302995
Exports `$createLinkNode`, `$isLinkNode` and the equivalent modules for
autolinks.
Adds a new `findDistinct` operation that returns the distinct values of a field for a given collection.
Example:
Assume you have a collection posts with multiple documents, and some of
them share the same title:
```js
// Example dataset (some titles appear multiple times)
[
  { title: 'title-1' },
  { title: 'title-2' },
  { title: 'title-1' },
  { title: 'title-3' },
  { title: 'title-2' },
  { title: 'title-4' },
  { title: 'title-5' },
  { title: 'title-6' },
  { title: 'title-7' },
  { title: 'title-8' },
  { title: 'title-9' },
]
```
You can now retrieve all unique title values using findDistinct:
```js
const result = await payload.findDistinct({
  collection: 'posts',
  field: 'title',
})

console.log(result.values)
// Output:
// [
//   'title-1',
//   'title-2',
//   'title-3',
//   'title-4',
//   'title-5',
//   'title-6',
//   'title-7',
//   'title-8',
//   'title-9'
// ]
```
You can also limit the number of distinct results:
```js
const limitedResult = await payload.findDistinct({
  collection: 'posts',
  field: 'title',
  sortOrder: 'desc',
  limit: 3,
})

console.log(limitedResult.values)
// Output (sorted descending):
// [
//   'title-9',
//   'title-8',
//   'title-7'
// ]
```
You can also pass a `where` query to filter the documents.
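For example, building on the snippets above:

```ts
const filtered = await payload.findDistinct({
  collection: 'posts',
  field: 'title',
  where: {
    title: { contains: 'title-1' },
  },
})
// filtered.values only contains distinct titles from documents matching the where query
```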
Enhances field-level access controls on the Users collection to address security concerns:
- Restricted read/update access on `email` field to admins and the user
themselves
- Locked down `roles` field so only admins can create, read, or update
it
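A hedged sketch of field-level access along these lines (the role check and field shapes are illustrative, not the exact code from this PR):

```ts
import type { Field } from 'payload'

// Only admins and the user themselves may read or update the email.
const emailField: Field = {
  name: 'email',
  type: 'email',
  access: {
    read: ({ req: { user }, id }) =>
      Boolean(user && (user.roles?.includes('admin') || user.id === id)),
    update: ({ req: { user }, id }) =>
      Boolean(user && (user.roles?.includes('admin') || user.id === id)),
  },
}

// Only admins may create, read, or update roles.
const rolesField: Field = {
  name: 'roles',
  type: 'select',
  hasMany: true,
  options: ['admin', 'user'],
  access: {
    create: ({ req: { user } }) => Boolean(user?.roles?.includes('admin')),
    read: ({ req: { user } }) => Boolean(user?.roles?.includes('admin')),
    update: ({ req: { user } }) => Boolean(user?.roles?.includes('admin')),
  },
}
```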
### What?
Adds four more arguments to the `mongooseAdapter`:
```typescript
useJoinAggregations?: boolean /* The big one */
useAlternativeDropDatabase?: boolean
useBigIntForNumberIDs?: boolean
usePipelineInSortLookup?: boolean
```
Also exports a new `compatabilityOptions` object from `@payloadcms/db-mongodb`, where each key is a Mongo-compatible database and the value is the recommended `mongooseAdapter` settings for compatibility.
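Usage might look something like this (the `firestore` key name is an assumption, not confirmed by this PR):

```ts
import { compatabilityOptions, mongooseAdapter } from '@payloadcms/db-mongodb'
import { buildConfig } from 'payload'

export default buildConfig({
  secret: process.env.PAYLOAD_SECRET!,
  collections: [],
  db: mongooseAdapter({
    url: process.env.DATABASE_URI!,
    // Spread the recommended settings for the target database,
    // then override individual flags if needed.
    ...compatabilityOptions.firestore,
  }),
})
```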
### Why?
When using Firestore and visiting `/admin/collections/media/payload-folders`, we get:
```
MongoServerError: invalid field(s) in lookup: [let, pipeline], only lookup(from, localField, foreignField, as) is supported
```
Firestore doesn't support the full MongoDB aggregation API that Payload uses when building aggregations to populate join fields.
There are several other compatibility issues with Firestore:
- The invalid `pipeline` property is used in the `$lookup` aggregation
in `buildSortParams`
- Firestore only supports number IDs of type `Long`, but Mongoose
converts custom ID fields of type number to `Double`
- Firestore does not support the `dropDatabase` command
- Firestore does not support the `createIndex` command (not addressed in
this PR)
### How?
```typescript
useJoinAggregations?: boolean /* The big one */
```
When this is `false` we skip the `buildJoinAggregation()` pipeline and resolve the join fields through multiple queries. This can potentially be used with AWS DocumentDB and Azure Cosmos DB to support join fields, but I have not tested with either of these databases.
```typescript
useAlternativeDropDatabase?: boolean
```
When `true`, monkey-patch (replace) the `dropDatabase` function so that it calls `collection.deleteMany({})` on every collection instead of sending a single `dropDatabase` command to the database.
```typescript
useBigIntForNumberIDs?: boolean
```
When `true`, use `mongoose.Schema.Types.BigInt` for custom ID fields of type `number`, which converts to a Firestore `Long` behind the scenes.
```typescript
usePipelineInSortLookup?: boolean
```
When `false`, modify the sortAggregation pipeline in `buildSortParams()` so that we don't use the `pipeline` property in the `$lookup` aggregation. Results in slightly worse performance when sorting by relationship properties.
### Limitations
This PR does not add support for transactions or creating indexes in firestore.
### Fixes
Fixed a bug (and added a test) where you weren't able to sort by multiple properties on a relationship field.
### Future work
1. Firestore supports simple `$lookup` aggregations but other databases might not. Could add a `useSortAggregations` property which can be used to disable aggregations in sorting.
---------
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Sasha <64744993+r1tsuu@users.noreply.github.com>
This is a follow-up to https://github.com/payloadcms/payload/pull/13060.
There are a bunch of other db adapter methods that use `upsertRow` for
updates: `updateGlobal`, `updateGlobalVersion`, `updateJobs`,
`updateMany`, `updateVersion`.
The previous PR implemented the optimized row-update logic inside the `updateOne` adapter. This PR moves that logic to the original `upsertRow` function. Benefits:
- all the other db methods will benefit from this massive optimization
as well. This will be especially relevant for optimizing postgres job
queue initial updates - we should be able to close
https://github.com/payloadcms/payload/pull/11865 after another follow-up
PR
- easier to read db adapter methods due to less code.
Based on https://github.com/payloadcms/payload/pull/13060 which should
be merged first
This PR adds the ability to update number fields atomically, which can be important with parallel writes. For now, we support this only via `payload.db.updateOne`.
For example:
```js
// increment by 10
const res = await payload.db.updateOne({
  data: {
    number: {
      $inc: 10,
    },
  },
  collection: 'posts',
  where: { id: { equals: post.id } },
})

// decrement by 3
const res2 = await payload.db.updateOne({
  data: {
    number: {
      $inc: -3,
    },
  },
  collection: 'posts',
  where: { id: { equals: post.id } },
})
```
### What?
Fixes the export field selection dropdown to correctly differentiate
between fields in named and unnamed tabs.
### Why?
Previously, when a `tabs` field contained both named and unnamed tabs,
subfields with the same `name` would appear as duplicates in the
dropdown (e.g. `Tab To CSV`, `Tab To CSV`). Additionally, selecting a
field from a named tab would incorrectly map it to the unnamed version
due to shared labels and missing path prefixes.
### How?
- Updated the `reduceFields` utility to manually construct the field
path and label using the tab’s `name` if present.
- Ensured unnamed tabs treat subfields as top-level and skip prefixing
altogether.
- Adjusted label prefix logic to show `Named Tab > Field Name` when
appropriate.
#### Before
<img width="169" height="79" alt="Screenshot 2025-07-15 at 2 55 14 PM"
src="https://github.com/user-attachments/assets/2ab2d19e-41a3-4be2-8496-1da2a79f88e1"
/>
#### After
<img width="211" height="79" alt="Screenshot 2025-07-15 at 2 50 38 PM"
src="https://github.com/user-attachments/assets/0620e96a-71cd-4eb1-9396-30d461ed47a5"
/>
Previously, we were always initializing cronjobs when calling
`getPayload` or `payload.init`.
This is undesired in bin scripts - we don't want cron jobs to start
triggering db calls while we're running an initial migration using
`payload migrate`, for example. This previously led to a race condition that occasionally triggered the following error if job autoruns were enabled:
```ts
DrizzleQueryError: Failed query: select "payload_jobs"."id", "payload_jobs"."input", "payload_jobs"."completed_at", "payload_jobs"."total_tried", "payload_jobs"."has_error", "payload_jobs"."error", "payload_jobs"."workflow_slug", "payload_jobs"."task_slug", "payload_jobs"."queue", "payload_jobs"."wait_until", "payload_jobs"."processing", "payload_jobs"."updated_at", "payload_jobs"."created_at", "payload_jobs_log"."data" as "log" from "payload_jobs" "payload_jobs" left join lateral (select coalesce(json_agg(json_build_array("payload_jobs_log"."_order", "payload_jobs_log"."id", "payload_jobs_log"."executed_at", "payload_jobs_log"."completed_at", "payload_jobs_log"."task_slug", "payload_jobs_log"."task_i_d", "payload_jobs_log"."input", "payload_jobs_log"."output", "payload_jobs_log"."state", "payload_jobs_log"."error") order by "payload_jobs_log"."_order" asc), '[]'::json) as "data" from (select * from "payload_jobs_log" "payload_jobs_log" where "payload_jobs_log"."_parent_id" = "payload_jobs"."id" order by "payload_jobs_log"."_order" asc) "payload_jobs_log") "payload_jobs_log" on true where ("payload_jobs"."completed_at" is null and ("payload_jobs"."has_error" is null or "payload_jobs"."has_error" <> $1) and "payload_jobs"."processing" = $2 and ("payload_jobs"."wait_until" is null or "payload_jobs"."wait_until" < $3) and "payload_jobs"."queue" = $4) order by "payload_jobs"."created_at" asc limit $5
params: true,false,2025-07-10T21:25:03.002Z,autorunSecond,100
at NodePgPreparedQuery.queryWithCache (/Users/alessio/Documents/GitHub/payload2/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/pg-core/session.ts:74:11)
at processTicksAndRejections (node:internal/process/task_queues:105:5)
at /Users/alessio/Documents/GitHub/payload2/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/node-postgres/session.ts:154:19
... 6 lines matching cause stack trace ...
at N._trigger (/Users/alessio/Documents/GitHub/payload2/node_modules/.pnpm/croner@9.0.0/node_modules/croner/dist/croner.cjs:1:16806) {
query: `select "payload_jobs"."id", "payload_jobs"."input", "payload_jobs"."completed_at", "payload_jobs"."total_tried", "payload_jobs"."has_error", "payload_jobs"."error", "payload_jobs"."workflow_slug", "payload_jobs"."task_slug", "payload_jobs"."queue", "payload_jobs"."wait_until", "payload_jobs"."processing", "payload_jobs"."updated_at", "payload_jobs"."created_at", "payload_jobs_log"."data" as "log" from "payload_jobs" "payload_jobs" left join lateral (select coalesce(json_agg(json_build_array("payload_jobs_log"."_order", "payload_jobs_log"."id", "payload_jobs_log"."executed_at", "payload_jobs_log"."completed_at", "payload_jobs_log"."task_slug", "payload_jobs_log"."task_i_d", "payload_jobs_log"."input", "payload_jobs_log"."output", "payload_jobs_log"."state", "payload_jobs_log"."error") order by "payload_jobs_log"."_order" asc), '[]'::json) as "data" from (select * from "payload_jobs_log" "payload_jobs_log" where "payload_jobs_log"."_parent_id" = "payload_jobs"."id" order by "payload_jobs_log"."_order" asc) "payload_jobs_log") "payload_jobs_log" on true where ("payload_jobs"."completed_at" is null and ("payload_jobs"."has_error" is null or "payload_jobs"."has_error" <> $1) and "payload_jobs"."processing" = $2 and ("payload_jobs"."wait_until" is null or "payload_jobs"."wait_until" < $3) and "payload_jobs"."queue" = $4) order by "payload_jobs"."created_at" asc limit $5`,
params: [ true, false, '2025-07-10T21:25:03.002Z', 'autorunSecond', 100 ],
cause: error: relation "payload_jobs" does not exist
at /Users/alessio/Documents/GitHub/payload2/node_modules/.pnpm/pg@8.16.3/node_modules/pg/lib/client.js:545:17
at processTicksAndRejections (node:internal/process/task_queues:105:5)
at /Users/alessio/Documents/GitHub/payload2/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/node-postgres/session.ts:161:13
at NodePgPreparedQuery.queryWithCache (/Users/alessio/Documents/GitHub/payload2/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/pg-core/session.ts:72:12)
at /Users/alessio/Documents/GitHub/payload2/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/node-postgres/session.ts:154:19
at find (/Users/alessio/Documents/GitHub/payload2/packages/drizzle/src/find/findMany.ts:162:19)
at Object.updateMany (/Users/alessio/Documents/GitHub/payload2/packages/drizzle/src/updateJobs.ts:26:16)
at updateJobs (/Users/alessio/Documents/GitHub/payload2/packages/payload/src/queues/utilities/updateJob.ts:102:37)
at runJobs (/Users/alessio/Documents/GitHub/payload2/packages/payload/src/queues/operations/runJobs/index.ts:181:25)
at Object.run (/Users/alessio/Documents/GitHub/payload2/packages/payload/src/queues/localAPI.ts:137:12)
at N.fn (/Users/alessio/Documents/GitHub/payload2/packages/payload/src/index.ts:866:13)
at N._trigger (/Users/alessio/Documents/GitHub/payload2/node_modules/.pnpm/croner@9.0.0/node_modules/croner/dist/croner.cjs:1:16806) {
length: 112,
severity: 'ERROR',
code: '42P01',
detail: undefined,
hint: undefined,
position: '406',
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file: 'parse_relation.c',
line: '1449',
routine: 'parserOpenTable'
}
}
```
This PR makes running crons opt-in using a new `cron` flag. By default,
no cron jobs will be created.
Fixes #7799
Fixes a type issue where all fields in `RenderFields['fields']` admin properties were being marked as required since we were using `Pick`. Adds a helper type to allow extracting properties with correct optionality.
Payload is designed with performance in mind, but its customizability
means that there are many ways to configure your app that can impact
performance.
While Payload provides several features and best practices to help you
optimize your app's specific performance needs, these are not currently
well surfaced and can be obscure.
Now:
- A high-level performance doc now exists at `/docs/performance`
- There's a new section on performance within the `/docs/queries` doc
- There's a new section on performance within the `/docs/hooks` doc
- There's a new section on performance within the
`/docs/custom-components` doc
This PR also:
- Restructures and elaborates on the `/docs/queries/pagination` docs
- Adds a new `/docs/database/indexing` doc
- More
### What?
Adds support for excluding specific fields from the import-export plugin
using a custom field config.
### Why?
Some fields should not be included in exports or previews. This feature
allows users to flag those fields directly in the field config.
### How?
- Introduced a `plugin-import-export.disabled: true` custom field
property.
- Automatically collects and stores disabled field accessors in
`collection.admin.custom['plugin-import-export'].disabledFields`.
- Excludes these fields from the export field selector, preview table,
and final export output (CSV/JSON).
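For example, a field marked as disabled in its config (the field name is illustrative):

```ts
import type { Field } from 'payload'

const internalNotes: Field = {
  name: 'internalNotes',
  type: 'textarea',
  custom: {
    'plugin-import-export': {
      disabled: true, // excluded from the field selector, preview table, and exports
    },
  },
}
```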
🤖 Automated bump of templates for v3.47.0
Triggered by user: @AlessioGr
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
### What?
Adds a new `format` option to the `plugin-import-export` config that
allows users to force the export format (`csv` or `json`) and hide the
format dropdown from the export UI.
### Why?
In some use cases, allowing the user to select between CSV and JSON is
unnecessary or undesirable. This new option allows plugin consumers to
lock the format and simplify the export interface.
### How?
- Added a `format?: 'csv' | 'json'` field to `ImportExportPluginConfig`.
- When defined, the `format` field in the export UI is:
  - Hidden via `admin.condition`
  - Pre-filled via `defaultValue`
- Updated `getFields` to accept the plugin config and apply logic
accordingly.
### Example
```ts
importExportPlugin({
  format: 'json',
})
```
Just spent an entire hour trying to figure out why my environment
variables are `undefined` on the client. Turns out, when running `pnpm
next build --experimental-build-mode compile`, it skips the environment
variable inlining step.
This adds a new section to the docs mentioning that you can use `pnpm
next build --experimental-build-mode generate-env` to manually inline
them.
### What?
Adds support for two new plugin options in the import-export plugin:
- `disableSave`: disables the "Save" button in the export UI view.
- `disableDownload`: disables the "Download" button in the export UI
view.
### Why?
This allows implementers to control user access to export actions based
on context or feature requirements. For example, some use cases may want
to preview an export without saving it, or disable downloads entirely.
### How?
- Injected `disableSave` and `disableDownload` into `admin.custom` for
the `exports` collection.
- Updated the `ExportSaveButton` component to conditionally render each
button based on the injected flags.
- Defaults to `false` if the values are not explicitly set in
`pluginConfig`.
### Example
```ts
importExportPlugin({
  disableSave: true, // Defaults to false
  disableDownload: true, // Defaults to false
})
```
Fixes the post-release-templates workflow by building payload before
running the gen templates script