Required for #13005.
Opening an autosave-enabled document within a drawer triggers an
infinite loop when the root document is also autosave-enabled.
This happened for two reasons:
1. Autosave would run and change the `updatedAt` timestamp. This would
trigger another run of autosave, and so on. The timestamp is now removed
before comparison to ensure that sequential autosave runs are skipped.
2. The `dequal()` call was not being given the `.current` property of the ref object. This meant it would never evaluate to `true` and therefore never skip unnecessary autosaves to begin with; see the sketch below.
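Conceptually, the comparison now looks something like this (a minimal sketch with assumed names, not the actual Payload hook):
```ts
import { dequal } from 'dequal'
import { useRef } from 'react'

// Strip volatile timestamps so back-to-back autosaves that only bump
// `updatedAt` compare as equal and get skipped.
const stripTimestamps = ({ updatedAt, ...rest }: Record<string, unknown>) => rest

export function useShouldAutosave(data: Record<string, unknown>): boolean {
  const lastSaved = useRef<Record<string, unknown>>({})

  // Compare against `.current`, not the ref object itself, otherwise
  // `dequal` can never return `true`.
  const hasChanged = !dequal(stripTimestamps(lastSaved.current), stripTimestamps(data))

  if (hasChanged) {
    lastSaved.current = data
  }

  return hasChanged
}
```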
### What?
Ensure the export preview table includes all field keys as columns, even
if those fields are not populated in any of the returned documents.
### Why?
Previously, if none of the documents in the preview result had a value
for a given field, that column would be missing entirely from the
preview table.
### How?
- Introduced a `getFlattenedFieldKeys` utility that recursively extracts all flattened field accessors from the collection's config, including fields that are not populated in any of the returned documents (sketched below)
- Updated the preview UI logic to build columns from all flattened keys, not just those present in the first document
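A rough sketch of what such a utility can look like (field shape simplified; not the exact Payload implementation):
```ts
// Hypothetical, simplified field shape used only for illustration.
type Field =
  | { name: string; type: 'checkbox' | 'number' | 'text' }
  | { name: string; type: 'array' | 'group'; fields: Field[] }

// Recursively build dot-notation accessors for every field in the config so the
// preview table can render a column even when no returned document has a value.
export function getFlattenedFieldKeys(fields: Field[], prefix = ''): string[] {
  return fields.flatMap((field) => {
    const key = prefix ? `${prefix}.${field.name}` : field.name
    return 'fields' in field ? getFlattenedFieldKeys(field.fields, key) : [key]
  })
}
```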
Adds support for `halfvec`, `sparsevec`, and `bit` (binary vector) column types. This is required to support indexing of embeddings with more than 2,000 dimensions on Postgres using the pgvector extension.
Supersedes #12992. Partially closes #12975.
Right now autosave-enabled documents opened within a drawer will
unnecessarily remount on every autosave interval, causing loss of input
focus, etc. This makes it nearly impossible to edit these documents,
especially if the interval is very short.
But the same is true for non-autosave documents when "manually" saving,
e.g. pressing the "save draft" or "publish changes" buttons. This has
gone largely unnoticed, however, as the user has already lost focus of
the form to interact with these controls, and they somewhat expect this
behavior or at least accept it.
Now, the form remains mounted across autosave events and the user's
cursor never loses focus. Much better.
Before:
https://github.com/user-attachments/assets/a159cdc0-21e8-45f6-a14d-6256e53bc3df
After:
https://github.com/user-attachments/assets/cd697439-1cd3-4033-8330-a5642f7810e8
Related: #12842
### What?
The "Preview Sizes" button in the file upload UI was not showing up if:
- `crop` and `focalPoint` were both `false`
- No `customUploadActions` were provided
- But image sizes were configured
### Why?
This happened because `UploadActions` wasn’t rendered at all unless
adjustments or custom actions were present.
### How?
Updated the conditional in `StaticFileDetails` to also render `UploadActions` when:
- `hasImageSizes` is `true` and the document has a `filename`
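A sketch of the adjusted condition (prop names approximated, not the exact component code):
```ts
// Hypothetical shape of the props that drive the decision in StaticFileDetails.
type UploadActionState = {
  customUploadActions?: unknown[]
  filename?: string
  hasImageSizes: boolean
  showCrop: boolean
  showFocalPoint: boolean
}

export const shouldRenderUploadActions = (state: UploadActionState): boolean =>
  state.showCrop ||
  state.showFocalPoint ||
  Boolean(state.customUploadActions?.length) ||
  // Newly covered case: image sizes are configured and the doc has a file,
  // even with crop/focal point disabled and no custom actions.
  (state.hasImageSizes && Boolean(state.filename))
```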
Fixes #12832
Fixes #12826
The "Leave without saving" modal was being triggered even when no changes were made to the tenant. This should only happen if the value in form state differs from that of the selected tenant, i.e. after changing tenants.
Adds tenant selector syncing so the selector updates when a tenant is
added or the name is edited.
Also adds E2E for most multi-tenant admin functionality.
Mounts live preview at `../:id` instead of `../:id/preview`.
This is a huge win from both a UX and a maintainability standpoint.
Here are just a few of those wins:
1. If you edit a document, _then_ decide you want to preview those
changes, you are currently presented with the `LeaveWithoutSaving` modal
and are forced to either save your edits or clear them. This is because
you are being navigated to an entirely new page with its own form
context. Instead, you should be able to freely navigate back and forth
between the two.
2. If you are an editor who most often uses Live Preview, or you are
editing a collection that typically requires it, you likely want it to
automatically enter live preview mode when you open a document.
Currently, the user has to navigate to the document _first_, then use
the live preview tab. Instead, you should be able to set a preference
and avoid this extra step.
3. Since the inception of Live Preview, we've been maintaining largely
the same code across the default edit view and the live preview view,
which often became out of sync and inconsistent—but they're essentially
doing the same thing. While we could abstract a lot of this out, it is
no longer necessary if the two views are combined into one.
This change also includes some small modifications to the UI. The "Live Preview" tab no longer exists; it has been replaced with a button placed next to the document controls (subject to change).
Before:
https://github.com/user-attachments/assets/48518b02-87ba-4750-ba7b-b21b5c75240a
After:
https://github.com/user-attachments/assets/a8ec8657-a6d6-4ee1-b9a7-3c1173bcfa96
Adds full session functionality into Payload's existing local
authentication strategy.
It's enabled by default, because this is a more secure pattern that we
should enforce. However, we have provided an opt-out for those who want to stick with stateless JWT authentication by passing `collectionConfig.auth.useSessions: false`.
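A minimal sketch of the opt-out in a collection config:
```ts
import type { CollectionConfig } from 'payload'

export const Users: CollectionConfig = {
  slug: 'users',
  auth: {
    // Sessions are enabled by default; set this to false to keep
    // stateless JWT authentication.
    useSessions: false,
  },
  fields: [],
}
```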
Todo:
- [x] @jessrynkar to update the Next.js server functions for refresh and
logout to support these new features
- [x] @jessrynkar resolve build errors
---------
Co-authored-by: Elliot DeNolf <denolfe@gmail.com>
Co-authored-by: Jessica Chowdhury <jessica@trbl.design>
Co-authored-by: Jarrod Flesch <30633324+JarrodMFlesch@users.noreply.github.com>
Co-authored-by: Sasha <64744993+r1tsuu@users.noreply.github.com>
If you (using the MongoDB adapter) delete a block from the Payload config but still have data with that block in the DB, you'd receive an error like this in the admin panel:
```
Block with type "cta" was found in block data, but no block with that type is defined in the config for field with schema path pages.blocks
```
Now, we remove those "unknown" blocks at the DB adapter level.
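Conceptually, the adapter-level cleanup amounts to something like this (a simplified sketch, not the actual adapter code):
```ts
type BlockRow = { blockType: string; [key: string]: unknown }

// Drop block rows whose `blockType` no longer exists in the config, so the admin
// panel never receives block data it cannot render.
export const stripUnknownBlocks = (rows: BlockRow[], knownBlockSlugs: Set<string>): BlockRow[] =>
  rows.filter((row) => knownBlockSlugs.has(row.blockType))
```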
Co-authored-by: Dan Ribbens <dan.ribbens@gmail.com>
### What?
This PR solves an issue with validation of the `point` field in Payload CMS. If the value is `null` and the field is not required, the validation will return `true` before trying to examine the contents of the field.
### Why?
If the point field is given a value, and saved, it is then impossible to
successfully "unset" the point field, either through the CMS UI or
through a hook like `beforeChange`. Trying to do so will throw this
error:
```
[17:09:41] ERROR: Cannot read properties of null (reading '0')
err: {
"type": "TypeError",
"message": "Cannot read properties of null (reading '0')",
"stack":
TypeError: Cannot read properties of null (reading '0')
at point (webpack-internal:///(rsc)/./node_modules/.pnpm/payload@3.43.0_graphql@16.10.0_typescript@5.7.3/node_modules/payload/dist/fields/validations.js:622:40)
```
because a value of `null` will not be changed to the default value of
`['','']`, which in any case does not pass MongoDB validation either.
```
[17:22:49] ERROR: Cast to [Number] failed for value "[ NaN, NaN ]" (type string) at path "location.coordinates.0" because of "CastError"
err: {
"type": "CastError",
"message": "Cast to [Number] failed for value \"[ NaN, NaN ]\" (type string) at path \"location.coordinates.0\" because of \"CastError\"",
"stack":
CastError: Cast to [Number] failed for value "[ NaN, NaN ]" (type string) at path "location.coordinates.0" because of "CastError"
at SchemaArray.cast (webpack-internal:///(rsc)/./node_modules/.pnpm/mongoose@8.15.1_@aws-sdk+credential-providers@3.778.0/node_modules/mongoose/lib/schema/array.js:414:15)
```
### How?
This adds a check to the top of the `point` validation function that returns early, before trying to examine the contents of the point field.
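The early return roughly amounts to this (a simplified sketch of the validation, not the exact signature):
```ts
type PointValue = [number | string, number | string] | null | undefined

export const validatePoint = (value: PointValue, { required }: { required?: boolean }): string | true => {
  // Bail out before touching the coordinates when the field is unset.
  if (value === null || typeof value === 'undefined') {
    return required ? 'This field is required.' : true
  }

  const [lng, lat] = value
  if (Number.isNaN(Number(lng)) || Number.isNaN(Number(lat))) {
    return 'Coordinates must be numeric.'
  }

  return true
}
```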
---------
Co-authored-by: Dave Ryan <dmr@Daves-MacBook-Pro.local>
https://github.com/payloadcms/payload/pull/12861 introduced some flaky test selectors, specifically bulk editing values and then looking for the previous values in the table rows.
This PR fixes the flakes and fixes eslint errors in the `findTableRow` and `findTableCell` helper functions.
### What?
Set the `limit` query param on API requests called within the
`useLivePreview` hook.
### Why?
We are heavily relying on the block system in our pages and we reuse the
media collection in a lot of the block types. When the page has more
than 10 images, the API request doesn't fetch all of them for live
preview due to the default 10 item `limit`. This PR allows the preview
page to override this `limit` so that all the items get correctly
fetched.
### Our current workaround
Set the `depth` param of the `useLivePreview` hook like this:
```
useLivePreview({
  // ...
  depth: '1000&limit=1000',
})
```
---------
Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
### What?
Fixes #12811
### Why?
Custom Views become unreachable when the admin route is set to "/" because the forward slash of the current route gets removed before routing to the custom view.
---------
Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
This PR fixes an issue in the export logic where CSV downloads would
include duplicate rows and repeated column headers across paginated
batches.
Key changes:
- Ensured `page` is incremented correctly after each `payload.find` call
- Tracked and wrote CSV column headers only once for the first page
- Prevented row duplication by removing unused `result` initialization
and using isolated `page` tracking
- Streamlined both download and non-download logic for consistent batch
processing
This resolves incorrect row counts and header duplication in large CSV
exports.
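The batching pattern is roughly the following (a hypothetical sketch; `find` and `write` are placeholders for the real export internals):
```ts
type FindPage = { docs: Record<string, unknown>[]; hasNextPage: boolean }

// Headers are written once, and `page` is incremented after every find call so
// no batch is fetched or written twice.
export async function exportCollectionToCSV(
  find: (page: number) => Promise<FindPage>, // e.g. wraps payload.find({ collection, limit, page })
  write: (chunk: string) => void, // placeholder sink for CSV chunks
): Promise<void> {
  let page = 1
  let wroteHeaders = false
  let hasNextPage = true

  while (hasNextPage) {
    const result = await find(page)

    if (!wroteHeaders && result.docs.length > 0) {
      write(Object.keys(result.docs[0]).join(',') + '\n') // column headers only for the first page
      wroteHeaders = true
    }

    for (const doc of result.docs) {
      write(Object.values(doc).map(String).join(',') + '\n')
    }

    hasNextPage = result.hasNextPage
    page += 1 // advance before the next call so rows are never duplicated
  }
}
```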
We were running scripts as-is, without wrapping our logic in a function for Jest's teardown, and we were subsequently calling `process.exit(0)`, which meant that tests didn't correctly return an error status code when they failed in CI.
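The gist of the fix: teardown logic now lives in a function that Jest calls, rather than running top-level and forcing `process.exit(0)` (a sketch, not the actual teardown file):
```ts
// globalTeardown.ts (sketch): Jest invokes this exported function after all suites
// finish, so the process exits with Jest's own status code instead of a forced 0.
export default async function globalTeardown(): Promise<void> {
  await stopTestResources() // placeholder for whatever cleanup the suite needs
  // no process.exit(0) here: letting Jest exit on its own preserves failure codes in CI
}

// Placeholder cleanup used above; the real suite has its own helpers.
async function stopTestResources(): Promise<void> {}
```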
The following tests have been skipped as well:
```
● postgres vector custom column › should add a vector column and query it
● Sort › Local API › Orderable › should not break with existing base 62 digits
● Sort › Local API › Orderable join › should set order by default
● Sort › Local API › Orderable join › should allow setting the order with the local API
● Sort › Local API › Orderable join › should sort join docs in the correct
```
---------
Co-authored-by: Elliot DeNolf <denolfe@gmail.com>
Co-authored-by: Alessio Gravili <alessio@gravili.de>
### What?
Fixes a crash when exporting documents to CSV if a custom `toCSV`
function tries to access properties on a `null` value.
### Why?
In some cases (especially with Postgres), fields like relationships may
be explicitly `null` if unset. Custom `toCSV` functions that assume the
value is always defined would throw a `TypeError` when attempting to
access nested properties like `value.id`.
### How?
Added a null check in the custom `toCSV` implementation for
`customRelationship`, ensuring the field is an object before accessing
its properties.
This prevents the export from failing and makes custom field transforms
more resilient to missing or optional values.
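The guard amounts to something like this (a sketch of a custom `toCSV` for the `customRelationship` field; signature simplified):
```ts
type RelationshipValue = { id: number | string } | null | undefined

export const customRelationshipToCSV = ({ value }: { value: RelationshipValue }) => {
  // Unset relationships can be explicitly null (especially on Postgres),
  // so check before touching nested properties like value.id.
  if (!value || typeof value !== 'object') {
    return undefined // leave the cell empty instead of throwing
  }
  return value.id
}
```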
### What?
Fixes CSV export support for polymorphic relationship and upload fields.
### Why?
Polymorphic fields in Payload use a `{ relationTo, value }` structure.
The previous implementation incorrectly accessed `.id` directly on the
top-level object, which caused issues depending on query depth or data
shape. This led to missing or invalid values in exported CSVs.
### How?
- Updated `getCustomFieldFunctions` to safely access `relationTo` and `value.id` from polymorphic fields
- Ensured `hasMany` polymorphic fields export each related ID and `relationTo` as separate CSV columns
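In sketch form, the polymorphic handling looks roughly like this (shapes and column naming simplified):
```ts
// Polymorphic relationship values have the shape { relationTo, value }, where
// `value` may be a populated doc (with an id) or just the id, depending on depth.
type PolyValue = { relationTo: string; value: number | string | { id: number | string } }

export function polymorphicToCSVColumns(
  fieldName: string,
  value: PolyValue | PolyValue[] | null | undefined,
): Record<string, string> {
  const row: Record<string, string> = {}
  const items = value == null ? [] : Array.isArray(value) ? value : [value]

  items.forEach((item, i) => {
    const id = typeof item.value === 'object' ? item.value.id : item.value
    // hasMany polymorphic fields get one relationTo/id column pair per related doc
    row[`${fieldName}_${i}_relationTo`] = item.relationTo
    row[`${fieldName}_${i}_id`] = String(id)
  })

  return row
}
```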
### What?
Reflects any access control restrictions applied to auth fields in the UI, i.e. if `email` has `update: () => false`, the field should be displayed as read-only.
### Why?
Currently, any access control applied to auth fields is functional but is not reflected in the UI.
For example:
- If `password` does not have read access, no data is returned, but the field is still shown when it should be hidden
- If `email` does not have update access, updating the field and saving the doc will **not** update the data, but the field should be displayed as read-only so nothing can be filled out and the update restriction is made clear
### How?
Passes field permissions through to the Auth fields UI and adds docs
with instructions on how to override auth field access.
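For example, restricting updates to `email` could look like the following, assuming the auth field is overridden by redefining it in `fields` as described in the new docs (a minimal sketch):
```ts
import type { CollectionConfig } from 'payload'

export const Users: CollectionConfig = {
  slug: 'users',
  auth: true,
  fields: [
    {
      name: 'email',
      type: 'email',
      access: {
        // With this PR, the restriction is also reflected in the admin UI:
        // the email field renders as read-only.
        update: () => false,
      },
    },
  ],
}
```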
#### Testing
Use `access-control` test suite and `auth` collection. Tests added to
`access-control` e2e.
Fixes #11569
### What?
This fix prevents custom row labels being removed when duplicating array
items.
### Why?
Currently, when you have an array with custom row labels, if you create
a new array item by duplicating an existing item, the new item will have
no custom row label until you refresh the page.
### How?
During the `duplicate` process, we remove any React components from the field state. This change intentionally re-adds the `RowLabel` if one exists.
#### Reported by client
### What?
Ensure fields using a custom `toCSV` function that return `undefined`
are excluded from the exported CSV.
### Why?
Previously, when a `toCSV` function returned `undefined`, the field key
would still be added to the export row. This caused the column to appear
in the CSV output with an empty string value (`""`), leading to
unexpected results and failed assertions in tests expecting the field to
be truly omitted.
### How?
Updated the `flattenObject` utility to:
- Check if the value returned by a `toCSV` function is `undefined`
- Only assign the value to the export row if it is explicitly defined
- Applied this logic in all relevant paths (arrays, objects, primitives)
This change ensures that fields are only included in the CSV when a
meaningful value is returned.
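The relevant check is essentially the following (a simplified sketch of `flattenObject`, not the real implementation):
```ts
type ToCSVFn = (args: { value: unknown }) => unknown

export function flattenObject(
  doc: Record<string, unknown>,
  toCSVFunctions: Record<string, ToCSVFn>,
): Record<string, unknown> {
  const row: Record<string, unknown> = {}

  for (const [key, value] of Object.entries(doc)) {
    const toCSV = toCSVFunctions[key]
    const result = toCSV ? toCSV({ value }) : value

    // Only assign defined values, so a toCSV returning undefined omits the
    // column instead of producing an empty-string cell.
    if (typeof result !== 'undefined') {
      row[key] = result
    }
  }

  return row
}
```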
Because of this check, if a JSON value with a `target` property was saved, it would become malformed.
For example trying to save a JSON field:
```json
{
"target": {
"value": {
"foo": "bar"
}
}
}
```
would result in:
```json
{
"foo": "bar"
}
```
And trying to save:
```json
{
"target": "foo"
}
```
would just not save anything:
```json
null
```
I went through all of the field types and did not find a single one that relies on this ternary. It seems it always defaulted to `const val = e`, except in the unexpected case described above.
Fixes #12873
The added test may be overkill; will remove if so.
---------
Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
### What?
This PR fixes a runtime error that occurs when opening the "More
versions..." drawer while browsing the versions for a global. It also
fixes a minor runtime error when navigating to a global version view, where an optional chaining operator was missing because the collection variable is undefined when viewing a global.
This PR also adds an e2e test to ensure the versions drawer is
accessible and renders the appropriate number of versions for globals.
### Why?
To properly render global version views without errors.
### How?
By threading the global slug to the versions drawer and adjusting some properties of the `renderDocument` server function call there. This PR also adds an optional chaining operator to `versionUseAsTitle` in the original view to prevent an error in globals.
Notes:
- This was brought to my attention in Discord by a handful of users
Before: (Missing optional chaining error)
[error1-verions-Editing---Menu---Payload.webm](https://github.com/user-attachments/assets/3dc4dbe4-ee5a-43df-8d25-05128b05e063)
Before: (Versions drawer error)
[error2-versions-Editing---Menu---Payload.webm](https://github.com/user-attachments/assets/98c3e1da-cb0b-4a36-bafd-240f641e8814)
After:
[versions-globals-Dashboard---Payload.webm](https://github.com/user-attachments/assets/c778d3f0-a8fe-4e31-92cb-62da8e6d8cb4)
Fixes an issue when deeply querying the new relationship virtual fields with `draft: true`. Changes the method of `where` sanitization: previously it was done in `validateSearchParam`, which didn't work properly with versions; now a separate `sanitizeWhereQuery` function handles this.
Fixes https://github.com/payloadcms/payload/issues/12847
- Uses rem instead of em for inline padding, for indent consistency between nodes with different font sizes
- Uses rem instead of px in the deprecated HTML converters for consistency
### What?
After creating a new document from a relationship field, this doc should
automatically become the selected document for that relationship field.
This is the expected and current behavior. However, when the
relationship ties to a collection with autosave enabled, this does not
happen.
### Why?
This is expected behavior and should still happen when the relationship is using an autosave-enabled collection.
### How?
1. The logic in `addNewRelation` contained an `if` statement that checked for `operation === 'create'` - however, when autosave is enabled, the `create` operation runs on the first data update and subsequent operations are `update` operations.
2. The `onSave` from the document drawer provider was not being run as
part of the autosave workflow.
#### Reported by client.
This simplifies workflow / task error handling, as well as cancelling jobs. Previously, we were handling errors as they occurred and passing error state through using a `state` object - errors were then handled in multiple areas of the code.
This PR adds new, clean `TaskError`, `WorkflowError` and
`JobCancelledError` errors that are thrown when they occur and are
handled **in one single place**, massively cleaning up complex functions
like
[payload/src/queues/operations/runJobs/runJob/getRunTaskFunction.ts](https://github.com/payloadcms/payload/compare/refactor/jobs-errors?expand=1#diff-53dc7ccb7c8e023c9ba63fdd2e78c32ad0be606a2c64a3512abad87893f5fd21)
Performance will also be positively improved by this change - previously, a task / workflow failure or cancellation would have resulted in multiple, separate `updateJob` db calls, as data modifications to the job object required for storing failure state were done multiple times in multiple areas of the codebase. Most notably, task error state was handled and updated separately from workflow error state.
Now, it's just a single, clean `updateJob` call.
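Conceptually, the consolidated handling looks like this (a rough sketch with placeholder shapes, not the actual runner code):
```ts
class TaskError extends Error {}
class WorkflowError extends Error {}
class JobCancelledError extends Error {}

type QueuedJob = { id: string }

// Tasks throw TaskError, workflows throw WorkflowError, and cancellation throws
// JobCancelledError; all of them are converted into a single updateJob call here.
export async function runJobSafely(
  job: QueuedJob,
  runWorkflow: (job: QueuedJob) => Promise<void>,
  updateJob: (id: string, data: Record<string, unknown>) => Promise<void>,
): Promise<void> {
  try {
    await runWorkflow(job)
  } catch (error) {
    if (error instanceof JobCancelledError) {
      await updateJob(job.id, { cancelled: true, hasError: false })
      return
    }
    if (error instanceof TaskError || error instanceof WorkflowError) {
      await updateJob(job.id, { error: error.message, hasError: true })
      return
    }
    throw error // unknown errors still bubble up
  }
}
```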
This PR also does the following:
- adds a new test for `deleteJobOnComplete` behavior
- cleans up test suite
- ensures `deleteJobOnComplete` does not delete definitively failed jobs
Adds the eslint rule `no-restricted-exports` with value `off` for Payload config files inside our `test` suite, since we have to use default exports from those files.
Previously, there were multiple ways to type a running job:
- `GeneratedTypes['payload-jobs']` - only works in an installed project; is `any` in the monorepo
- `BaseJob` - works everywhere, but does not incorporate generated types, which may include types for custom fields added to the jobs collection
- `RunningJob<>` - a more accurate version of `BaseJob`, but with the same problem
This PR deprecates all of those types in favor of a new `Job` type.
Benefits:
- Works in both monorepo and installed projects. If no generated types
exist, it will automatically fall back to `BaseJob`
- Comes with an optional generic that can be used to narrow down
`job.input` based on the task / workflow slug. No need to use a separate
type helper like `RunningJob<>`
With this new type, I was able to replace every usage of
`GeneratedTypes['payload-jobs']`, `BaseJob` and `RunningJob<>` with the
simple `Job` type.
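Usage ends up looking something like this (assuming `Job` is imported from `payload` like the types it replaces; the `'sendEmail'` slug is hypothetical, and the exact generic shape may differ):
```ts
import type { Job } from 'payload'

// With generated types present, the optional generic narrows `job.input`
// to the given task/workflow slug; without them it falls back to BaseJob-style typing.
const logJob = (job: Job<'sendEmail'>) => {
  console.log(job.id, job.input)
}

// Without the generic it can be used anywhere a BaseJob / RunningJob<> was used before.
const logAnyJob = (job: Job) => console.log(job.id)
```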
Additionally, this PR simplifies some of the logic used to run jobs