If you (using the MongoDB adapter) delete a block from the Payload
config but still have documents containing that block in the DB, you'd
receive an error like this in the admin panel:
```
Block with type "cta" was found in block data, but no block with that type is defined in the config for field with schema path pages.blocks
```
Now, we remove those "unknown" blocks at the DB adapter level.
Co-authored-by: Dan Ribbens <dan.ribbens@gmail.com>
### What
This PR updates the import-export plugin's `<Preview />` component to
render table columns and rows using the same logic as the CSV export.
Key changes:
- Adds a new `/api/preview-data` custom REST endpoint that:
  - Accepts filters (`fields`, `where`, `sort`, `draft`, `limit`)
  - Uses `getCustomFieldFunctions` and `flattenObject` to transform documents
  - Returns deeply flattened rows identical to the CSV export
- Refactors the `<Preview />` component to:
  - POST the preview config to the new endpoint instead of querying the collection directly
  - Match column ordering and flattening logic with the `createExport` function
- Ensures consistency across CSV downloads and in-admin previews
- Adds JSON preview
This ensures preview results now exactly match exported CSV content,
including support for custom field transformers and polymorphic fields.
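For illustration, a hedged sketch of what a request to the new endpoint could look like (the request body and response shape here are assumptions, not the plugin's documented API):
```ts
// Hypothetical call from the admin Preview component to the new endpoint.
const res = await fetch('/api/preview-data', {
  body: JSON.stringify({
    collectionSlug: 'pages', // illustrative collection
    draft: false,
    fields: ['title', 'author'],
    limit: 100,
    sort: '-createdAt',
    where: { _status: { equals: 'published' } },
  }),
  headers: { 'Content-Type': 'application/json' },
  method: 'POST',
})

// Rows come back already flattened, matching the CSV export output.
const { rows } = await res.json()
console.table(rows)
```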
---------
Co-authored-by: Dan Ribbens <dan.ribbens@gmail.com>
### What?
This PR solves an issue with validation of the `point` field in Payload
CMS. If the value is `null` and the field is not required, the
validation will return `true` before trying to examine the contents of
the field.
### Why?
If the point field is given a value, and saved, it is then impossible to
successfully "unset" the point field, either through the CMS UI or
through a hook like `beforeChange`. Trying to do so will throw this
error:
```
[17:09:41] ERROR: Cannot read properties of null (reading '0')
err: {
"type": "TypeError",
"message": "Cannot read properties of null (reading '0')",
"stack":
TypeError: Cannot read properties of null (reading '0')
at point (webpack-internal:///(rsc)/./node_modules/.pnpm/payload@3.43.0_graphql@16.10.0_typescript@5.7.3/node_modules/payload/dist/fields/validations.js:622:40)
```
because a value of `null` will not be changed to the default value of
`['','']`, which in any case does not pass MongoDB validation either.
```
[17:22:49] ERROR: Cast to [Number] failed for value "[ NaN, NaN ]" (type string) at path "location.coordinates.0" because of "CastError"
err: {
"type": "CastError",
"message": "Cast to [Number] failed for value \"[ NaN, NaN ]\" (type string) at path \"location.coordinates.0\" because of \"CastError\"",
"stack":
CastError: Cast to [Number] failed for value "[ NaN, NaN ]" (type string) at path "location.coordinates.0" because of "CastError"
at SchemaArray.cast (webpack-internal:///(rsc)/./node_modules/.pnpm/mongoose@8.15.1_@aws-sdk+credential-providers@3.778.0/node_modules/mongoose/lib/schema/array.js:414:15)
```
### How?
This adds a check to the top of the `point` validation function and
returns early before trying to examine the contents of the point field.
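A minimal sketch of the early return (simplified signature; the real validator receives Payload's full validation options):
```ts
// Simplified point validation showing the added guard.
export const point = (
  value: unknown,
  { required }: { required?: boolean },
): string | true => {
  // New: optional fields with no value pass validation immediately,
  // so we never index into a null/undefined coordinate pair.
  if (!required && (value === null || typeof value === 'undefined')) {
    return true
  }

  // ...the existing coordinate checks continue below...
  return true
}
```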
---------
Co-authored-by: Dave Ryan <dmr@Daves-MacBook-Pro.local>
### What?
Set the `limit` query param on API requests called within the
`useLivePreview` hook.
### Why?
We are heavily relying on the block system in our pages and we reuse the
media collection in a lot of the block types. When the page has more
than 10 images, the API request doesn't fetch all of them for live
preview due to the default 10 item `limit`. This PR allows the preview
page to override this `limit` so that all the items get correctly
fetched.
### Our current workaround
Set the `depth` param of `useLivePreview` hook like this:
```
useLivePreview({
  // ...
  depth: '1000&limit=1000',
})
```
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1210643905956939
---------
Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
Needed for #12860.
If the admin panel broadcasts foreign postMessage events, i.e. those
without the `payload-live-preview` signature, client-side live preview
subscriptions will reset back to initial state.
This is because we dispatch two postMessage events in the admin panel,
one for client-side live preview to catch (`payload-live-preview`), and
the other for server-side live preview (`payload-document-event`). This
was not previously noticeable because both events would only get called
simultaneously on initial render, where initial state is already the
expected result.
Now that Live Preview can be freely toggled on and off, both events are
dispatched frequently and visibly disregard the current working state.
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1210628466702818
Needed for #12860.
The client config unnecessarily omits the `livePreview.collections` and
`livePreview.globals` properties. This is because the root live preview
config extends the type with these two additional properties without
sharing it elsewhere. This led to the client sanitization function
overlooking these additional properties, as there was no type indication
that they exist.
The `collections` and `globals` properties are now appended to the
client config as expected, and the root live preview is standardized
behind the `RootLivePreviewConfig` type to ensure no properties are
lost.
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1210628466702823
This PR adds int tests with vitest and e2e tests with playwright
directly into our templates.
The following are also updated:
- bumps core turbo to 2.5.4 in monorepo
- blank and website templates moved up to be part of the monorepo
workspace
- this means these templates are now filtered out in pnpm commands in
package.json
- they will now by default use workspace packages which we can use for
manual testing and int and e2e tests
- note that turbo doesn't work with these for dev in the monorepo context
- CPA script will fetch latest version and then replace `workspace:*` or
the pinned version in the package.json before installation
- blank template no longer uses _template as a base, this is to simplify
management for workspace
- updated the generate template variations script
Partially closes #12121.
When you edit a document in Live Preview using the default iframe
window, then attempt to open the window as a popup, the
`LeaveWithoutSaving` modal will appear.
This is because the `usePreventLeave` hook watches for anchor tags that
might cause a page navigation, and rightfully warns the user before they
navigate away and lose their changes. The reason the popup button
triggers this hook is because it uses an anchor tag with an href for
accessibility, which fires events that are caught and processed by the
hook.
The fix is to add the `target="_blank"` attribute here so that the hook
understands that these events do not navigate the user away from the
page and can be ignored.
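A tiny illustrative sketch of the change (component and prop names here are made up, not the actual admin source):
```tsx
import React from 'react'

// The popup toggle keeps its anchor for accessibility, but `target="_blank"`
// tells the leave-prevention hook that clicking it won't navigate this page away.
export const PopupToggle: React.FC<{ url: string }> = ({ url }) => (
  <a href={url} rel="noopener noreferrer" target="_blank">
    Open in popup
  </a>
)
```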
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1210643905956946
### What?
Fixes #12811
### Why?
Custom Views become unreachable when the admin route is set to "/" because
the forward slash of the current route gets removed before routing to
the custom view.
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1210582760545830
---------
Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
This PR fixes an issue in the export logic where CSV downloads would
include duplicate rows and repeated column headers across paginated
batches.
Key changes:
- Ensured `page` is incremented correctly after each `payload.find` call
- Tracked and wrote CSV column headers only once for the first page
- Prevented row duplication by removing unused `result` initialization
and using isolated `page` tracking
- Streamlined both download and non-download logic for consistent batch
processing
This resolves incorrect row counts and header duplication in large CSV
exports.
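A simplified sketch of the corrected batching loop (helper names like `writeHeaders` and `writeRows` are illustrative, not the plugin's actual internals):
```ts
import type { Payload } from 'payload'

// Helper stubs for illustration only.
declare const payload: Payload
declare function writeHeaders(docs: unknown[]): void
declare function writeRows(docs: unknown[]): void

const exportAllPages = async (): Promise<void> => {
  let page = 1
  let hasNextPage = true
  let headersWritten = false

  while (hasNextPage) {
    const batch = await payload.find({
      collection: 'pages', // hypothetical collection
      limit: 100,
      page,
    })

    if (!headersWritten) {
      writeHeaders(batch.docs) // column headers written once, for the first page only
      headersWritten = true
    }

    writeRows(batch.docs) // append this batch's rows only

    hasNextPage = batch.hasNextPage
    page += 1 // incremented after every payload.find call so no page is read twice
  }
}
```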
### What?
Fixes CSV export support for polymorphic relationship and upload fields.
### Why?
Polymorphic fields in Payload use a `{ relationTo, value }` structure.
The previous implementation incorrectly accessed `.id` directly on the
top-level object, which caused issues depending on query depth or data
shape. This led to missing or invalid values in exported CSVs.
### How?
- Updated `getCustomFieldFunctions` to safely access `relationTo` and
`value.id` from polymorphic fields
- Ensured `hasMany` polymorphic fields export each related ID and
`relationTo` as separate CSV columns
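A hedged sketch of the safer access pattern (column naming and helper shape are illustrative):
```ts
// Polymorphic relationship values look like { relationTo, value }, where
// `value` is either an ID or a populated document depending on query depth.
type PolymorphicValue = { relationTo: string; value: string | { id: string } }

const toCSVColumns = (
  fieldName: string,
  raw: null | PolymorphicValue | PolymorphicValue[],
): Record<string, unknown> => {
  const row: Record<string, unknown> = {}
  const items = raw == null ? [] : Array.isArray(raw) ? raw : [raw]

  items.forEach((item, i) => {
    // Safely resolve the ID whether or not the relation was populated.
    const id = typeof item.value === 'object' ? item.value?.id : item.value
    row[`${fieldName}_${i}_relationTo`] = item.relationTo
    row[`${fieldName}_${i}_id`] = id
  })

  return row
}
```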
### What?
Reflects any access control restrictions applied to Auth fields in the
UI. I.e. if `email` has `update: () => false` the field should be
displayed as read-only.
### Why?
Currently any access control that is applied to auth fields is
functional but is not matched within the UI.
For example:
- A `password` field without read access will not return data, but the
field is still shown when it should be hidden
- For an `email` field without update access, editing the field and
saving the doc will **not** update the data; the field should be
displayed as read-only so nothing can be filled out and the update
restriction is made clear
### How?
Passes field permissions through to the Auth fields UI and adds docs
with instructions on how to override auth field access.
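A hedged sketch of what overriding auth field access might look like (whether a same-named `email` field definition replaces the injected base auth field is an assumption here; the docs added by this PR describe the supported approach):
```ts
import type { CollectionConfig } from 'payload'

export const Users: CollectionConfig = {
  slug: 'users',
  auth: true,
  fields: [
    {
      name: 'email', // assumed to override the injected auth email field
      type: 'email',
      required: true,
      access: {
        // With this change, the admin UI renders the field as read-only.
        update: () => false,
      },
    },
  ],
}
```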
#### Testing
Use `access-control` test suite and `auth` collection. Tests added to
`access-control` e2e.
Fixes #11569
### What?
This fix prevents custom row labels being removed when duplicating array
items.
### Why?
Currently, when you have an array with custom row labels, if you create
a new array item by duplicating an existing item, the new item will have
no custom row label until you refresh the page.
### How?
During the `duplicate` process, we remove any react components from the
field state. This change intentionally re-adds the `RowLabel` if one
exists.
#### Reported by client
### What?
Ensure fields whose custom `toCSV` function returns `undefined` are
excluded from the exported CSV.
### Why?
Previously, when a `toCSV` function returned `undefined`, the field key
would still be added to the export row. This caused the column to appear
in the CSV output with an empty string value (`""`), leading to
unexpected results and failed assertions in tests expecting the field to
be truly omitted.
### How?
Updated the `flattenObject` utility to:
- Check if the value returned by a `toCSV` function is `undefined`
- Only assign the value to the export row if it is explicitly defined
- Applied this logic in all relevant paths (arrays, objects, primitives)
This change ensures that fields are only included in the CSV when a
meaningful value is returned.
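A minimal sketch of the guard (the real `flattenObject` also walks nested arrays and objects; the `toCSV` argument shape here is simplified):
```ts
// Assign a column only when the transformer produced a defined value.
const assignColumn = (
  row: Record<string, unknown>,
  key: string,
  value: unknown,
  toCSV?: (args: { value: unknown }) => unknown,
): void => {
  const result = toCSV ? toCSV({ value }) : value

  if (typeof result !== 'undefined') {
    row[key] = result // undefined results are skipped, so no empty "" column appears
  }
}
```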
Because of this check, if a JSON value with a `target` property was
saved, it would become malformed.
For example trying to save a JSON field:
```json
{
"target": {
"value": {
"foo": "bar"
}
}
}
```
would result in:
```json
{
"foo": "bar"
}
```
And trying to save:
```json
{
"target": "foo"
}
```
would just not save anything:
```json
null
```
I went through all of the field types and did not find a single one that
relies on this ternary. It seems it always defaulted to `const val = e`,
except in the unexpected case described previously.
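For context, a hedged reconstruction of the removed pattern (the exact original expression isn't shown in this description, so the "before" shape is an assumption):
```ts
// Before (assumed shape): values with a `target` property were treated like DOM
// change events, so `{ target: { value: { foo: 'bar' } } }` collapsed to
// `{ foo: 'bar' }` and `{ target: 'foo' }` collapsed to undefined.
const getValueBefore = (e: any): unknown => (e?.target ? e.target.value : e)

// After: the incoming value is used as-is.
const getValueAfter = (e: any): unknown => e

console.log(getValueBefore({ target: { value: { foo: 'bar' } } })) // { foo: 'bar' }
console.log(getValueAfter({ target: { value: { foo: 'bar' } } })) // the full object, unchanged
```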
Fixes #12873
The added test may be overkill; will remove if so.
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1210628466702813
---------
Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
### What?
This PR fixes a runtime error that occurs when opening the "More
versions..." drawer while browsing the versions for a global. It also
fixes a minor runtime error when navigating to a global version view,
where an optional chaining operator was missing and the collection
variable is undefined because we are viewing a global.
This PR also adds an e2e test to ensure the versions drawer is
accessible and renders the appropriate number of versions for globals.
### Why?
To properly render global version views without errors.
### How?
By threading the global slug to the versions drawer and adjusting some
properties of the `renderDocument` server function call there. This PR
also adds an optional chaining operator to `versionUseAsTitle` in the
original view to prevent an error in globals.
Notes:
- This was brought to my attention in Discord by a handful of users
Before: (Missing optional chaining error)
[error1-verions-Editing---Menu---Payload.webm](https://github.com/user-attachments/assets/3dc4dbe4-ee5a-43df-8d25-05128b05e063)
Before: (Versions drawer error)
[error2-versions-Editing---Menu---Payload.webm](https://github.com/user-attachments/assets/98c3e1da-cb0b-4a36-bafd-240f641e8814)
After:
[versions-globals-Dashboard---Payload.webm](https://github.com/user-attachments/assets/c778d3f0-a8fe-4e31-92cb-62da8e6d8cb4)
Fixes an issue when deeply querying the new relationship virtual fields
with `draft: true`. Changes the method of `where` sanitization:
previously it was done in `validateSearchParam`, which didn't work
properly with versions; now a separate `sanitizeWhereQuery` function
handles it.
This PR removes the `packages/payload/src/assets` folder for the
following reasons:
- they were published to npm. Removing this decreases the install size
of payload (excluding dependencies) from 6.22MB => 5.12MB
- most assets were unused. The only used ones were moved to a different
directory that does not get published to npm
This also updates some outdated asset URLs in our examples
The `@payloadcms/next/auth` functions are unnecessarily wrapped with
`try...catch` blocks that propagate the original error as a plain
string. This makes it impossible for the end user's error handling to
differentiate between error types.
These functions also throw errors regardless, and therefore must be
wrapped with proper error handling anyway. Especially after removing the
internal logging in #12881, these blocks do not serve any purpose.
This PR also removes unused imports.
### What?
Removes the console.error() statement when there is a login error.
### Why?
IMO, libraries should not pollute the console with log statements in all
but the most exceptional cases. This prevents users of the library from
controlling what goes to standard out. For example, if I want to use
structured logging, this log line breaks it.
It would be a little better if this console.error() only executed on
unexpected errors, but it executes even when a user puts the wrong email
/ password, so it gets printed relatively frequently.
I think you can just remove the logging and let the user of this
function catch the error and log as they see fit.
---------
Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
### What?
I found that the `devBundleServerPackages` parameter does not show up in
the documentation because the JSDoc parameter was spelled `sortOnOptions`.
### Why?
`devBundleServerPackages` cannot be found using VS Code IntelliSense.
### How?
I simply changed `sortOnOptions` back to `options` in the JSDoc comments.
Fixes https://github.com/payloadcms/payload/issues/12847
- Uses rem instead of em for inline padding, for indent consistency
between nodes with different font sizes
- Uses rem instead of px in deprecated HTML converters for consistency
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1210564720112211
### What?
After creating a new document from a relationship field, this doc should
automatically become the selected document for that relationship field.
This is the expected and current behavior. However, when the
relationship ties to a collection with autosave enabled, this does not
happen.
### Why?
This is expected behavior and should still happen when the relationship
is using an autosave enabled collection.
### How?
1. The logic in `addNewRelation` contained an `if` statement that
checked for `operation === 'create'` - however when autosave is enabled,
the `create` operation runs on the first data update and subsequently it
is an `update` operation.
2. The `onSave` from the document drawer provider was not being run as
part of the autosave workflow.
#### Reported by client.
This simplifies workflow / task error handling, as well as cancelling
jobs. Previously, we were handling errors when they occur and passing
through error state using a `state` object - errors were then handled in
multiple areas of the code.
This PR adds new, clean `TaskError`, `WorkflowError` and
`JobCancelledError` errors that are thrown when they occur and are
handled **in one single place**, massively cleaning up complex functions
like
[payload/src/queues/operations/runJobs/runJob/getRunTaskFunction.ts](https://github.com/payloadcms/payload/compare/refactor/jobs-errors?expand=1#diff-53dc7ccb7c8e023c9ba63fdd2e78c32ad0be606a2c64a3512abad87893f5fd21)
Performance is also improved by this change. Previously, a task /
workflow failure or cancellation would have resulted in multiple,
separate `updateJob` db calls, as the data modifications to the job
object required for storing failure state were done multiple times in
multiple areas of the codebase. Most notably, task error state was
handled and updated separately from workflow error state.
Now, it's just a clean, single `updateJob` call.
This PR also does the following:
- adds a new test for `deleteJobOnComplete` behavior
- cleans up test suite
- ensures `deleteJobOnComplete` does not delete definitively failed jobs
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1210553277813320
### What?
This update improves the flexibility of the ConfirmationModal by
allowing the `heading` and `body` props to accept either a string or a
React node.
If the `heading` or `body` is a string, it will be wrapped in its
respective tags for consistent styling.
If it's already a React element, it will be rendered as-is. This
prevents layout issues when passing JSX content like lists, links, or
formatted elements into the modal heading and body.
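A simplified sketch of the idea (not the component's actual markup; the wrapper tags are illustrative):
```tsx
import React from 'react'

type Props = {
  body: React.ReactNode
  heading: React.ReactNode
}

// Strings get wrapped for consistent styling; React nodes render as-is.
export const ConfirmationContent: React.FC<Props> = ({ body, heading }) => (
  <div>
    {typeof heading === 'string' ? <h1>{heading}</h1> : heading}
    {typeof body === 'string' ? <p>{body}</p> : body}
  </div>
)
```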
Adds the ESLint rule `no-restricted-exports` with value `off` for
Payload config files inside our `test` suite, since we have to use
default exports from those files.
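One way such an override might look in a flat ESLint config (the file globs are illustrative):
```ts
// eslint.config.js (illustrative excerpt)
export default [
  // ...the existing shared config...
  {
    files: ['test/**/config.ts', 'test/**/payload.config.ts'], // hypothetical globs
    rules: {
      // test payload configs must use `export default`, so allow it here
      'no-restricted-exports': 'off',
    },
  },
]
```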
Previously, there were multiple ways to type a running job:
- `GeneratedTypes['payload-jobs']` - only works in an installed project
- is `any` in monorepo
- `BaseJob` - works everywhere, but does not incorporate generated types
which may include type for custom fields added to the jobs collection
- `RunningJob<>` - more accurate version of `BaseJob`, but same problem
This PR deprecates all those types in favor of a new `Job` type.
Benefits:
- Works in both monorepo and installed projects. If no generated types
exist, it will automatically fall back to `BaseJob`
- Comes with an optional generic that can be used to narrow down
`job.input` based on the task / workflow slug. No need to use a separate
type helper like `RunningJob<>`
With this new type, I was able to replace every usage of
`GeneratedTypes['payload-jobs']`, `BaseJob` and `RunningJob<>` with the
simple `Job` type.
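A hedged usage sketch (only the `Job` type name comes from this PR; the import path is an assumption):
```ts
import type { Job } from 'payload' // assumed import path

// Works in the monorepo and in installed projects alike; falls back to
// `BaseJob` when no generated types exist.
const logJob = (job: Job): void => {
  console.log(job.id, job.input)
}
```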
Additionally, this PR simplifies some of the logic used to run jobs
Continuation of https://github.com/payloadcms/payload/pull/6245.
This PR allows you to pass `blocksAsJSON: true` to SQL adapters; instead
of aligning with the relational approach preferred for blocks in SQL,
the adapter will then use a simple JSON column, which can improve
performance with a large number of blocks.
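For illustration, a hedged sketch with the Postgres adapter (only the `blocksAsJSON` option comes from this PR; the rest is standard adapter setup):
```ts
import { postgresAdapter } from '@payloadcms/db-postgres'

export const db = postgresAdapter({
  blocksAsJSON: true, // store blocks in a JSON column instead of relational tables
  pool: {
    connectionString: process.env.DATABASE_URI,
  },
})
```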
To try these changes you can install `3.43.0-internal.c5bbc84`.
### What?
Adds `constructorOptions` property to the upload config to allow any of
[these options](https://sharp.pixelplumbing.com/api-constructor/) to be
passed to the Sharp library.
### Why?
Users should be able to extend the Sharp library config as needed, to
define useful properties like `limitInputPixels` etc.
### How?
Creates new config option `constructorOptions` which passes any
compatible options directly to the Sharp library.
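A hedged example of what this could look like in a collection's upload config (`limitInputPixels` is one of Sharp's constructor options; the exact option placement follows the description above):
```ts
import type { CollectionConfig } from 'payload'

export const Media: CollectionConfig = {
  slug: 'media',
  fields: [],
  upload: {
    // Passed straight through to the Sharp constructor
    constructorOptions: {
      limitInputPixels: 1_000_000_000, // raise Sharp's default input pixel limit
    },
  },
}
```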
#### Reported by client.
Fixes https://github.com/payloadcms/payload/issues/12776
- Adds a new `jobs.config.enabled` property to the sanitized config,
which can be used to easily check if the jobs system is enabled (i.e.,
if the payload-jobs collection was added during sanitization). This is
then checked before Payload sets up job autoruns.
- Fixes some type issues that occurred due to still deep-requiring the
jobs config - we forgot to omit it from the `DeepRequired<Config>` call.
The deep-required jobs config was then incorrectly merged with the
sanitized jobs config, resulting in a SanitizedConfig where all jobs
config properties were marked as required, even though they may be
undefined.
The custom save button was showing on export-enabled collections in the
edit view, but it was only meant to appear in the export collection.
This change makes it so it only appears in the export drawer.
The type declaration extending `custom['plugin-import-export']` was
incorrectly named `toCSVFunction` instead of `toCSV`. This changes the
type to match the correct property name, `toCSV`.