# Fixing invalid html from convertLexicalToHTML with UploadConverter
## What?
When using the [convertLexicalToHTML
function](https://github.com/payloadcms/payload/blob/main/packages/richtext-lexical/src/features/converters/lexicalToHtml/sync/index.ts#L33C17-L33C37)
to convert richtext into html, the resulting html is incorrect when an
upload is present.
## Why?
The error is within the UploadHTMLConverter, as you can see in my
commits.
The picture closing tag contains a dollar sign (`</picture$>`), which is
invalid HTML. This results in the picture not closing at the right
place. The picture element is not only wrapping the img tag, but also
all following content of that richtext field.
> I suppose that dollar sign ended up there due to the IDE's auto-rename-tags feature.
```ts
import { convertLexicalToHTML } from "@payloadcms/richtext-lexical/html";
const html = convertLexicalToHTML({ data: content });
```
**Example before fix**
```html
<div class="payload-richtext">
<h2>Title</h2>
<p>description</p>
<picture>
<img alt="media.png" height="400" src="https://some.url/storage/v1/object/public/media/media.png" width="684">
<p>Some more text</p>
</picture>
</div>
```
**Example after fix**
```html
<div class="payload-richtext">
<h2>Title</h2>
<p>description</p>
<picture>
<img alt="media.png" height="400" src="https://some.url/storage/v1/object/public/media/media.png" width="684">
</picture>
<p>Some more text</p>
</div>
```
This PR fixes two bugs in the version diff view SetStepNav component
### Bug 1: Document title isn't shown correctly in the step navigation
if the field of `useAsTitle` is nested inside a presentational field.
The StepNav shows the title of the document consistently throughout
every view except the version diff view. In the version diff view, the
document title is always `[Untitled]` if the field of `useAsTitle` is
nested inside presentational fields. Below is a video demo of the bug:
https://github.com/user-attachments/assets/23cb140a-b6d3-4d39-babf-5e4878651869
This happens because the fields of the collection/global aren't
flattened inside SetStepNav and thus the component is not accessing the
field data correctly. This results in the title being `null` causing the
fallback title to be shown.
### Bug 2: Step navigation shows the title of the version viewed, not
the current version
The StepNav component takes its title from the version currently being viewed.
This causes the second part of the navigation path to change between
versions, which is inconsistent with other views and doesn't seem
intentional, although it could be. Below is a video of the bug with the
first bug fixed by flattening the fields:
https://github.com/user-attachments/assets/e5beb9b3-8e2e-4232-b1e5-5cce720e46b9
This happens due to the fact that the title is taken from the
`useAsTitle` field of the **viewed** version rather than the **current**
version. This bug is fixed by using the `useDocumentTitle` hook from the
ui package instead of passing the version's `useAsTitle` data down the
component tree. The final state of the step navigation is shown in the
following video:
https://github.com/user-attachments/assets/a69d5088-e7ee-43be-8f47-d9775d43dde9
I also added a test verifying that the title part of the step navigation
stays consistent between versions; it implicitly also tests that the
document title is shown correctly in the step nav when the field of
`useAsTitle` is nested inside a presentational field.
Root level collection folders should not display items, only folders.
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1210821658736793
Adds the `admin.autoRefresh` property to the root config. This allows
users to stay logged in, with their token refreshing automatically in the
background, without being prompted by the "Stay Logged In?" modal.
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211114366468735
Previously, we were sending incorrect API requests for globals.
Additionally, the global findOne endpoint did not respect the data
attribute.
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211249618316704
### What?
Set `X-Payload-HTTP-Method-Override` as an allowed cross-origin header
### Why?
As of #13619, live preview uses the POST method to retrieve updated data.
When the CMS and the website run on different origins, cross-origin
requests require the `X-Payload-HTTP-Method-Override` header to be allowed.
### How?
Adds `X-Payload-HTTP-Method-Override` as a default allowed header.
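Before this change, the header had to be allowed manually, e.g. via the object form of the `cors` config. A sketch of that workaround (no longer necessary now that the header is allowed by default):
```ts
import { buildConfig } from 'payload'

export default buildConfig({
  // ...
  cors: {
    origins: ['https://my-frontend.example.com'],
    // Previously needed as a manual workaround; now allowed by default.
    headers: ['x-payload-http-method-override'],
  },
})
```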
This release upgrades the lexical dependency from 0.34.0 to 0.35.0.
If you installed lexical manually, update it to 0.35.0. Installing
lexical manually is not recommended, as it may break between updates,
and our re-exported versions should be used. See the [yellow banner
box](https://payloadcms.com/docs/rich-text/custom-features) for details.
If you still encounter richtext-lexical errors, do the following, in
this order:
- Delete node_modules
- Delete your lockfile (e.g. pnpm-lock.yaml)
- Reinstall your dependencies (e.g. pnpm install)
### Lexical Breaking Changes
The following Lexical releases describe breaking changes. We recommend
reading them if you're using Lexical APIs directly
(`@payloadcms/richtext-lexical/lexical/*`).
- [v0.35.0](https://github.com/facebook/lexical/releases/tag/v0.35.0)
Alternative solution to
https://github.com/payloadcms/payload/pull/11104. Big thanks to
@andershermansen and @GermanJablo for kickstarting work on a solution
and bringing this to our attention. This PR copies over the live-preview
test suite example from their PR.
Fixes https://github.com/payloadcms/payload/issues/5285,
https://github.com/payloadcms/payload/issues/6071 and
https://github.com/payloadcms/payload/issues/8277. Potentially fixes
#11801
This PR completely gets rid of our client-side live preview field
traversal + population and all logic related to it, and instead lets the
findByID endpoint handle it.
The data sent through the live preview message event is now passed to
findByID via the newly added `data` attribute. The findByID endpoint
will then use this data and run hooks on it (which run population),
instead of fetching the data from the database.
This new API basically behaves like a `/api/populate?data=` endpoint,
with the benefit that it runs all the hooks. Another use-case for it
will be rendering lexical data. Sometimes you may only have unpopulated
data available. This functionality allows you to then populate the
lexical portion of it on-the-fly, so that you can properly render it to
JSX while displaying images.
## Benefits
- a lot less code to maintain. No duplicative population logic
- much faster - one single API request instead of one request per
relationship to populate
- all payload features are now correctly supported (population and
hooks)
- since hooks are now running for client-side live preview, this means
the `lexicalHTML` field is now supported! This was a long-running issue
- this fixes a lot of population inconsistencies that we previously did
not know of. For example, it previously populated lexical and slate
relationships even if the data was saved in an incorrect format
## [Method Override
(POST)](https://payloadcms.com/docs/rest-api/overview#using-method-override-post)
change
The population request to the findByID endpoint is sent as a POST
request, so that we can pass through the `data` without having to
squeeze it into the URL params. To do that, it uses the
`X-Payload-HTTP-Method-Override` header.
Previously, this functionality still expected the data to be sent
through as URL search params - just passed to the body instead of the
URL. In this PR, I made it possible to pass it as JSON instead. This
means:
- the receiving endpoint will receive the data under `req.data` and is
not able to read it from the search params
- this means existing endpoints won't support this functionality unless
they also attempt to read from req.data.
- for the purpose of this PR, the findByID endpoint was modified to
support this behavior. This functionality is documented as it can be
useful for user-defined endpoints as well.
Passing data as JSON has the following benefits:
- it's more performant - no need to serialize and deserialize data to
search params via `qs-esm`. This is especially important here, as we are
passing large amounts of json data
- the previous implementation serialized the data incorrectly, leading to
incorrect data within nested lexical nodes
**Note for people passing their own live preview `requestHandler`:**
instead of sending a GET request to populate documents, you will now
need to send a POST request to the findByID endpoint and pass additional
headers. Additionally, you will need to send through the arguments as
JSON instead of search params and include `data` as an argument. Here is
the updated defaultRequestHandler for reference:
```ts
const defaultRequestHandler: CollectionPopulationRequestHandler = ({
apiPath,
data,
endpoint,
serverURL,
}) => {
const url = `${serverURL}${apiPath}/${endpoint}`
return fetch(url, {
body: JSON.stringify(data),
credentials: 'include',
headers: {
'Content-Type': 'application/json',
'X-Payload-HTTP-Method-Override': 'GET',
},
method: 'POST',
})
}
```
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211124793355068
- https://app.asana.com/0/0/1211124793355066
### What?
Encode the cacheTag in the ImageMedia component
### Why?
In the website template, media is rendered using the `updatedAt` field as a
cacheTag. This value causes an InvalidQueryStringException when
deploying to CloudFront.
### How?
Uses encodeURIComponent to encode the date value of `updatedAt`.
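A minimal sketch of the idea (`mediaUrl` and `updatedAt` are illustrative values; the template's actual component code differs):
```ts
const mediaUrl = 'https://example.com/media/image.png'
const updatedAt = '2024-05-01T12:34:56.000Z'

// Colons and other reserved characters are percent-encoded,
// so the query string stays valid behind CloudFront.
const src = `${mediaUrl}?${encodeURIComponent(updatedAt)}`
// https://example.com/media/image.png?2024-05-01T12%3A34%3A56.000Z
```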
Fixes https://github.com/payloadcms/payload/issues/13557
### What?
- Fixed an issue where virtual relationship fields (`virtual: string`,
e.g. `post.title`) could not be used in the WhereBuilder filter
dropdown.
- Ensured regular virtual fields (`virtual: true`) are still excluded
since they are not backed by database fields.
### Why?
Previously, attempting to filter by a virtual relationship caused
runtime errors because the field was treated as a plain text field. At
the same time, non-queryable virtuals needed to remain excluded to avoid
invalid queries.
### How?
- Updated `reduceFieldsToOptions` to recognize `virtual: string` fields
and map them to their underlying path while keeping their declared type
and operators.
- Continued to filter out `virtual: true` fields in the same guard used
for hidden/disabled fields (see the sketch below).
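A rough sketch of that guard (not the exact `reduceFieldsToOptions` code):
```ts
// Resolve the path a field should be filtered on, or null to exclude it.
const getFilterPath = (field: {
  name: string
  virtual?: boolean | string
}): null | string => {
  if (field.virtual === true) return null // not backed by the database - exclude
  if (typeof field.virtual === 'string') return field.virtual // e.g. 'post.title'
  return field.name
}
```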
### What?
In the create-first-user view, fields like `richText` were being marked
as `readOnly: true` because they had no permissions entry in the
permissions map.
### Why?
The view was passing an incomplete `docPermissions` object.
When a field had no entry in `docPermissions.fields`, `renderField`
received `permissions: undefined`, which was interpreted as denied
access.
This caused fields (notably `richText`) to default to read-only even
though the user should have full access when creating the first user.
### How?
- Updated the create-first-user view to always pass a complete
`docPermissions` object.
- Default all fields in the user collection to `{ create: true, read:
true, update: true }`.
- Ensures every field is explicitly granted full access during the
first-user flow.
- Keeps the `renderField` logic unchanged and aligned with Payload’s
permission model.
Fixes #13612
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211211792037939
Edited by @jacobsfletch
Autosave is run with default depth, causing form state and the document
info context to be inconsistent across the load/autosave/save lifecycle.
For example, when loading the app, the document is queried with `depth:
0` and relationships are _not_ populated. Same with save/draft/publish.
Autosave, on the other hand, populates these relationships, causing the
data to temporarily change in shape in between these events.
Here's an example:
https://github.com/user-attachments/assets/153ea112-7591-4f54-9216-575322f4edbe
Now, autosave runs with `depth: 0` as expected, consistent with the
other events of the document lifecycle.
**Plus this is a huge performance improvement.**
Fixes #13643 and #13192.
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211224908421570
---------
Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
### What?
This PR updates the folder configuration types to explicitly omit the
`trash` option from `CollectionConfig` when used for folders.
### Why?
Currently, only documents are designed to support trashing. Allowing
folders to be trashed introduces inconsistencies, since removing a
folder does not properly remove its references from associated
documents.
By omitting `trash` from folder configurations, we prevent accidental
use of unsupported behavior until proper design and handling for folder
trashing is implemented.
### How?
- Updated `RootFoldersConfiguration.collectionOverrides` type to use
`Omit<CollectionConfig, 'trash'>`
- Ensures `trash` cannot be enabled on folders
### What?
Adds a new `experimental.localizeStatus` config option, set to `false`
by default. When `true`, the admin panel will display the document
status based on the *current locale* instead of the _latest_ overall
status. Also updates the edit view to only show a `changed` status when
`autosave` is enabled.
### Why?
Showing the status for the current locale is more accurate and useful in
multi-locale setups. This update will become the default behavior; for now
it can be opted into by setting `experimental.localizeStatus: true` in the
Payload config. The option will be deprecated in V4.
### How?
When `localizeStatus` is `true`, we store the localized status in a new
`localeStatus` field group within version data. The admin panel then
reads from this field to display the correct status for the current
locale.
---------
Co-authored-by: Jarrod Flesch <jarrodmflesch@gmail.com>
Fixes https://github.com/payloadcms/payload/issues/10379
During form state requests, the passed `data` did not have access to the
document ID. This was because the data we use came from the client,
passed as an argument. The client did not pass data that included the
document ID.
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211203844178567
### What?
Adds comprehensive documentation for the RadioField component in the
form builder plugin documentation.
### Why?
Addresses review feedback from #11908 noting that the RadioField was not
documented after being exported for end-user use.
### How?
- Added RadioField section with complete property documentation
- Added detailed Radio Options sub-section explaining option structure
- Updated Select field documentation for consistency (added missing
placeholder property and detailed options structure)
- Added radio field to configuration example to show all available
fields
Fixes the documentation gap identified in #11908 review comments.
Fixes
https://github.com/payloadcms/payload/issues/13589#issuecomment-3234832478
### Problem
The baseFilter was being applied when a user had no tenant selected and was
assigned to tenants, but also passed the `userHasAccessToAllTenants` check.
### Fix
Do not apply the baseFilter when no tenant is selected and the user
passes the `userHasAccessToAllTenants` check.
### What?
Restores the ability to query by `Date(...)` when using the `sqlite`
adapter.
### Why?
Queries involving JavaScript `Date`s return incorrect results because
they are not converted to ISO 8601 strings by the `sqlite` adapter.
### How?
Wraps comparison operators to convert `Date` parameters to ISO 8601
strings.
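A minimal sketch of the idea (not the adapter's exact code):
```ts
// SQLite stores timestamps as text, so Date parameters are converted
// to ISO 8601 strings before being compared.
const toSqliteValue = (value: unknown): unknown =>
  value instanceof Date ? value.toISOString() : value

toSqliteValue(new Date('2024-01-01')) // '2024-01-01T00:00:00.000Z'
```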
Fixes #11692
---------
Co-authored-by: German Jablonski <43938777+GermanJablo@users.noreply.github.com>
### What?
Hides the `Publish in [specific locale]` button on collections and
globals when no localized fields are present.
### Why?
Currently, the `Publish in [specific locale]` button shows on every
collection/global as long as `localization` is enabled in the Payload
config; however, this is not necessary when the collection/global has no
localized fields.
### How?
Checks that localized fields exist prior to rendering the `Publish in
[specific locale]` button.
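A rough sketch of such a check (not the exact implementation used in the UI):
```ts
import type { Field } from 'payload'

// Recursively look for any field marked `localized: true`.
const hasLocalizedFields = (fields: Field[]): boolean =>
  fields.some((field) => {
    if ('localized' in field && field.localized) return true
    if ('fields' in field && Array.isArray(field.fields)) {
      return hasLocalizedFields(field.fields)
    }
    if ('tabs' in field) {
      return field.tabs.some((tab) => hasLocalizedFields(tab.fields))
    }
    return false
  })
```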
Fixes #13447
---------
Co-authored-by: German Jablonski <43938777+GermanJablo@users.noreply.github.com>
### What?
Currently the `DraggableBlockPlugin` and `AddBlockHandlePlugin`
components are automatically applied to every editor. For flexibility
purposes, we want to allow these to be optionally removed when needed.
### Why?
There are scenarios where you may want to enforce certain limitations on
an editor, such as only allowing a single line of text. The draggable
block element and add block button wouldn't play nicely with this
scenario.
Previously, in order to do this, you needed custom CSS to hide the
elements, which still left them technically accessible to end users if the
CSS was removed. This implementation ensures the handles are properly
removed when not wanted.
### How?
Add `hideDraggableBlockElement` and `hideAddBlockButton` options to the
lexical `admin` property. When these are set to `true`, the
`DraggableBlockPlugin` and `AddBlockHandlePlugin` are not rendered to
the DOM.
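For example, assuming the options land on the editor's `admin` property as described:
```ts
import { lexicalEditor } from '@payloadcms/richtext-lexical'

// e.g. an editor intended for a single line of text
export const singleLineEditor = lexicalEditor({
  admin: {
    hideAddBlockButton: true,
    hideDraggableBlockElement: true,
  },
})
```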
Addresses #13636
Reindexing all collections in the search plugin was previously done in
sequence, which is slow and risks timing out under certain network
conditions. By running these requests in parallel, we save
**on average ~80%** in compute time.
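Roughly, the change looks like this (`reindexCollection` is a placeholder, not the plugin's actual API):
```ts
// Illustrative sketch of switching from sequential to parallel reindexing.
declare function reindexCollection(slug: string): Promise<void>

const collections = ['posts', 'pages']

// Before: sequential - slow and prone to timeouts
for (const slug of collections) {
  await reindexCollection(slug)
}

// After: all collections reindexed in parallel
await Promise.all(collections.map((slug) => reindexCollection(slug)))
```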
This test includes reindexing 2000 documents in total across 2
collections, 3 times over. The indexes themselves are relatively simple,
containing only a couple simple fields each, so the savings would only
increase with more complexity and/or more documents.
Before:
Attempt 1: 38434.87ms
Attempt 2: 47852.61ms
Attempt 3: 28407.79ms
Avg: 38231.75ms
After:
Attempt 1: 7834.29ms
Attempt 2: 7744.40ms
Attempt 3: 7918.58ms
Avg: 7832.42ms
Total savings: ~79.51%
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211162504205343
### What?
Fix `JSON` field so that it respects `admin.editorOptions` (e.g.
`tabSize`, `insertSpaces`, etc.), matching the behavior of the `code`
field. Also refactor `CodeEditor` to set indentation and whitespace
options per-model instead of globally.
### Why?
- Previously, the JSON field ignored `editorOptions` and always
serialized with spaces (`tabSize: 2`). This caused inconsistencies when
comparing JSON and code fields configured with the same options.
- Monaco’s global defaults were being overridden in a way that leaked
settings between editors, making per-field customization unreliable.
### How?
- Updated `JSON` field to extract `tabSize` from `editorOptions` and
pass it through consistently when serializing and mounting the editor.
- Refactored CodeEditor to:
- Disable `detectIndentation` globally.
- Apply `insertSpaces`, `tabSize`, and `trimAutoWhitespace` on a
per-model basis inside `onMount` (see the sketch below this list).
- Preserve all other `editorOptions` as before.
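A minimal sketch of that per-model approach, using Monaco's model `updateOptions` API (the surrounding component code is simplified):
```ts
import type { OnMount } from '@monaco-editor/react'

// Hypothetical values taken from the field's admin.editorOptions
const editorOptions = { insertSpaces: true, tabSize: 4, trimAutoWhitespace: true }

const onMount: OnMount = (editor) => {
  // Apply whitespace settings to this editor's model only,
  // rather than mutating Monaco's global defaults.
  editor.getModel()?.updateOptions({
    insertSpaces: editorOptions.insertSpaces,
    tabSize: editorOptions.tabSize,
    trimAutoWhitespace: editorOptions.trimAutoWhitespace,
  })
}
```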
Fixes #13583
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211177100283503
---------
Co-authored-by: Jarrod Flesch <jarrodmflesch@gmail.com>
Fixes https://github.com/payloadcms/payload/issues/13563
When using the nested docs plugin with collections that do not have
drafts enabled, breadcrumbs were not being synced from parent changes.
The root cause was returning early on any document that did not meet
`doc._status !== 'published'` check, which was **_always_** the case for
non-draft collections.
### Before
```ts
if (doc._status !== 'published') {
return
}
```
### After
```ts
if (collection.versions.drafts && doc._status !== 'published') {
return
}
```
Setting `runHooks: true` is already discouraged and will make the job
queue system a lot slower and less reliable. We have no test coverage
for this and it's additional code we need to maintain.
To further discourage using this property, this PR marks it as
deprecated and for removal in 4.0.
Currently, attempting to run tasks in parallel will result in DB errors.
## Solution
The problem was caused due to inefficient db update calls. After each
task completes, we need to update the log array in the payload-jobs
collection. On postgres, that's a different table.
Currently, the update works the following way:
1. Nuke the table
2. Re-insert every single row, including the new one
This will throw db errors if multiple processes start doing that.
Additionally, due to conflicts, new log rows may be lost.
This PR makes use of the [new db `$push` operation](https://github.com/payloadcms/payload/pull/13453)
we recently added to atomically push a new log row to the database in a
single round-trip.
This not only reduces the amount of db round trips (=> faster job queue
system) but allows multiple tasks to perform this db operation in
parallel, without conflicts.
## Problem
**Example:**
```ts
export const fastParallelTaskWorkflow: WorkflowConfig<'fastParallelTask'> = {
  slug: 'fastParallelTask',
  handler: async ({ inlineTask }) => {
    const taskFunctions = []
    for (let i = 0; i < 20; i++) {
      const idx = i + 1
      taskFunctions.push(async () => {
        return await inlineTask(`parallel task ${idx}`, {
          input: {
            test: idx,
          },
          task: () => {
            return {
              output: {
                taskID: idx.toString(),
              },
            }
          },
        })
      })
    }
    await Promise.all(taskFunctions.map((f) => f()))
  },
}
```
On SQLite, this would throw the following error:
```bash
Caught error Error: UNIQUE constraint failed: payload_jobs_log.id
at Object.next (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/libsql@0.4.7/node_modules/libsql/index.js:335:20)
at Statement.all (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/libsql@0.4.7/node_modules/libsql/index.js:360:16)
at executeStmt (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5/node_modules/@libsql/client/lib-cjs/sqlite3.js:285:34)
at Sqlite3Client.execute (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5/node_modules/@libsql/client/lib-cjs/sqlite3.js:101:16)
at /Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/libsql/session.ts:288:58
at LibSQLPreparedQuery.queryWithCache (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/sqlite-core/session.ts:79:18)
at LibSQLPreparedQuery.values (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/libsql/session.ts:286:21)
at LibSQLPreparedQuery.all (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/libsql/session.ts:214:27)
at QueryPromise.all (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/sqlite-core/query-builders/insert.ts:402:26)
at QueryPromise.execute (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/sqlite-core/query-builders/insert.ts:414:40)
at QueryPromise.then (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/query-promise.ts:31:15) {
rawCode: 1555,
code: 'SQLITE_CONSTRAINT_PRIMARYKEY',
libsqlError: true
}
```
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211001438499053
Conditionally send the user to the inactivity route if there was a user
but refresh failed. You can get into an infinite loop if you call this
function externally and a redirect is being used in the URL.
Follow up to #12119.
You can now configure the toast notifications used in the admin panel
through the Payload config:
```ts
import { buildConfig } from 'payload'
export default buildConfig({
// ...
admin: {
// ...
toast: {
duration: 8000,
limit: 1,
// ...
}
}
})
```
_Note: the toast config is temporarily labeled as experimental to allow
for changes to the API, if necessary._
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211139422639711
This PR adds **atomic** `$push` **support for array fields**. It makes
it possible to safely append new items to arrays, which is especially
useful when running tasks in parallel (like job queues) where multiple
processes might update the same record at the same time. By handling
pushes atomically, we avoid race conditions and keep data consistent -
especially on postgres, where the current implementation would nuke the
entire array table before re-inserting every single array item.
The feature works for both localized and unlocalized arrays, and
supports pushing either single or multiple items at once.
This PR is a requirement for reliably running parallel tasks in the job
queue - see https://github.com/payloadcms/payload/pull/13452.
Alongside documenting `$push`, this PR also adds documentation for
`$inc`.
## Changes to updatedAt behavior
https://github.com/payloadcms/payload/pull/13335 allows us to override
the updatedAt property instead of the db always setting it to the
current date.
However, we are not able to skip updating the updatedAt property
completely. This means usage of `$push` results in 2 postgres db calls:
1. set updatedAt in main row
2. append array row in arrays table
This PR changes the behavior to only automatically set updatedAt if it's
undefined. If you explicitly set it to `null`, this now allows you to
skip the db adapter automatically setting updatedAt.
=> This allows us to use $push in just one single db call
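A sketch of what that single-call update looks like (field and variable names are illustrative):
```ts
// Append a log entry without touching updatedAt, so the whole update
// can be a single db call.
await payload.db.updateOne({
  collection: 'payload-jobs',
  id: job.id,
  data: {
    updatedAt: null, // skip the automatic updatedAt write
    log: {
      $push: newLogEntry,
    },
  },
})
```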
## Usage Examples
### Pushing a single item to an array
```ts
import mongoose from 'mongoose'

// `post` is an existing document created earlier
const updatedPost = await payload.db.updateOne({
  data: {
    array: {
      $push: {
        text: 'some text 2',
        id: new mongoose.Types.ObjectId().toHexString(),
      },
    },
  },
  collection: 'posts',
  id: post.id,
})
```
### Pushing a single item to a localized array
```ts
const updatedPost = await payload.db.updateOne({
  data: {
    arrayLocalized: {
      $push: {
        en: {
          text: 'some text 2',
          id: new mongoose.Types.ObjectId().toHexString(),
        },
        es: {
          text: 'some text 2 es',
          id: new mongoose.Types.ObjectId().toHexString(),
        },
      },
    },
  },
  collection: 'posts',
  id: post.id,
})
```
### Pushing multiple items to an array
```ts
const updatedPost = await payload.db.updateOne({
  data: {
    array: {
      $push: [
        {
          text: 'some text 2',
          id: new mongoose.Types.ObjectId().toHexString(),
        },
        {
          text: 'some text 3',
          id: new mongoose.Types.ObjectId().toHexString(),
        },
      ],
    },
  },
  collection: 'posts',
  id: post.id,
})
```
### Pushing multiple items to a localized array
```ts
const updatedPost = await payload.db.updateOne({
  data: {
    arrayLocalized: {
      $push: {
        en: {
          text: 'some text 2',
          id: new mongoose.Types.ObjectId().toHexString(),
        },
        es: [
          {
            text: 'some text 2 es',
            id: new mongoose.Types.ObjectId().toHexString(),
          },
          {
            text: 'some text 3 es',
            id: new mongoose.Types.ObjectId().toHexString(),
          },
        ],
      },
    },
  },
  collection: 'posts',
  id: post.id,
})
```
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211110462564647
### What
When using the `Restore as draft` action from the version view, the
restored document incorrectly appears as `published` in the API. This
issue was reported by a client.
### Why
In the `restoreVersion` operation, the document is updated via
`db.updateOne` (line 265). The update passes along the status stored in
the version being restored but does not respect the `draft` query
parameter.
### How
Ensures that the result status is explicitly set to `draft` when the
`draft` argument is `true`.
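A sketch of the idea (names are illustrative, not the exact operation code):
```ts
// The `draft` argument wins over the status stored on the version being restored.
const resolveStatus = (
  draft: boolean | undefined,
  versionStatus: 'draft' | 'published',
): 'draft' | 'published' => (draft === true ? 'draft' : versionStatus)
```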
Fixes https://github.com/payloadcms/payload/issues/13433. Testing
release: `3.54.0-internal.90cf7d5`
Previously, when calling `getPayload`, you would always use the same,
cached payload instance within a single process, regardless of the
arguments passed to the `getPayload` function. This resulted in the
following issues - both are fixed by this PR:
- If, in your frontend you're calling `getPayload` without `cron: true`,
and you're hosting the Payload Admin Panel in the same process, crons
will not be enabled even if you visit the admin panel which calls
`getPayload` with `cron: true`. This will break jobs autorun depending
on which page you visit first - admin panel or frontend
- Within the same process, you are unable to use `getPayload` twice for
different instances of payload with different Payload Configs.
On Postgres, you can get around this by manually calling
`new BasePayload()`, which skips the cache. This did not work on mongoose
though, as mongoose was caching the models on a global singleton (this
PR addresses this).
In order to bust the cache for different Payload Config, this PR
introduces a new, optional `key` property to `getPayload`.
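For example (a sketch; the config import paths are illustrative):
```ts
import { getPayload } from 'payload'

import configA from './payload.config.a'
import configB from './payload.config.b'

// Two independent instances with different configs in the same process,
// each cached under its own key so they don't collide.
const payloadA = await getPayload({ config: configA, key: 'instance-a' })
const payloadB = await getPayload({ config: configB, key: 'instance-b' })
```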
## Mongoose - disable using global singleton
This PR refactors the Payload Mongoose adapter to stop relying on the
global mongoose singleton. Instead, each adapter instance now creates
and manages its own scoped Connection object.
### Motivation
Previously, calling `getPayload()` more than once in the same process
would throw `Cannot overwrite model` errors because models were compiled
into the global singleton. This prevented running multiple Payload
instances side-by-side, even when pointing at different databases.
### Changes
- Replace usage of `mongoose.connect()` / `mongoose.model()` with
instance-scoped `createConnection()` and `connection.model()`.
- Ensure models, globals, and versions are compiled per connection, not
globally.
- Added proper `close()` handling on `this.connection` instead of
`mongoose.disconnect()`.
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211114366468745
### What?
Brackets (`[ ]`) in option values end up in GraphQL enum names via
`formatName`, causing schema generation to fail. This PR adds a single
rule to `formatName`:
- replace `[` and `]` with `_`
### Why?
Using `_` (instead of removing the brackets) is safer and more
consistent:
- Avoid collisions: removal can merge distinct strings (`"A[B]"` →
`"AB"`). `_` keeps them distinct (`"A_B"`).
- **Consistency**: `formatName` already maps punctuation to `_` (`. - /
+ , ( ) '`). Brackets follow the same rule.
- **Readability**: `mb-[150px]` → `mb__150px_` is clearer than `mb150px`.
- **Digits/units safety**: removal can jam characters (`w-[2/3]` → `w23`); `_` avoids that (`w_2_3_`).
### How?
Update formatName to include a bracket replacement step:
```
.replace(/\[|\]/g, '_')
```
No other call sites or value semantics change; only names containing
brackets are affected.
Fixes #13466
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211141396953194
Fixes #13574.
When editing an autosave-enabled document within a document drawer, any
changes made while the autosave is processing are ultimately discarded
from local form state. This makes it difficult or even impossible to edit
fields.
This is because we server-render, then replace, the entire document on
every save. This includes form state, which is stale because it was
rendered while new changes were still being made. We don't need to
re-render the entire view on every save, though, only on create. We
don't do this on the top-level edit view, for example. Instead, we only
need to replace form state.
This change is also a performance improvement because we are no longer
rendering all components unnecessarily, especially on every autosave
interval.
Before:
https://github.com/user-attachments/assets/e9c221bf-4800-4153-af55-8b82e93b3c26
After:
https://github.com/user-attachments/assets/d77ef2f3-b98b-41d6-ba6c-b502b9bb99cc
Note: ignore the flashing autosave status and doc controls. This is
horrible and we're actively fixing it, but is outside the scope of this
PR.
---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
- https://app.asana.com/0/0/1211139422639700
Fixes https://github.com/payloadcms/payload/issues/13565
The logout operation was running twice and causing a race condition on
user updates. This change ensures the logout operation only runs once.
Really, this view should have one purpose: to show the inactivity view.
Currently it has two purposes, which is why it needs the useEffect at the
root of this issue. Instead, we should just call the `logOut` function
from the logout button rather than linking to a logout page.
- Packages are located in the `packages/` directory.
- The main Payload package is `packages/payload`. This contains the core functionality.
- Database adapters are in `packages/db-*`.
- The UI package is `packages/ui`.
- The Next.js integration is in `packages/next`.
- Rich text editor packages are in `packages/richtext-*`.
- Storage adapters are in `packages/storage-*`.
- Email adapters are in `packages/email-*`.
- Plugins which add additional functionality are in `packages/plugin-*`.
- Documentation is in the `docs/` directory.
- Monorepo tooling is in the `tools/` directory.
- Test suites and configs are in the `test/` directory.
- LLMS.txt is at URL: https://payloadcms.com/llms.txt
- LLMS-FULL.txt is at URL: https://payloadcms.com/llms-full.txt
## Dev environment tips
- Any package can be built using a `pnpm build:*` script defined in the root `package.json`. These typically follow the format `pnpm build:<directory_name>`. The options are all of the top-level directories inside the `packages/` directory. Ex `pnpm build:db-mongodb` which builds the `packages/db-mongodb` package.
- ALL packages can be built with `pnpm build:all`.
- Use `pnpm dev` to start the monorepo dev server. This loads the default config located at `test/_community/config.ts`.
- Specific dev configs for each package can be run with `pnpm dev <directory_name>`. The options are all of the top-level directories inside the `test/` directory. Ex `pnpm dev fields` which loads the `test/fields/config.ts` config. The directory name can either encompass a single area of functionality or be the name of a specific package.
## Testing instructions
- There are unit, integration, and e2e tests in the monorepo.
- Unit tests can be run with `pnpm test:unit`.
- Integration tests can be run with `pnpm test:int`. Individual test suites can be run with `pnpm test:int <directory_name>`, which will point at `test/<directory_name>/int.spec.ts`.
- E2E tests can be run with `pnpm test:e2e`.
- All tests can be run with `pnpm test`.
- Prefer running `pnpm test:int` for verifying local code changes.
## PR Guidelines
- This repository follows conventional commits for PR titles
- PR Title format: `<type>(<scope>): <title>`. Title must start with a lowercase letter.
- Valid types are build, chore, ci, docs, examples, feat, fix, perf, refactor, revert, style, templates, test
- Prefer `feat` for new features and `fix` for bug fixes.
- Scopes should be chosen based upon the package(s) being modified. If multiple packages are being modified, choose the most relevant one or no scope at all.
- Example PR titles:
  - `feat(db-mongodb): add support for transactions`
  - `feat(richtext-lexical): add options to hide block handles`
  - `fix(ui): json field type ignoring editorOptions`
## Commit Guidelines
- This repository follows conventional commits for commit messages
- The first commit of a branch should follow the PR title format: `<type>(<scope>): <title>`. Follow the same rules as PR titles.
- Subsequent commits should prefer `chore` commits without a scope unless a specific package is being modified.
- These will eventually be squashed into the first commit when merging the PR.
| Option | Description |
| ------ | ----------- |
| `avatar` | Set account profile picture. Options: `gravatar`, `default` or a custom React component. |
| `autoLogin` | Used to automate log-in for dev and demonstration convenience. [More details](../authentication/overview). |
| `autoRefresh` | Used to automatically refresh user tokens for users logged into the dashboard. [More details](../authentication/overview). |
| `components` | Component overrides that affect the entirety of the Admin Panel. [More details](../custom-components/overview). |
| `custom` | Any custom properties you wish to pass to the Admin Panel. |
| `dateFormat` | The date format that will be used for all dates within the Admin Panel. Any valid [date-fns](https://date-fns.org/) format pattern can be used. |
| `suppressHydrationWarning` | If set to `true`, suppresses React hydration mismatch warnings during the hydration of the root `<html>` tag. Defaults to `false`. |
| `theme` | Restrict the Admin Panel theme to use only one of your choice. Default is `all`. |
| `timezones` | Configure the timezone settings for the admin panel. [More details](#timezones) |
| `toast` | Customize the handling of toast messages within the Admin Panel. [More details](#toasts) |
| `user` | The `slug` of the Collection that you want to allow to login to the Admin Panel. [More details](#the-admin-user-collection). |
## Toast
The `admin.toast` configuration allows you to customize the handling of toast messages within the Admin Panel, such as increasing the duration they are displayed and limiting the number of visible toasts at once.
<Banner type="info">
**Note:** The Admin Panel currently uses the
[Sonner](https://sonner.emilkowal.ski) library for toast notifications.
</Banner>
The following options are available for the `admin.toast` configuration:
## Auto-Refresh
Turning this property on will allow users to stay logged in indefinitely while their browser is open and on the admin panel, by automatically refreshing their authentication token before it expires.
To enable auto-refresh for user tokens, set `autoRefresh: true` in the [Payload Config](../admin/overview#admin-options):
```ts
import { buildConfig } from 'payload'
export default buildConfig({
// ...
// highlight-start
admin: {
autoRefresh: true,
},
// highlight-end
})
```
## Operations
All auth-related operations are available via Payload's REST, Local, and GraphQL APIs. These operations are automatically added to your Collection when you enable Authentication. [More details](./operations).
Since the filtering happens at the root level of the application and its result is not calculated every time you navigate to a new page, you may want to call `router.refresh` in a custom component that watches when values that affect the result change. In the example above, you would want to do this when `supportedLocales` changes on the tenant document.
## Experimental Options
Experimental options are features that may not be fully stable and may change or be removed in future releases.
These options can be enabled in your Payload Config under the `experimental` key. You can set them like this:
```ts
import { buildConfig } from 'payload'
export default buildConfig({
// ...
experimental: {
localizeStatus: true,
},
})
```
The following experimental options are available related to localization:
| Option | Description |
| ------ | ----------- |
| **`localizeStatus`** | **Boolean.** When `true`, shows document status per locale in the admin panel instead of always showing the latest overall status. Opt-in for backwards compatibility. Defaults to `false`. |
## Field Localization
Payload Localization works on a **field** level—not a document level. In addition to configuring the base Payload Config to support Localization, you need to specify each field that you would like to localize.
The following options are available:
| Option | Description |
| ------ | ----------- |
| **`admin`** | The configuration options for the Admin Panel, including Custom Components, Live Preview, etc. [More details](../admin/overview#admin-options). |
| **`bin`** | Register custom bin scripts for Payload to execute. [More Details](#custom-bin-scripts). |
| **`editor`** | The Rich Text Editor which will be used by `richText` fields. [More details](../rich-text/overview). |
| **`experimental`** | Configure experimental features for Payload. These may be unstable and may change or be removed in future releases. [More details](../experimental). |
| **`db`** \* | The Database Adapter which will be used by Payload. [More details](../database/overview). |
| **`serverURL`** | A string used to define the absolute URL of your app. This includes the protocol, for example `https://example.com`. No paths allowed, only protocol, domain and (optionally) port. |
| **`collections`** | An array of Collections for Payload to manage. [More details](./collections). |
Experimental features allow you to try out new functionality before it becomes a stable part of Payload. These features may still be in active development, may have incomplete functionality, and can change or be removed in future releases without warning.
## How It Works
Experimental features are configured via the root-level `experimental` property in your [Payload Config](../configuration/overview). This property contains individual feature flags; each flag can be configured independently, allowing you to selectively opt into specific functionality.
| Option | Description |
| ------ | ----------- |
| **`localizeStatus`** | **Boolean.** When `true`, shows document status per locale in the admin panel instead of always showing the latest overall status. Opt-in for backwards compatibility. Defaults to `false`. |
This list may change without notice.
## When to Use Experimental Features
You might enable an experimental feature when:
- You want early access to new capabilities before their stable release.
- You can accept the risks of using potentially unstable functionality.
- You are testing new features in a development or staging environment.
- You wish to provide feedback to the Payload team on new functionality.
If you are working on a production application, carefully evaluate whether the benefits outweigh the risks. For most stable applications, it is recommended to wait until the feature is officially released.
<Banner type="success">
<strong>Tip:</strong> To stay up to date on experimental features or share
### Passing as JSON
When using `X-Payload-HTTP-Method-Override`, the request body is expected to be a query string. If you want to pass JSON instead, you can set the `Content-Type` to `application/json` and include the JSON body in the request.
#### Example
```ts
const res = await fetch(`${api}/${collectionSlug}/${id}`, {
// Only the findByID endpoint supports HTTP method overrides with JSON data
method: 'POST',
credentials: 'include',
headers: {
'Accept-Language': i18n.language,
'Content-Type': 'application/json',
'X-Payload-HTTP-Method-Override': 'GET',
},
body: JSON.stringify({
depth: 1,
locale: 'en',
}),
})
```
This can be more efficient for large JSON payloads, as you avoid converting data to and from query strings. However, only certain endpoints support this. Supported endpoints will read the parsed body under a `data` property, instead of reading from query parameters as with standard GET requests.
There are two main approaches to convert your Lexical-based rich text to HTML:
1. **Generate HTML on-demand (Recommended)**: Convert JSON to HTML wherever you need it, on-demand.
2. **Generate HTML within your Collection**: Create a new field that automatically converts your saved JSON content to HTML. This is not recommended because it adds overhead to the Payload API.
The `lexicalHTMLField()` helper converts JSON to HTML and saves it in a field that is updated every time you read it via an `afterRead` hook. It's generally not recommended, as it creates a column with duplicate content in another format.
Consider using the [on-demand HTML converter above](/docs/rich-text/converting-html#on-demand-recommended) or the [JSX converter](/docs/rich-text/converting-jsx) unless you have a good reason.