Commit Graph

325 Commits

Author SHA1 Message Date
Jarrod Flesch
2cafe494cc fix(ui): disabled and styles add row button correctly (#13140)
Disables add row button using disabled prop from useField - i.e. when
the form is processing or initializing.

This fixes a flaky array test that clicks the button before the form has
finished initializing/processing.

Also corrects the add row button color styles with increased selector specificity.
2025-07-11 14:07:51 -04:00
Germán Jabloñski
babcd599da fix(ui): save nested richtext inside inlineBlock (#12773)
Removing the `setTimeout` not only doesn't break any tests, but it also
fixes the linked issue.

The long comment above the if statement was added in
https://github.com/payloadcms/payload/pull/5460 and explains why the if
statement is necessary GIVEN the existence of the `setTimeout`, but the
`setTimeout` was introduced [earlier because the button apparently
didn't work](https://github.com/payloadcms/payload/issues/1414).

It seems to work now without the `setTimeout`, because otherwise the
tests wouldn't even pass. I also tested it manually, and it works fine.


Fixes #12687
2025-07-02 19:43:48 +00:00
Jacob Fletcher
3f30a2e300 fix(ui): block rows unexpectedly collapse and array rows not collapsed on init (#12987) 2025-06-30 21:12:26 -04:00
Alessio Gravili
4458f74cef ci: template errors not being caught due. fix: error due to updated generated-types User type (#12973)
This PR consists of two separate changes. One change cannot pass CI
without the other, so both are included in this single PR.


## CI - ensure types are generated

Our website template is currently failing to build due to a type error.
This error was introduced by a change in our generated types.

Our CI did not catch this issue because it wasn't generating types /
import map before attempting to build the templates. This PR updates the
CI to generate types first.

It also updates some CI step names for improved clarity.

## Fix: type error

![Screenshot 2025-06-29 at 12 53 49@2x](https://github.com/user-attachments/assets/962f1513-bc6c-4e12-9b74-9b891c49900b)


This fixes the type error by ensuring we consistently use the _same_
generated `TypedUser` object within payload, instead of `BaseUser`.
Previously, we sometimes used the generated-types user and sometimes the
base user, which was causing type conflicts depending on what the
generated user type was.

It also deprecates the `User` type (which was essentially just
`BaseUser`), as consumers should use `TypedUser` instead. `TypedUser`
will automatically fall back to `BaseUser` if no generated types exist,
but will accept a generated-types User when one is provided.

Without this change, additional properties added to the user via
generated-types may cause the user object to not be accepted by
functions that only accept a `User` instead of a `TypedUser`, which is
what failed here.
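To illustrate the fallback behavior, here is a hedged, simplified sketch of the idea (not Payload's actual source; the declaration-merging mechanism is assumed):

```ts
// Simplified sketch: when generated types supply a user shape, TypedUser
// resolves to it; otherwise it falls back to BaseUser.
type BaseUser = {
  email: string
  id: number | string
}

// Assumed to be populated by `payload generate:types` via declaration merging.
interface GeneratedTypes {}

type TypedUser = GeneratedTypes extends { user: infer U } ? U : BaseUser
```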

## Templates: re-generate templates to update generated types

---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
  - https://app.asana.com/0/0/1210668927737258
2025-06-29 14:27:50 -07:00
Jacob Fletcher
f2213e5c5c feat: mount live preview to document root (#12860)
Mounts live preview to `../:id` instead `../:id/preview`.

This is a huge win from both a UX and a maintainability standpoint.

Here are just a few of those wins:

1. If you edit a document, _then_ decide you want to preview those
changes, you are currently presented with the `LeaveWithoutSaving` modal
and are forced to either save your edits or clear them. This is because
you are being navigated to an entirely new page with its own form
context. Instead, you should be able to freely navigate back and forth
between the two.
2. If you are an editor who most often uses Live Preview, or you are
editing a collection that typically requires it, you likely want it to
automatically enter live preview mode when you open a document.
Currently, the user has to navigate to the document _first_, then use
the live preview tab. Instead, you should be able to set a preference
and avoid this extra step.
3. Since the inception of Live Preview, we've been maintaining largely
the same code across the default edit view and the live preview view,
which often became out of sync and inconsistent—but they're essentially
doing the same thing. While we could abstract a lot of this out, it is
no longer necessary if the two views are combined into one.

This change also includes some small modifications to the UI. The "Live
Preview" tab no longer exists, and instead has been replaced with a
button placed next to the document controls (subject to change).

Before:


https://github.com/user-attachments/assets/48518b02-87ba-4750-ba7b-b21b5c75240a

After:


https://github.com/user-attachments/assets/a8ec8657-a6d6-4ee1-b9a7-3c1173bcfa96
2025-06-27 11:58:00 -04:00
Jessica Rynkar
6f6d305f9d fix(ui): prevent error if rows is undefined in mergeServerFormState (#12962)
### What? 
Adds optional chaining when accessing `rows` in `mergeServerFormState`
to prevent error crashing the UI.

### Why? 
When an array field is populated in a `beforeChange` hook and was
previously empty, it crashes `mergeServerFormState.ts` on this line
because no `rows` exist:

```ts 
const indexInCurrentState = currentState[path].rows.findIndex
``` 

The line after this checks `if (indexInCurrentState > -1)` so returning
undefined here will not affect the subsequent code.

### How? 
Added optional chaining to the access of `rows`, which prevents the
error being thrown.
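A hedged sketch of the change (the surrounding variable names and the row-matching predicate are assumed, not the exact source):

```ts
// With optional chaining, findIndex is skipped when `rows` is undefined,
// so indexInCurrentState stays undefined and the `> -1` check below fails safely.
const indexInCurrentState = currentState[path]?.rows?.findIndex(
  (row) => row.id === incomingRow.id,
)

if (indexInCurrentState > -1) {
  // ... merge the existing row state
}
```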

Fixes #12944
2025-06-27 15:57:48 +00:00
Jessica Rynkar
37c945b95b fix(ui): custom row labels on arrays should not be removed on field duplication (#12895)
### What?
This fix prevents custom row labels being removed when duplicating array
items.

### Why?
Currently, when you have an array with custom row labels, if you create
a new array item by duplicating an existing item, the new item will have
no custom row label until you refresh the page.

### How?
During the `duplicate` process, we remove any React components from the
field state. This change intentionally re-adds the `RowLabel` if one
exists.

#### Reported by client
2025-06-25 09:44:00 -04:00
Anatoly Kopyl
c03e9c1724 fix(ui): properly differentiate between DOM events and raw values in setValue (#12892)
Because of this check, if a JSON value with a `target` property was saved, it
would become malformed.

For example trying to save a JSON field:

```json
{
  "target": {
    "value": {
      "foo": "bar"
    }
  }
}
```

would result in:

```json
{
  "foo": "bar"
}
```

And trying to save:

```json
{
  "target": "foo"
}
```

would just not save anything:

```json
null
```

I went through all of the field types and did not find a single one that
relies on this ternary. It seems like it always defaulted to `const val
= e`, except in the unexpected case described previously.
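For illustration, a hedged sketch of the kind of ternary at fault (simplified; not the exact source):

```ts
// Simplified: any value with a `target` property was treated as a DOM event,
// so its real payload was unwrapped (or dropped) before being stored.
const val = e && e.target ? e.target.value : e
```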

Fixes #12873

Added test may be overkill, will remove if so.




---
- To see the specific tasks where the Asana app for GitHub is being
used, see below:
  - https://app.asana.com/0/0/1210628466702813

---------

Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
2025-06-24 14:30:52 -04:00
Jacob Fletcher
f75d62c79b feat: select field filter options (#12487)
It is a common pattern to dynamically show and validate a select field's
options based on various criteria such as the current user or underlying
document.

Some examples of this might include:
- Restricting options based on a user's role, e.g. admin-only options
- Displaying different options based on the value of another field, e.g.
a city/state selector
 
While this is already possible to do with a custom `validate` function,
the user can still view and select the forbidden option...unless you
_also_ wire up a custom component.

Now, you can define `filterOptions` on select fields.

This behaves similarly to the existing `filterOptions` property on
relationship and upload fields, except the return value of this function
is simply an array of options, not a query constraint. The result of
this function will determine what is shown to the user and what is
validated on the server.

Here's an example:

```ts
{
  name: 'select',
  type: 'select',
  options: [
    {
      label: 'One',
      value: 'one',
    },
    {
      label: 'Two',
      value: 'two',
    },
    {
      label: 'Three',
      value: 'three',
    },
  ],
  filterOptions: ({ options, data }) =>
    data.disallowOption1
      ? options.filter(
          (option) => (typeof option === 'string' ? option : option.value) !== 'one',
        )
      : options,
}
```
2025-05-22 15:54:12 -04:00
Jarrod Flesch
00667faf8d feat: folders (#10030) 2025-05-22 10:04:45 -04:00
Germán Jabloñski
2a929cf385 chore: fix all lint errors and add mechanisms to prevent them from appearing again (#12401)
I think it's easier to review this PR commit by commit, so I'll explain
it this way:

## Commits
1. [parallelize eslint script (still showing logs results in
serial)](c9ac49c12d):
Previously, `--concurrency 1` was added to the script to make the logs
more readable. However, turborepo has an option specifically for these
use cases: `--log-order=grouped` runs the tasks in parallel but outputs
them serially. As a result, the lint script is now significantly faster.
2. [run pnpm
lint:fix](9c128c276a)
The auto-fix was run, which resolved some eslint errors that were
slipped in due to the use of `no-verify`. Most of these were
`perfectionist` fixes (property ordering) and the removal of unnecessary
assertions. Starting with this PR, this won't happen again in the
future, as we'll be verifying the linter in every PR across the entire
codebase (see commit 7).
3. [fix eslint non-autofixable
errors](700f412a33)
All manual errors have been resolved except for the configuration errors
addressed in commit 5. Most were React compiler violations, which have
been disabled and marked with a "TODO" comment for now. There's also an unused
`use no memo` and a couple of `require` errors.
4. [move react-compiler linter to eslint-config
package](4f7cb4d63a)
To simplify the eslint configuration. My concern was that there would be
a performance regression when used in non-react related packages, but
none was experienced. This is probably because it only runs on .tsx
files.
5. [remove redundant eslint config files and fix
allowDefaultProject](a94347995a)
The main feature introduced by `typescript-eslint` v8 was
`projectService`, which automatically searches each file for the closest
`tsconfig`, greatly simplifying configuration in monorepos
([source](https://typescript-eslint.io/blog/announcing-typescript-eslint-v8#project-service)).
Once I moved `projectService` to `packages/eslint-config`, all the other
configuration files could be easily removed.
I confirmed that pnpm lint still works on individual packages.
The other important change was that the pending eslint errors from
commits 2 and 3 were resolved. That is, some files were giving the
error: "[File] was not found by the project service. Consider either
including it in the tsconfig.json or including it in
allowDefaultProject." Below I copy the explanatory comment I left in the
code:
```ts
// This is necessary because `tsconfig.base.json` defines `"rootDir": "${configDir}/src"`,
// And the following files aren't in src because they aren't transpiled.
// This is typescript-eslint's way of adding files that aren't included in tsconfig.
// See: https://typescript-eslint.io/troubleshooting/typed-linting/#i-get-errors-telling-me--was-not-found-by-the-project-service-consider-either-including-it-in-the-tsconfigjson-or-including-it-in-allowdefaultproject
// The best practice is to have a tsconfig.json that covers ALL files and is used for
// typechecking (with noEmit), and a `tsconfig.build.json` that is used for the build
// (or alternatively, swc, tsup or tsdown). That's what we should ideally do, in which case
// this hardcoded list wouldn't be necessary. Note that these files don't currently go
// through ts, only through eslint.
```

6. [Differentiate errors from warnings in VScode ESLint
Rules](5914d2f48d)
There's no reason to do that. If an eslint rule isn't an error, it
should be disabled or converted to a warning.
7. [Disable skip lint, and lint over the entire repo now that it's
faster](e4b28f1360)
The GitHub action linted only the files that had changed in the PR.
While this seems like a good idea, once exceptions were introduced with
[skip lint], they opened the door to propagating more and more errors.
Often, the linter was skipped, not because someone introduced new
errors, but because they were trying to avoid those that had already
crept in, sometimes accidentally introducing new ones.
On the other hand, `pnpm lint` now runs in parallel (commit 1), so it's
not that slow. Additionally, it runs in parallel with other GitHub
actions like e2e tests, which take much longer, so it isn't a bottleneck in CI.
8. [fix lint in next
package](4506595f91)
Small fix missing from commit 5
9. [Merge remote-tracking branch 'origin/main' into
fix-eslint](563d4909c1)
10. [add again eslint.config.js in payload
package](78f6ffcae7)
The comment in the code explains it. Basically, after the merge from
main, the payload package runs out of memory when linting, probably
because it grew in recent PRs. That package will sooner or later
overwhelm our tooling, so we may have to split it. It's already too
big.

## Future Actions
- Resolve React compiler violations, as mentioned in commit 3.
- Decouple the `tsconfig` used for typechecking and build across the
entire monorepo (as explained in point 5) to ensure ts coverage even for
files that aren't transpiled (such as scripts).
- Remove the few remaining `eslint.config.js`. I had to leave the
`richtext-lexical` and `next` ones for now. They could be moved to the
root config and scoped to their packages, as we do for example with
`templates/vercel-postgres/**`. However, I couldn't get it to work, I
don't know why.
- Make eslint in the test folder usable. Not only are we not linting
`test` in CI, but now the `pnpm eslint .` command is so large that my
computer freezes. If each suite were its own package, this would be
solved, and dynamic codegen + git hooks to modify tsconfig.base.json
wouldn't be necessary
([related](https://github.com/payloadcms/payload/pull/11984)).
2025-05-19 12:36:40 -03:00
Paul
e258cd73ef feat: allow group fields to have an optional name (#12318)
Adds the ability to completely omit `name` from group fields so that
they're entirely presentational.

New config:
```ts
import type { CollectionConfig } from 'payload'

export const ExampleCollection: CollectionConfig = {
  slug: 'posts',
  fields: [
    {
      label: 'Page header',
      type: 'group', // required
      fields: [
        {
          name: 'title',
          type: 'text',
          required: true,
        },
      ],
    },
  ],
}
```

will create
<img width="332" alt="image"
src="https://github.com/user-attachments/assets/10b4315e-92d6-439e-82dd-7c815a844035"
/>


but the data response will still be

```json
{
    "createdAt": "2025-05-05T13:42:20.326Z",
    "updatedAt": "2025-05-05T13:42:20.326Z",
    "title": "example post",
    "id": "6818c03ce92b7f92be1540f0"
}
```

Checklist:
- [x] Added int tests
- [x] Modify mongo, drizzle and graphql packages
- [x] Add type tests
- [x] Add e2e tests
2025-05-14 23:45:34 +00:00
Jessica Rynkar
d9c0c43154 fix(ui): passes value to server component args (#12352)
### What?
Allows the field value (if defined) to be accessed from `args` with
custom server components.

### Why?
Documentation states that the user can access `args.value` to get the
value of the field at time of render (if a value is defined) when using
a custom server component - however this isn't currently set up.

<img width="469" alt="Screenshot 2025-05-08 at 4 51 30 PM"
src="https://github.com/user-attachments/assets/9c167f80-5c5e-4fea-a31c-166281d9f7db"
/>

Link to docs
[here](https://payloadcms.com/docs/fields/overview#default-props).

### How?
Passes the value from `data` if it exists (does not exist for all field
types) and adds `value` to the server component types as an optional
property.

Fixes #10389
2025-05-13 11:13:23 +01:00
Philipp Schneider
a62cdc89d8 fix(ui): blockType ignored when merging server form state (#12207)
In this case, the `blockType` property is created on the server, but -
prior to this fix - was discarded on the client in
[`fieldReducer.ts`](https://github.com/payloadcms/payload/blob/main/packages/ui/src/forms/Form/fieldReducer.ts#L186-L198)
via
[`mergeServerFormState.ts`](b9832f40e4/packages/ui/src/forms/Form/mergeServerFormState.ts (L29-L31)),
because the field's path neither existed in the client's form state, nor
was it marked as `addedByServer`.

This caused later calls to POST requests to form state to send without
the `blockType` key for block rows, which in turn caused
`addFieldStatePromise.ts` to throw the following error:

```
Block with type "undefined" was found in block data, but no block with that type is defined in the config for field with schema path ${schemaPath}.
```

This prevented the client side form state update from completing, and if
the form state was saved, broke the document.

This is a follow-up to #12131, which treated the symptom, but not the
cause. The original issue seems to have been introduced in
https://github.com/payloadcms/payload/releases/tag/v3.34.0. It's unclear
to me whether this issue is connected to block E2E tests having been
disabled in the same release in
https://github.com/payloadcms/payload/pull/11988.

## How to reproduce

### Collection configuration

```ts
import type { Block, CollectionConfig } from 'payload'
import { lexicalEditor } from '@payloadcms/richtext-lexical'

const RICH_TEXT_BLOCK_TYPE = 'richTextBlockType'

const RichTextBlock: Block = {
  slug: RICH_TEXT_BLOCK_TYPE,
  interfaceName: 'RichTextBlock',
  fields: [
    {
      name: 'richTextBlockField',
      label: 'Rich Text Field in Block Field',
      type: 'richText',
      editor: lexicalEditor({}),
      required: true,
    },
  ],
}

const MyCollection: CollectionConfig = {
  slug: 'my-collection-slug',
  fields: [
    {
      name: 'arrayField',
      label: 'Array Field',
      type: 'array',
      fields: [
        {
          name: 'blockField',
          type: 'blocks',
          blocks: [RichTextBlock],
          required: true,
        },
      ],
    },
  ]
}

export default MyCollection
```

### Steps

- Press "Add Array Field"
   -->  1st block with rich text is added
- Press "Add Array Field" a 2nd time

### Result
- 🛑 2nd block is indefinitely in loading state (side-note: the form UI
should preferably explicitly indicate the error).
- 🛑 If saving the document, it is corrupted and will only show a blank
page (also not indicating any error).

Client side:

<img width="1268" alt="Untitled"
src="https://github.com/user-attachments/assets/4b32fdeb-af76-41e2-9181-d2dbd686618a"
/>

API error:

<img width="1272" alt="image"
src="https://github.com/user-attachments/assets/35dc65f7-88ac-4397-b8d4-353bcf6a4bfd"
/>

Client side, when saving and re-opening the document (the API error of `GET
/admin/collections/${myCollection}/${documentId}` is the same; arguably
the HTTP response status code shouldn't be `200`):

<img width="1281" alt="image"
src="https://github.com/user-attachments/assets/2e916eb5-6f10-4e82-9b84-1dc41db21d47"
/>

### Result after fix
- `blockType` is sent from the client to the server.
-  2nd block with rich text is added.
-  Document does not break when saving & re-opening.

<img width="1277" alt="Untitled"
src="https://github.com/user-attachments/assets/84d0c88b-64b2-48c4-864d-610d524ac8fc"
/>

---------

Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
2025-05-02 10:18:11 -04:00
Patrik
c877b1ad43 feat: threads operation through field condition function (#12132)
This PR updates the field `condition` function property to include a new
`operation` argument.

The `operation` arg provides a string relating to which operation the
field type is currently executing within.

#### Changes:

- Added `operation: Operation` in the Condition type.
- Updated relevant condition checks to ensure correct parameter usage.
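A hedged sketch of how the new argument could be used (assuming `operation` is exposed on the condition's third context argument):

```ts
{
  name: 'internalNotes',
  type: 'textarea',
  admin: {
    // Only show this field while editing an existing document
    condition: (data, siblingData, { operation }) => operation === 'update',
  },
}
```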
2025-04-16 15:38:53 -04:00
Philipp Schneider
4426625b83 perf(ui): prevent blockType: "$undefined" from being sent through the network (#12131)
Removes `$undefined` strings from being sent through the network when
sending form state requests. When adding new array rows, we assign
`blockType: undefined` which is stringified to `"$undefined"`. This is
unnecessary, as simply not sending this property is equivalent, and this
is only a requirement for blocks. This change will save on request size,
albeit minimal.

| Before | After |
|--|--|
|<img width="1267" alt="Untitled"
src="https://github.com/user-attachments/assets/699f38bd-7db9-4a52-931d-084b8af8530f"
/> | <img width="1285" alt="image"
src="https://github.com/user-attachments/assets/986ecd4c-f22d-4143-ad38-0c5f52439c67"
/> |
2025-04-16 15:03:35 -04:00
Jacob Fletcher
21599b87f5 fix(ui): stale paths on custom components within rows (#11973)
When server rendering custom components within form state, those
components receive a path that is correct at render time, but
potentially stale after manipulating array and blocks rows. This causes
the field to briefly render incorrect values while the form state
request is in flight.

The reason for this is that paths are passed as a prop statically into
those components. Then when we manipulate rows, form state is modified,
potentially changing field paths. The component's `path` prop, however,
hasn't changed. This means it temporarily points to the wrong field in
form state, rendering the data of another row until the server responds
with a freshly rendered component.

This is not an issue with default Payload fields as they are rendered on
the client and can be passed dynamic props.

This is only an issue within custom server components, including rich
text fields which are treated as custom components. Since they are
rendered on the server and passed to the client, props are inaccessible
after render.

The fix for this is to provide paths dynamically through context. This
way as we make changes to form state, there is a mechanism in which
server components can receive the updated path without waiting on its
props to update.
2025-04-15 15:23:51 -04:00
Patrik
112e081d8f fix(ui): ensure file field is only serialized at top-level for upload-enabled collections (#12074)
This fixes an issue where fields with the name `file` were being
serialized as a top-level field in multipart form data even when the
collection was not upload-enabled. This caused the value of `file` (when
used as a regular field like a text, array, etc.) to be stripped from
the `_payload`.

- Updated `createFormData` to only delete `data.file` and serialize it
at the top level if `docConfig.upload` is defined.
- This prevents unintended loss of `file` field values for non-upload
collections.

The `file` field now remains safely nested in `_payload` unless it's
part of an upload-enabled collection.
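A hedged sketch of the guard described above (names and shapes assumed; not the actual `createFormData` internals):

```ts
type DocConfig = { upload?: unknown }

const buildFormData = (docConfig: DocConfig, data: Record<string, unknown>): FormData => {
  const formData = new FormData()
  const payloadData = { ...data }

  // Only lift `file` to the multipart top level for upload-enabled collections
  if (docConfig.upload && payloadData.file instanceof Blob) {
    formData.append('file', payloadData.file)
    delete payloadData.file
  }

  formData.append('_payload', JSON.stringify(payloadData))
  return formData
}
```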
2025-04-10 17:37:10 +00:00
Jacob Fletcher
4d7c1d45fa fix(ui): form state race conditions (#12026)
Fixes form state race conditions. Modifying state while a request is in
flight or while the response is being processed could result in those
changes being overridden.

This was happening for a few reasons:

1. Our merge logic was incorrect. We were disregarding local changes to
state that may have occurred while form state requests are pending. This
was because we were iterating over local state, then while building up
new state, we were ignoring any fields that did not exist in the server
response, like this:
    
    ```ts
    for (const [path, newFieldState] of Object.entries(existingState)) {
    
      if (!incomingState[path]) {
        continue
      }
      
      // ...
    }
    ```

To fix this, we need to use local state as the source of truth. Then
when the server state arrives, we need to iterate over _that_. If a
field matches in local state, merge in any new properties. This will
ensure all changes to the underlying state are preserved, including any
potential addition or deletions.
    
However, this logic breaks down if the server might have created _new_
fields, like when populating array rows. This means they, too, would be
ignored. To get around this, there is a new `addedByServer` property
that flags new fields to ensure they are kept.
    
    This new merge strategy also saves an additional loop over form state (see the sketch after this list).
    
1. We were merging form state based on a mutable ref. This meant that
changes made within one action cause concurrent actions to have dirty
reads. The fix for this is to merge in an isolated manner by copying
state. This will remove any object references. It is generally not good
practice to mutate state without setting it, anyways, as this causes
mismatches between what is rendered and what is in memory.
    
1. We were merging server form state directly within an effect, then
replacing state entirely. This meant that if another action took place
at the exact moment in time _after_ merge but _before_ dispatch, the
results of that other action would be completely overridden. The fix for
this is to perform the merge within the reducer itself. This will ensure
that we are working with a trustworthy snapshot of state at the exact
moment in time that the action was invoked, and that React can properly
queue the event within its lifecycle.
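Returning to the merge logic in point 1, here is a hedged sketch of the direction change (types and property names simplified; not the actual merge source):

```ts
type FieldState = { addedByServer?: boolean; errorMessage?: string; valid?: boolean; value?: unknown }
type FormStateMap = Record<string, FieldState>

const mergeServerState = (local: FormStateMap, incoming: FormStateMap): FormStateMap => {
  // Local state is the source of truth; start from a copy of it
  const merged: FormStateMap = { ...local }

  for (const [path, serverField] of Object.entries(incoming)) {
    if (merged[path]) {
      // Field still exists locally: layer server-computed props over local changes
      merged[path] = { ...merged[path], errorMessage: serverField.errorMessage, valid: serverField.valid }
    } else if (serverField.addedByServer) {
      // Field was created by the server (e.g. populated array rows): keep it
      merged[path] = serverField
    }
    // Fields removed locally while the request was in flight stay removed
  }

  return merged
}
```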
2025-04-10 12:11:54 -04:00
Jacob Fletcher
e87521a376 perf(ui): significantly optimize form state component rendering, up to 96% smaller and 75% faster (#11946)
Significantly optimizes the component rendering strategy within the form
state endpoint by precisely rendering only the fields that require it.
This cuts down on server processing and network response sizes when
invoking form state requests **that manipulate array and block rows
which contain server components**, such as rich text fields, custom row
labels, etc. (results listed below).

Here's a breakdown of the issue:

Previously, when manipulating array and block fields, _all_ rows would
render any server components that might exist within them, including
rich text fields. This means that subsequent changes to these fields
would potentially _re-render_ those same components even if they don't
require it.

For example, if you have an array field with a rich text field within
it, adding the first row would cause the rich text field to render,
which is expected. However, when you add a second row, the rich text
field within the first row would render again unnecessarily along with
the new row.

This is especially noticeable for fields with many rows, where every
single row processes its server components and returns RSC data. And
this does not only affect nested rich text fields, but any custom
component defined on the field level, as these are handled in the same
way.

The reason this was necessary in the first place was to ensure that the
server components receive the proper data when they are rendered, such
as the row index and the row's data. Changing one of these rows could
cause the server component to receive the wrong data if it was not
freshly rendered.

While this is still a requirement that rows receive up-to-date props, it
is no longer necessary to render everything.

Here's a breakdown of the actual fix:

This change ensures that only the fields that are actually being
manipulated will be rendered, rather than all rows. The existing rows
will remain in memory on the client, while the newly rendered components
will return from the server. For example, if you add a new row to an
array field, only the new row will render its server components.

To do this, we send the path of the field that is being manipulated to
the server. The server can then use this path to determine for itself
which fields have already been rendered and which ones need required
rendering.

## Results

The following results were gathered by booting up the `form-state` test
suite and seeding 100 array rows, each containing a rich text field. To
invoke a form state request, we navigate to a document within the
"posts" collection, then add a new array row to the list. The result is
then saved to the file system for comparison.

| Test Suite | Collection | Number of Rows | Before | After | Percentage Change |
|------|------|---------|--------|--------|--------|
| `form-state` | `posts` | 101 | 1.9MB / 266ms | 80KB / 70ms | ~96% smaller / ~75% faster |

---------

Co-authored-by: James <james@trbl.design>
Co-authored-by: Alessio Gravili <alessio@gravili.de>
2025-04-03 12:27:14 -04:00
Jacob Fletcher
8880d705e3 fix(ui): optimistic rows disappear while form state requests are pending (#11961)
When manipulating array and blocks rows on slow networks, rows can
sometimes disappear and then reappear as requests in the queue arrive.

Consider this scenario:

1. You add a row to form state: this pushes the row in local state
optimistically then triggers a long-running form state request
containing a single row
2. You add another row to form state: this pushes a second row into
local state optimistically then triggers another long-running form state
request containing two rows
3. The first form state request returns with a single row in the
response and replaces local state (which contained two rows)
4. AT THIS MOMENT IN TIME, THE SECOND ROW DISAPPEARS
5. The second form state request returns with two rows in the response
and replaces local state
6. THE UI IS NO LONGER STALE AND BOTH ROWS APPEAR AS EXPECTED

The same issue applies when deleting, moving, and duplicating rows.
Local state becomes out of sync with the form state response and is
ultimately overridden.

The issue is that when we merge the result from form state, we do not
traverse the rows themselves, and instead take the rows in their
entirety. This means that we lose local row state. Instead, we need to
compare the results with what is saved to local state and intelligently
merge them.
2025-04-03 12:23:14 -04:00
Jacob Fletcher
373f6d1032 fix(ui): nested fields disappear when manipulating rows in form state (#11906)
Continuation of #11867. When rendering custom fields nested within
arrays or blocks, such as the Lexical rich text editor which is treated
as a custom field, these fields will sometimes disappear when form state
requests are invoked sequentially. This is especially reproducible on
slow networks.

This is different from the previous PR in that this issue is caused by
adding _rows_ back-to-back, whereas the previous issue was caused when
adding a single row followed by a change to another field.

Here's a screen recording demonstrating the issue:


https://github.com/user-attachments/assets/5ecfa9ec-b747-49ed-8618-df282e64519d

The problem is that `requiresRender` is never sent in the form state
request for row 2. This is because the [task
queue](https://github.com/payloadcms/payload/pull/11579) processes tasks
within a single `useEffect`. This forces React to batch the results of
these tasks into a single rendering cycle. So if request 1 sets state
that request 2 relies on, request 2 will never use that state since
they'll execute within the same lifecycle.

Here's a play-by-play of the current behavior:

1. The "add row" event is dispatched
    a. This sets `requiresRender: true` in form state
1. A form state request is sent with `requiresRender: true`
1. While that request is processing, another "add row" event is
dispatched
    a. This sets `requiresRender: true` in form state
    b. This adds a form state request into the queue
1. The initial form state request finishes
    a. This sets `requiresRender: false` in form state
1. The next form state request that was queued up in 3b is sent with
`requiresRender: false`
    a. THIS IS EXPECTED, BUT SHOULD ACTUALLY BE `true`!!

To fix this this, we need to ensure that the `requiresRender` property
is persisted into the second request instead of overridden. To do this,
we can add a new `serverPropsToIgnore` to form state which is read when
the processing results from the server. So if `requiresRender` exists in
`serverPropsToIgnore`, we do not merge it. This works because we
actually mutate form state in between requests. So request 2 can read
the results from request 1 without going through an additional rendering
cycle.

Here's a play-by-play of the fix:

1. The "add row" event is dispatched
    a. This sets `requiresRender: true` in form state
b. This adds a task in the queue to mutate form state with
`requiresRender: true`
1. A form state request is sent with `requiresRender: true`
1. While that request is processing, another "add row" event is
dispatched
a. This sets `requiresRender: true` in form state AND
`serverPropsToIgnore: [ "requiresRender" ]`
    c. This adds a form state request into the queue
1. The initial form state request finishes
a. This returns `requiresRender: false` from the form state endpoint BUT
IS IGNORED
1. The next form state request that was queued up in 3c is sent with
`requiresRender: true`
2025-04-01 09:54:22 -04:00
Jacob Fletcher
10ac9893ad fix(ui): nested custom components sometimes disappear when queued in form state (#11867)
When rendering custom fields nested within arrays or blocks, such as the
Lexical rich text editor which is treated as a custom field, these
fields will sometimes disappear when form state requests are invoked
sequentially. This is especially reproducible on slow networks.

This is because form state invocations are placed into a [task
queue](https://github.com/payloadcms/payload/pull/11579) which aborts
the currently running tasks when a new one arrives. By doing this, local
form state is never dispatched, and the second task in the queue becomes
stale.

The fix is to _not_ abort the currently running task. This will trigger
a complete rendering cycle, and when the second task is invoked, local
state will be up to date.

Fixes #11340, #11425, and #11824.
2025-03-25 20:40:16 -04:00
Jacob Fletcher
7532c4ab66 fix(ui): exclude fields lacking permissions from bulk edit (#11776)
Top-level fields that lack read or update permissions still appear as
options in the field selector within the bulk edit drawer.
2025-03-21 14:44:49 -04:00
Jacob Fletcher
31211e9755 feat: pass i18n through field label and description functions (#11802)
Passes the `i18n` arg through field label and description functions.
This is to avoid using custom components when simply needing to
translate a `StaticLabel` object, such as collection labels.

Here's an example:

```ts
import type { CollectionConfig } from 'payload'
import { getTranslation } from '@payloadcms/translations'

const labels = {
  singular: {
    en: 'My Collection',
  },
}

const MyCollection: CollectionConfig = {
  slug: 'my-collection',
  labels,
  fields: [
    // ...
    {
      type: 'collapsible',
      label: ({ i18n }) => `Translate this: ${getTranslation(labels.singular, i18n)}`,
      fields: [
        // ...
      ],
    },
  ],
}
```
2025-03-20 13:43:17 -04:00
Jacob Fletcher
b5fc8c6573 fix(ui): bulk edit subfields (#10035)
Fixes #10019. When bulk editing subfields, such as a field within a
group, changes are not persisted to the database. Not only this, but
nested fields with the same name as another selected field are
controlled by the same input. E.g. typing into one field changes the
value of both.

The root problem is that field paths are incorrect.

When opening the bulk edit drawer, fields are flattened into options for
the field selector. This is so that fields in a tab, for example, aren't
hidden behind their tab when bulk editing. The problem is that
`RenderFields` is not set up to receive pre-determined field paths. It
attempts to build up its own field paths, but are never correct because
`getFieldPaths` receives the wrong arguments.

The fix is to just render the top-level fields directly, bypassing
`RenderFields` altogether.

Fields with subfields will still recurse through this function, but at
the top-level, fields can be sent directly to `RenderField` (singular)
since their paths have already been formatted in the flattening
step.
2025-03-19 21:01:59 +00:00
Jacob Fletcher
d8bfb227b7 perf(ui): implements select in bulk edit (#11708)
Bulk edit can now request a partial form state thanks to #11689. This
means that we only need to build form state (and send it through the
network) for the currently selected fields, as opposed to the entire
field schema.

Not only this, but there is no longer a need to filter out unselected
fields before submitting the form, as the form state will only ever
include the currently selected fields. This is unnecessary processing
and causes an excessive amount of rendering, especially since we were
dispatching actions within a `for` loop to remove each field. React may
have batched these updates, but it is bad practice regardless.

Related: stripping unselected fields was also error prone. This is
because the `overrides` function we were using to do this receives
`FormState` (shallow) as an argument, but was being treated as `Data`
(not shallow, what the create and update operations expect).

E.g. `{ myGroup.myTitle: { value: 'myValue' }}` → `{ myGroup: { myTitle:
'myValue' }}`.
 
This led to the `sanitizeUnselectedFields` function improperly
formatting data sent to the server and would throw an API error upon
submission. This is only evident when sanitizing nested fields. Instead
of converting this data _again_, the select API takes care of this by
ensuring only selected fields exist in form state.

Related: bulk upload was not hitting form state on change. This means
that no field-level validation was occurring as the user typed.
2025-03-17 23:06:58 -04:00
Jacob Fletcher
0b1a1b585b fix(ui): processing and initializing form does not disable standalone fields (#11714)
The form component's `initializing` and `processing` states do not
disable fields that are rendered outside of `DocumentFields`. Fields
currently rely on the `readOnly` prop provided by `DocumentFields` and
do not subscribe to these states for themselves. This means that fields
that are rendered outright, such as within the bulk edit drawer, do
not receive a `readOnly` prop and are therefore never disabled.

The fix is to add a `disabled` property to the `useField` hook. This
subscribes to the `initializing` and `processing` states in the same way
as `DocumentFields`; however, now each field can determine its own
disabled state instead of relying solely on the `readOnly` prop. Adding
this new prop has no overhead, as `processing` and `initializing` are
already being subscribed to within `useField`.
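A hedged sketch of a custom client field consuming the new `disabled` value (component shape simplified):

```tsx
'use client'
import { useField } from '@payloadcms/ui'
import React from 'react'

export const MyCustomField: React.FC<{ path: string; readOnly?: boolean }> = ({ path, readOnly }) => {
  const { disabled, setValue, value } = useField<string>({ path })

  return (
    <input
      // Disabled while the form is initializing/processing, or when explicitly read-only
      disabled={disabled || readOnly}
      onChange={(e) => setValue(e.target.value)}
      value={value ?? ''}
    />
  )
}
```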
2025-03-17 10:27:21 -04:00
Jacob Fletcher
9ea8a7acf0 feat: form state select (#11689)
Implements a select-like API into the form state endpoint. This follows
the same spec as the Select API on existing Payload operations, but
works on form state rather than at the db level. This means you can send
the `select` argument through the form state handler, and it will only
process and return the fields you've explicitly identified.

This is especially useful when you only need to generate a partial form
state, for example within the bulk edit form where you select only a
subset of fields to edit. There is no need to iterate all fields of the
schema, generate default values for each, and return them all through
the network. This will also simplify and reduce the amount of
client-side processing required, where we no longer need to strip
unselected fields before submission.
2025-03-14 13:11:12 -04:00
Paul
878dc54579 feat: added support for conditional tabs (#8720)
Adds support for conditional tabs.

You can now add a `condition` function like other fields to each
individual tab's admin config like so:

```ts
{
  name: 'contentTab',
  admin: {
    condition: (data) => Boolean(data?.enableTab)
  }
}
```

This will toggle the individual tab's visibility in the document listing

### Example


https://github.com/user-attachments/assets/45cf9cfd-eaed-4dfe-8a32-1992385fd05c

This is an updated PR from
https://github.com/payloadcms/payload/pull/8406 thanks to @willviles

---------

Co-authored-by: Will Viles <will@willviles.com>
Co-authored-by: Jarrod Flesch <jarrodmflesch@gmail.com>
2025-03-13 13:32:53 +00:00
Jacob Fletcher
355bd12c61 chore: infer React context providers and prefer use (#11669)
As of [React 19](https://react.dev/blog/2024/12/05/react-19), context
providers no longer require the `<MyContext.Provider>` syntax and can be
rendered as `<MyContext>` directly. This will be deprecated in future
versions of React, which is now being caught by the
[`@eslint-react/no-context-provider`](https://eslint-react.xyz/docs/rules/no-context-provider)
ESLint rule.

Similarly, the [`use`](https://react.dev/reference/react/use) API is now
preferred over `useContext` because it is more flexible; for example, it
can be called within loops and conditional statements. See the
[`@eslint-react/no-use-context`](https://eslint-react.xyz/docs/rules/no-use-context)
ESLint rule for more details.
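A hedged sketch of both patterns side by side:

```tsx
import React, { createContext, use } from 'react'

const ThemeContext = createContext<'dark' | 'light'>('light')

// React 19: render the context object directly instead of <ThemeContext.Provider>
export const ThemeProvider: React.FC<{ children: React.ReactNode }> = ({ children }) => (
  <ThemeContext value="dark">{children}</ThemeContext>
)

// Prefer `use` over `useContext`; unlike hooks, it may be called inside conditionals
export const useTheme = () => use(ThemeContext)
```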
2025-03-12 15:48:20 -04:00
Jarrod Flesch
4defa33221 fix(ui): add RowLabelProvider context for blocks row labels (#11664) 2025-03-12 13:08:18 -04:00
Jacob Fletcher
ac1e3cf69e feat(ui): form state queues (#11579)
Implements a form state task queue. This will prevent onChange handlers
within the form component from processing unnecessarily often, sometimes
long after the user has stopped making changes. This leads to a
potentially huge number of network requests if those changes were made
slower than the debounce rate. This is especially noticeable on slow
networks.

Does so through a new `useQueue` hook. This hook maintains a stack of
events that need processing but only processes the final event to
arrive. Every time a new event is pushed to the stack, the currently
running process is aborted (if any), and that event becomes the next in
the queue. This results in a shocking reduction in the time it takes
between the final change to form state and the final network response, from
~1.5 minutes to ~3 seconds (depending on the scenario, see below).
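A hedged, simplified sketch of the latest-wins idea (not the actual `useQueue` implementation, and without the abort behavior described above):

```ts
export const createLatestWinsQueue = () => {
  let pending: null | (() => Promise<void>) = null
  let running = false

  const run = async () => {
    if (running) return
    running = true
    while (pending) {
      const task = pending
      pending = null // tasks queued while this one runs replace each other
      await task()
    }
    running = false
  }

  return (task: () => Promise<void>) => {
    pending = task // only the most recently queued task will actually run
    void run()
  }
}
```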

This likely fixes a number of existing open issues. I will link those
issues here once they are identified and verifiably fixed.

Before:

I'm typing slowly here to ensure my changes aren't debounced by the form.
There are a total of 60 characters typed, triggering 58 network requests
and taking around 1.5 minutes to complete after the final change was
made.


https://github.com/user-attachments/assets/49ba0790-a8f8-4390-8421-87453ff8b650

After:

Here there are a total of 69 characters typed, triggering 11 network
requests and taking only about 3 seconds to complete after the final
change was made.


https://github.com/user-attachments/assets/447f8303-0957-41bd-bb2d-9e1151ed9ec3
2025-03-10 21:25:14 -04:00
Jessica Chowdhury
9ac7a3ed49 fix(ui): adds fallback locale when defaultLocale is unavailable (#11614) 2025-03-10 15:20:58 -04:00
Patrik
3ede7abe00 feat: threads path through field validate function (#11591)
This PR updates the field `validate` function property to include a new
`path` argument.

The `path` arg provides the schema path of the field, including array
indices where applicable.

#### Changes:

- Added `path: (number | string)[]` in the ValidateOptions type.
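A hedged sketch of a validate function reading the new argument (other options follow the existing ValidateOptions shape):

```ts
{
  name: 'score',
  type: 'number',
  validate: (value, { path }) => {
    if (typeof value === 'number' && value < 0) {
      // `path` includes array indices, e.g. ['rows', 0, 'score']
      return `"${path.join('.')}" must not be negative`
    }
    return true
  },
}
```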
2025-03-10 11:41:23 -04:00
Jarrod Flesch
48115311e7 fix(ui): incorrect error states (#11574)
Fixes https://github.com/payloadcms/payload/issues/11568

### What? Out of sync error states
- Collapsibles & Tabs were not reporting accurate child error counts
- Arrays could get into a state where they would not update their error
states
- Slight issue with toasts 

### Tabs & Collapsibles
The logic for determining matching field paths was not functioning as
intended. Fields were attempting to match with paths such as `_index-0`
which will not work.

### Arrays
The form state was not updating when the server sent back errorPaths.
This PR adds `errorPaths` to `serverPropsToAccept`.

### Toasts
Some toasts could report errors in the form of `my > > error`. This
ensures they will be `my > error`

### Misc
Removes 2 files that were not in use:
- `getFieldStateFromPaths.ts`
- `getNestedFieldState.ts`
2025-03-06 14:02:10 -05:00
Jarrod Flesch
2163b0fdb5 feat(ui): improves field error toast messages (#11521)
### What?
Adjusts how field errors are displayed within toasts so they are easier
to read.

![Frame 36
(1)](https://github.com/user-attachments/assets/3debec4f-8d78-42ef-84bc-efd574a63ac6)
2025-03-05 14:28:26 -05:00
Patrik
ba30d7641f feat: threads path through field condition functions (#11528)
This PR updates the field `condition` function property to include a new
`path` argument.

The `path` arg provides the schema path of the field, including array
indices where applicable.

#### Changes:

- Added `path: (number | string)[]` in the Condition type.
- Updated relevant condition checks to ensure correct parameter usage.
2025-03-05 12:45:08 -05:00
Sasha
31e217967e fix(ui): execute client upload handler only when file exists (#11538)
Fixes https://github.com/payloadcms/payload/issues/11537
2025-03-05 16:05:19 +02:00
Alessio Gravili
6d8aca5ab3 fix(richtext-lexical): ensure nested forms do not use form element (#11462)
Previously, lexical blocks initialized a new `Form` component that rendered as `<form>` in the DOM. This may lead to React errors, as forms nested within forms are not valid HTML.

This PR changes them to render as `<div>` in the DOM instead.
2025-02-28 19:29:07 -07:00
Alessio Gravili
7118b6418f fix(ui): disable publish button if form is autosaving (#11343)
Fixes https://github.com/payloadcms/payload/issues/6648

This PR introduces a new `useFormBackgroundProcessing` hook and a corresponding `setBackgroundProcessing` function in the `useForm` hook.

Unlike `useFormProcessing` / `setProcessing`, which mark the entire form as read-only, this new approach only disables the Publish button during autosaving, keeping form fields editable for a better user experience.

I named it `backgroundProcessing` because it should run behind the scenes without disrupting the user. You could argue that it is a bit more generic than something like `isAutosaving`, but it signals intent: Background = do not disrupt the user.
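A hedged sketch of consuming the new hook (assuming it is exported from `@payloadcms/ui` like the other form hooks):

```tsx
'use client'
import { useFormBackgroundProcessing } from '@payloadcms/ui'
import React from 'react'

export const PublishButton: React.FC<{ onClick: () => void }> = ({ onClick }) => {
  // True while e.g. autosave runs in the background; fields stay editable
  const backgroundProcessing = useFormBackgroundProcessing()

  return (
    <button disabled={backgroundProcessing} onClick={onClick} type="button">
      Publish
    </button>
  )
}
```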
2025-02-27 10:28:08 -07:00
Alessio Gravili
2a3682ff68 fix(deps): ensure Next.js 15.2.0 compatibility, upgrade nextjs and @types/react versions in monorepo (#11419)
This bumps next.js to 15.2.0 in our monorepo, as well as all @types/react and @types/react-dom versions. Additionally, it removes the obsolete `peerDependencies` property from our root package.json.

This PR also fixes 2 bugs introduced by Next.js 15.2.0. This highlights why running our test suite against the latest Next.js version, to make sure Payload is compatible, is important.

## 1. handleWhereChange running endlessly

Upgrading to Next.js 15.2.0 caused `handleWhereChange` to be continuously called by a `useEffect` when the list view filters were opened, leading to a React error - I did not investigate why upgrading the Next.js version caused that, but this PR fixes it by making use of the more predictable `useEffectEvent`.

## 2. Custom Block and Array label React key errors

Upgrading to Next.js 15.2.0 caused React key errors when rendering custom block and array row labels on the server. This has been fixed by rendering those with a key.

## 3. Table React key errors

When rendering a `Table`, a React key error is thrown since Next.js 15.2.0.
2025-02-27 05:56:09 +00:00
Sasha
b540da53ec feat(storage-*): large file uploads on Vercel (#11382)
Currently, usage of Payload on Vercel has a limitation - uploads are
limited to a 4.5MB file size.
This PR allows you to pass `clientUploads: true` to all existing storage
adapters
* Storage S3
* Vercel Blob
* Google Cloud Storage
* Uploadthing
* Azure Blob

And then, Payload will do uploads on the client instead. With the S3
Adapter it uses signed URLs and with Vercel Blob it does this -
https://vercel.com/guides/how-to-bypass-vercel-body-size-limit-serverless-functions#step-2:-create-a-client-upload-route.
Note that this doesn't mean anyone can now upload files to your
storage; it still does auth checks, and you can customize that with
`clientUploads.access`.
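A hedged sketch of enabling this on the S3 adapter (option values beyond `clientUploads` are placeholders):

```ts
import { s3Storage } from '@payloadcms/storage-s3'

export const storage = s3Storage({
  // Uploads happen from the browser via signed URLs, bypassing the 4.5MB limit
  clientUploads: true,
  collections: {
    media: true,
  },
  bucket: process.env.S3_BUCKET ?? '',
  config: {
    region: process.env.S3_REGION,
  },
})
```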


https://github.com/user-attachments/assets/5083c76c-8f5a-43dc-a88c-9ddc4527d91c

Implements https://github.com/payloadcms/payload/discussions/7569
feature request.
2025-02-26 21:59:34 +02:00
Jacob Fletcher
2477fc6c75 fix(ui): custom block labels stale when reordering blocks (#11367)
When blocks have custom row labels, those row labels become stale when
reordering blocks. After moving a block, for example, the row label will
jump back to the original block until form state returns with the proper
rendering order. This is especially evident on slow networks.
2025-02-24 16:52:12 +00:00
Sasha
4224c68002 docs: fix links to react hooks (#11344) 2025-02-22 13:23:00 +00:00
Alessio Gravili
132852290a fix(ui): database errors when running autosave and ensure autosave doesn't run unnecessarily (#11270)
## Change 1 - database errors when running autosave

The previous autosave implementation allowed multiple autosave fetch
calls (=> save document draft) to run in parallel. While the
AbortController aborted previous autosave calls when a new one came in, in
order to only process the latest one, this had one flaw:

Using the AbortController to abort the autosave call only aborted the
`fetch` call - it did not however abort the database operation that may
have started as part of this fetch call. If you then started a new
autosave call, this will start yet another database operation on the
backend, resulting in two database operations that would be running at
the same time.

This has caused a lot of transaction errors that were only noticeable
when connected to a slower, remote database. This PR removes the
AbortController and ensures that the previous autosave operation is
properly awaited before starting a new one, while still discarding
outdated autosave requests from the queue **that have not started yet**.

Additionally, it cleans up the AutoSave component to make it more
readable.

## Change 2 - ensure autosave doesn't run unnecessarily

If connected to a slower backend or database, one change in a document
may trigger two autosave operations instead of just one. This is how it
could happen:

1. Type something => formstate changes => autosave is triggered
2. 200ms later: form state request is triggered. Autosave is still
processing
3. 100ms later: form state comes back from server => local form state is
updated => another autosave is triggered
4. First autosave is aborted - this led to a browser error. This PR
ensures that that error is no longer surfaced to the user
5. Another autosave is started

This PR adds additional checks to ONLY trigger an autosave if the form
DATA (not the entire form state itself) changes. Previously, it ran
every time the object reference of the form state changes. This includes
changes that do not affect the form data, like `field.valid`. =>
Basically every time form state comes back from the server, we were
triggering another, unnecessary autosave
2025-02-19 03:38:32 +00:00
Paul
06debf5e14 fix(ui): issues with prevent leave and autosave when the form is submitted but invalid (#11233)
Fixes https://github.com/payloadcms/payload/issues/11224
Fixes https://github.com/payloadcms/payload/issues/10492

This PR fixes a few weird behaviours when `validate: true` is set on drafts:
- when autosave is on and you submit an invalid form it would get stuck in an infinite loop
- PreventLeave would not trigger for submitted but invalid forms leading to potential data loss

Changes:
- Adds e2e tests for the above scenarios
- Adds a new `isValid` flag on the `Form` context provider to signal globally if the form is in a valid or invalid state
  - Components like Autosave will manage this internally since it manages its own submission flow as well
- Adds PreventLeave to Autosave too for when the form is invalid, meaning data hasn't actually been saved, so we want to prevent the user from accidentally losing data by reloading or closing the page


The following tests have been added
![image](https://github.com/user-attachments/assets/db208aa4-6ed6-4287-b200-59575cd3c9d0)
2025-02-18 12:12:41 -07:00
Alessio Gravili
4c8cafd6a6 perf: deduplicate blocks used in multiple places using new config.blocks property (#10905)
If you have multiple blocks that are used in multiple places, this can quickly blow up the size of your Payload Config. This will incur a performance hit, as more data is
1.  sent to the client (=> bloated `ClientConfig` and large initial html) and
2. processed on the server (permissions are calculated every single time you navigate to a page - this iterates through all blocks you have defined, even if they're duplicative)

This can be optimized by defining your block **once** in your Payload Config, and just referencing the block slug whenever it's used, instead of passing the entire block config. To do this, the block can be defined in the `blocks` array of the Payload Config. The slug can then be passed to the `blockReferences` array in the Blocks Field - the `blocks` array has to be empty for compatibility reasons.

```ts
import { buildConfig } from 'payload'
import { lexicalEditor, BlocksFeature } from '@payloadcms/richtext-lexical'

// Payload Config
const config = buildConfig({
  // Define the block once
  blocks: [
    {
      slug: 'TextBlock',
      fields: [
        {
          name: 'text',
          type: 'text',
        },
      ],
    },
  ],
  collections: [
    {
      slug: 'collection1',
      fields: [
        {
          name: 'content',
          type: 'blocks',
          // Reference the block by slug
          blockReferences: ['TextBlock'],
          blocks: [], // Required to be empty, for compatibility reasons
        },
      ],
    },
     {
      slug: 'collection2',
      fields: [
        {
          name: 'editor',
          type: 'richText',
          editor: lexicalEditor({
            features: ({ defaultFeatures }) => [
              ...defaultFeatures,
              BlocksFeature({
                // Same reference can be reused anywhere, even in the lexical editor, without incurring a performance hit
                blocks: ['TextBlock'],
              }),
            ],
          })
        },
      ],
    },
  ],
})
```

## v4.0 Plans

In 4.0, we will remove the `blockReferences` property, and allow string block references to be passed directly to the `blocks` property. Essentially, we'd remove the `blocks` property and rename `blockReferences` to `blocks`.

The reason we opted to a new property in this PR is to avoid breaking changes. Allowing strings to be passed to the `blocks` property will prevent plugins that iterate through fields / blocks from compiling.

## PR Changes

- Testing: This PR introduces a plugin that automatically converts blocks to block references. This is done in the fields__blocks test suite, to run our existing test suite using block references.

- Block References support: Most changes are similar. Everywhere we iterate through blocks, we now have to do the following (see the sketch below):
1. Check if `field.blockReferences` is provided. If so, only iterate through that.
2. Check if the block is an object (= an actual block) or a string.
3. If it's a string, pull the actual block from the Payload Config or from `payload.blocks`.
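A hedged sketch of that lookup pattern (assuming the registered blocks are available keyed by slug):

```ts
import type { Block } from 'payload'

const resolveBlocks = (
  blocksOrSlugs: (Block | string)[],
  blocksBySlug: Record<string, Block>,
): Block[] =>
  blocksOrSlugs.map((block) =>
    typeof block === 'string' ? blocksBySlug[block] : block,
  )

// Usage sketch: prefer blockReferences when present, then resolve string slugs
// const blocks = resolveBlocks(field.blockReferences ?? field.blocks, payload.blocks)
```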

The exception is config sanitization and block type generations. This PR optimizes them so that each block is only handled once, instead of every time the block is referenced.

## Benchmarks

60 Block fields, each block field having the same 600 Blocks.

### Before:
**Initial HTML:** 195 kB
**Generated types:** takes 11 minutes, 461,209 lines

https://github.com/user-attachments/assets/11d49a4e-5414-4579-8050-e6346e552f56

### After:
**Initial HTML:** 73.6 kB
**Generated types:** takes 2 seconds, 35,810 lines

https://github.com/user-attachments/assets/3eab1a99-6c29-489d-add5-698df67780a3

### After Permissions Optimization (follow-up PR)
Initial HTML: 73.6 kB

https://github.com/user-attachments/assets/a909202e-45a8-4bf6-9a38-8c85813f1312


## Future Plans

1. This PR does not yet deduplicate block references during permissions calculation. We'll optimize that in a separate PR, as this one is already large enough
2. The same optimization can be done to deduplicate fields. One common use-case would be link field groups that may be referenced in multiple entities, outside of blocks. We might explore adding a new `fieldReferences` property that allows you to reference those same `config.blocks`.
2025-02-14 00:08:20 +00:00
Jacob Fletcher
3f550bc0ec feat: route transitions (#9275)
Due to the nature of server-side rendering, navigation within the admin
panel can lead to slow page response times. This can lead to the feeling
of an unresponsive app after clicking a link, for example, where the
page remains in a stale state while the server is processing. This is
especially noticeable on slow networks when navigating to data heavy or
process intensive pages.

To alleviate the bad UX that this causes, the user needs immediate
visual indication that _something_ is taking place. This PR renders a
progress bar in the admin panel which is immediately displayed when a
user clicks a link, and incrementally grows in size until the new route
has loaded in.

Inspired by https://github.com/vercel/react-transition-progress.

Old:

https://github.com/user-attachments/assets/1820dad1-3aea-417f-a61d-52244b12dc8d

New:

https://github.com/user-attachments/assets/99f4bb82-61d9-4a4c-9bdf-9e379bbafd31

To tie into the progress bar, you'll need to use Payload's new `Link`
component instead of the one provided by Next.js:

```diff
- import Link from 'next/link'
+ import { Link } from '@payloadcms/ui'
```

Here's an example:

```tsx
import { Link } from '@payloadcms/ui'

const MyComponent = () => {
  return (
    <Link href="/somewhere">
      Go Somewhere
    </Link>
  )
}
```

In order to trigger route transitions for a direct router event such as
`router.push`, you'll need to wrap your function calls with the
`startRouteTransition` method provided by the `useRouteTransition` hook.

```ts
'use client'
import React, { useCallback } from 'react'
import { useRouteTransition } from '@payloadcms/ui'
import { useRouter } from 'next/navigation'

const MyComponent: React.FC = () => {
  const router = useRouter()
  const { startRouteTransition } = useRouteTransition()
 
  const redirectSomewhere = useCallback(() => {
    startRouteTransition(() => router.push('/somewhere'))
  }, [startRouteTransition, router])
 
  // ...
}
```

In the future [Next.js might provide native support for
this](https://github.com/vercel/next.js/discussions/41934#discussioncomment-12077414),
and if it does, this implementation can likely be simplified.

Of course there are other ways of achieving this, such as with
[Suspense](https://react.dev/reference/react/Suspense), but they all
come with a different set of caveats. For example with Suspense, you
must provide a fallback component. This means that the user might be
able to immediately navigate to the new page, which is good, but they'd
be presented with a skeleton UI while the other parts of the page stream
in. Not necessarily an improvement to UX as there would be multiple
loading states with this approach.

There are other problems with using Suspense as well. Our default
template, for example, contains the app header and sidebar which are not
rendered within the root layout. This means that they need to stream in
every single time. On fast networks, this would also lead to a
noticeable "blink" unless there is some mechanism by which we can detect
and defer the fallback from ever rendering in such cases. Might still be
worth exploring in the future though.
2025-02-13 09:48:13 -05:00
Jacob Fletcher
2a0094def7 fix(ui): relationship filterOptions not applied within the list view (#11008)
Fixes #10440. When `filterOptions` are set on a relationship field,
those same filters are not applied to the `Filter` component within the
list view. This is because `filterOptions` is not being thread into the
`RelationshipFilter` component responsible for populating the available
options.

To do this, we first need to resolve the filter options on the server
as they accept functions. Once resolved, they can be prop-drilled into
the proper component and appended onto the client-side "where" query.

Reliant on #11080.
2025-02-11 13:20:55 -05:00