Compare commits


34 Commits

Author SHA1 Message Date
Alessio Gravili
6562f08f8a fix jsdocs 2025-04-01 16:20:51 -06:00
Alessio Gravili
51a1ce36e1 docs: fix draft docs and jsdocs for payload.create draft argument 2025-04-01 16:11:59 -06:00
Sasha
e5690fcab9 fix(graphql): respect draft: true when querying joins (#11869)
The same as https://github.com/payloadcms/payload/pull/11763 but also
for GraphQL. The previous fix was working only for the Local API and
REST API due to a different method for querying joins in GraphQL.
2025-04-01 14:41:47 -04:00
Elliot DeNolf
4ac6d21ef6 chore(release): v3.32.0 [skip ci] 2025-04-01 14:27:01 -04:00
Germán Jabloñski
d963e6a54c feat: orderable collections (#11452)
Closes https://github.com/payloadcms/payload/discussions/1413

### What?

Introduces a new `orderable` boolean property on collections that allows
dragging and dropping rows to reorder them:



https://github.com/user-attachments/assets/8ee85cf0-add1-48e5-a0a2-f73ad66aa24a

### Why?

[One of the most requested
features](https://github.com/payloadcms/payload/discussions/1413).
Additionally, a poor implementation can be very costly in terms of
performance.

This can be especially useful for implementing custom views like kanban.

### How?

We are using fractional indexing. In its simplest form, it consists of
calculating the order of an item to be inserted as the average of its
two adjacent elements.
There is [a famous article by David
Greenspan](https://observablehq.com/@dgreensp/implementing-fractional-indexing)
that solves the problem of running out of keys after several partitions.
We are using his algorithm, implemented [in this
library](https://github.com/rocicorp/fractional-indexing).

This means that if you insert, delete or move documents in the
collection, you do not have to modify the order of the rest of the
documents, making the operation more performant.
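
In its simplest numeric form, the idea looks like this. A minimal sketch assuming plain number keys; the shipped feature uses the string-key algorithm from the library linked above:

```ts
// Naive fractional index: a new or moved row's order is the average of its
// neighbors, so only that one row is ever written. Plain floats run out of
// precision after repeated splits, which is what the string-key algorithm
// in rocicorp/fractional-indexing avoids.
function orderBetween(prev: number | null, next: number | null): number {
  if (prev === null && next === null) return 1 // first row in the collection
  if (prev === null) return next! - 1 // moved to the top
  if (next === null) return prev + 1 // moved to the bottom
  return (prev + next) / 2 // dropped between two rows
}

// Moving a document between rows ordered 1 and 2 only writes its own order:
const newOrder = orderBetween(1, 2) // 1.5
```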

---------

Co-authored-by: Dan Ribbens <dan.ribbens@gmail.com>
2025-04-01 14:11:11 -04:00
Dan Ribbens
968a066f45 fix: typescriptSchema override required to false (#11941)
### What?
Previously, if you used `typescriptSchema` and the returned schema marked the
field as not required, the field would still be required anyway.

### Why?
We were marking fields as required on the collection without checking the
schema returned from `typescriptSchema` functions.

### How?
This changes the order of logic so that `requiredFieldNames` on the
collection is populated only after running and checking the field schema.
2025-04-01 11:35:31 -04:00
Jacob Fletcher
373f6d1032 fix(ui): nested fields disappear when manipulating rows in form state (#11906)
Continuation of #11867. When rendering custom fields nested within
arrays or blocks, such as the Lexical rich text editor which is treated
as a custom field, these fields will sometimes disappear when form state
requests are invoked sequentially. This is especially reproducible on
slow networks.

This is different from the previous PR in that this issue is caused by
adding _rows_ back-to-back, whereas the previous issue was caused when
adding a single row followed by a change to another field.

Here's a screen recording demonstrating the issue:


https://github.com/user-attachments/assets/5ecfa9ec-b747-49ed-8618-df282e64519d

The problem is that `requiresRender` is never sent in the form state
request for row 2. This is because the [task
queue](https://github.com/payloadcms/payload/pull/11579) processes tasks
within a single `useEffect`. This forces React to batch the results of
these tasks into a single rendering cycle. So if request 1 sets state
that request 2 relies on, request 2 will never use that state since
they'll execute within the same lifecycle.

Here's a play-by-play of the current behavior:

1. The "add row" event is dispatched
    a. This sets `requiresRender: true` in form state
1. A form state request is sent with `requiresRender: true`
1. While that request is processing, another "add row" event is
dispatched
    a. This sets `requiresRender: true` in form state
    b. This adds a form state request into the queue
1. The initial form state request finishes
    a. This sets `requiresRender: false` in form state
1. The next form state request that was queued up in 3b is sent with
`requiresRender: false`
    a. THIS IS EXPECTED, BUT SHOULD ACTUALLY BE `true`!!

To fix this, we need to ensure that the `requiresRender` property is
persisted into the second request instead of overridden. To do this, we can
add a new `serverPropsToIgnore` property to form state which is read when
processing the results from the server. So if `requiresRender` exists in
`serverPropsToIgnore`, we do not merge it. This works because we actually
mutate form state in between requests, so request 2 can read the results
from request 1 without going through an additional rendering cycle.
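
A minimal sketch of that merge rule, using names from the description above (the actual implementation may differ):

```ts
type FieldState = {
  requiresRender?: boolean
  serverPropsToIgnore?: string[]
  [key: string]: unknown
}

// Merge the server's response into local form state, skipping any props the
// client flagged in `serverPropsToIgnore`, so an in-flight
// `requiresRender: true` is not overridden by a stale `false`.
function mergeServerState(local: FieldState, server: FieldState): FieldState {
  const ignore = new Set(local.serverPropsToIgnore ?? [])
  const merged: FieldState = { ...local }
  for (const [key, value] of Object.entries(server)) {
    if (!ignore.has(key)) {
      merged[key] = value
    }
  }
  return merged
}
```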

Here's a play-by-play of the fix:

1. The "add row" event is dispatched
    a. This sets `requiresRender: true` in form state
b. This adds a task in the queue to mutate form state with
`requiresRender: true`
1. A form state request is sent with `requiresRender: true`
1. While that request is processing, another "add row" event is
dispatched
a. This sets `requiresRender: true` in form state AND
`serverPropsToIgnore: [ "requiresRender" ]`
    c. This adds a form state request into the queue
1. The initial form state request finishes
a. This returns `requiresRender: false` from the form state endpoint BUT
IS IGNORED
1. The next form state request that was queued up in 3c is sent with
`requiresRender: true`
2025-04-01 09:54:22 -04:00
dependabot[bot]
329cd0b876 chore(deps): bump mongodb-github-action (#10921)
Bumps the github_actions group with 1 update in the / directory:
[supercharge/mongodb-github-action](https://github.com/supercharge/mongodb-github-action).
Bumps the github_actions group with 1 update in the /.github/workflows
directory:
[supercharge/mongodb-github-action](https://github.com/supercharge/mongodb-github-action).

Updates `supercharge/mongodb-github-action` from 1.11.0 to 1.12.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supercharge/mongodb-github-action/releases">supercharge/mongodb-github-action's
releases</a>.</em></p>
<blockquote>
<h2>1.12.0</h2>
<p>Release 1.12.0</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/supercharge/mongodb-github-action/blob/main/CHANGELOG.md">supercharge/mongodb-github-action's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/superchargejs/mongodb-github-action/compare/v1.11.0...v1.12.0">1.12.0</a>
- 2025-01-05</h2>
<h3>Added</h3>
<ul>
<li>added <code>mongodb-image</code> input: this option allows you to
define a custom Docker container image. It uses <code>mongo</code> by
default, but you may specify an image from a different registry than
Docker hub. Please check the Readme for details.</li>
</ul>
<h3>Updated</h3>
<ul>
<li>bump dependencies</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="90004df786"><code>90004df</code></a>
bump node and mongodb versions</li>
<li><a
href="b5fa058527"><code>b5fa058</code></a>
bump version to 1.12.0 in readme</li>
<li><a
href="369a992ac4"><code>369a992</code></a>
update changelog</li>
<li><a
href="08d5bf96ab"><code>08d5bf9</code></a>
bump deps</li>
<li><a
href="cbbc6f8110"><code>cbbc6f8</code></a>
Merge pull request <a
href="https://redirect.github.com/supercharge/mongodb-github-action/issues/64">#64</a>
from Sam-Bate-ITV/feature/alternative_image</li>
<li><a
href="6131e7ff86"><code>6131e7f</code></a>
wording</li>
<li><a
href="1f93cb7bb1"><code>1f93cb7</code></a>
change README based on PR review</li>
<li><a
href="812452b9eb"><code>812452b</code></a>
use docker hub for CI</li>
<li><a
href="4639b459cd"><code>4639b45</code></a>
apply suggested change</li>
<li><a
href="2ae9a450cf"><code>2ae9a45</code></a>
<a
href="https://redirect.github.com/supercharge/mongodb-github-action/issues/62">#62</a>:
add option for specifying image</li>
<li>See full diff in <a
href="https://github.com/supercharge/mongodb-github-action/compare/1.11.0...1.12.0">compare
view</a></li>
</ul>
</details>
<br />

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-01 09:09:06 -04:00
Germán Jabloñski
6badb5ffcf chore(live-preview): enable TypeScript strict (#11840) 2025-04-01 09:03:39 -04:00
Marcus Forsberg
5b0e0ab788 fix(translations): improve Swedish translations for query presets (#11937)
### What?
Minor changes to Swedish translations added in #11330 to keep wording in
line with changes in #11654.
2025-04-01 10:31:37 +00:00
Alessio Gravili
c844b4c848 feat: configurable job queue processing order (LIFO/FIFO), allow sequential execution of jobs (#11897)
Previously, jobs were executed in FIFO order on MongoDB, and LIFO on
Postgres, with no way to configure this behavior.

This PR makes FIFO the default on both MongoDB and Postgres and
introduces the following new options to configure the processing order
globally or on a queue-by-queue basis:
- a `processingOrder` property to the jobs config
- a `processingOrder` argument to `payload.jobs.run()` to override
what's set in the jobs config

It also adds a new `sequential` option to `payload.jobs.run()`, which
can be useful for debugging.
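
For example, a usage sketch based on the options described above (the `queue` argument follows the existing `payload.jobs.run()` API):

```ts
// Run queued jobs in LIFO order, one at a time.
await payload.jobs.run({
  queue: 'nightly',
  processingOrder: '-createdAt', // overrides what's set in the jobs config
  sequential: true, // execute jobs one after another, useful for debugging
})
```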
2025-03-31 15:00:36 -06:00
Alessio Gravili
9c88af4b20 refactor(drizzle): replace query chaining with dynamic query building (#11923)
This replaces usage of our `chainMethods` helper to dynamically chain
queries with [drizzle dynamic query
building](https://orm.drizzle.team/docs/dynamic-query-building).

This is more type-safe, more readable, and requires less code.
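
The pattern, roughly (a sketch with loosely typed parameters; see the db-sqlite `countDistinct` diff further down for a real instance):

```ts
import { sql } from 'drizzle-orm'

// With `.$dynamic()`, a query can be extended in plain control flow instead
// of accumulating method descriptors for the old `chainMethods` helper.
function buildCountQuery(
  db: any,
  table: any,
  where: any,
  joins: Array<{ condition: any; table: any }>,
) {
  let query = db
    .select({ count: sql`COUNT(1) OVER()` })
    .from(table)
    .where(where)
    .$dynamic()

  for (const { condition, table: joined } of joins) {
    query = query.leftJoin(joined, condition)
  }

  return query
}
```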
2025-03-31 20:37:45 +00:00
Alessio Gravili
9a1c3cf4cc fix: support parallel job queue tasks (#11917)
This adds support for running multiple job queue tasks in parallel
within the same workflow while preventing conflicts. Previously, this
would have caused the following issues:
- Job log entries get lost: the final job log is incomplete, despite all tasks having been executed
- Write conflicts in Postgres, leading to unique constraint violation errors

The solution involves handling job log data updates in a way that avoids
overwriting, and ensuring the final update reflects the latest job log
data. Each job log entry now initializes its own ID, so a given job log
entry's ID remains the same across multiple parallel task executions.

## Postgres

In Postgres, we need to enable transactions for the
`payload.db.updateJobs` operation; otherwise, two tasks updating the
same job in parallel can conflict. This happens because Postgres handles
array rows by deleting them all, then re-inserting (rather than
upserting). The rows are stored in a separate table, and the following
scenario can occur:

Op 1: deletes all job log rows
Op 2: deletes all job log rows
Op 1: inserts 200 job log rows
Op 2: inserts the same 200 job log rows again => `error: duplicate key value violates unique constraint "payload_jobs_log_pkey"`

Because transactions were not used, the rows inserted by Op 1
immediately became visible to Op 2, causing the conflict. Enabling
transactions fixes this. In theory, it can still happen if Op 1 commits
before Op 2 starts inserting (due to the read committed isolation
level), but it should occur far less frequently.

Alongside this change, we should consider inserting the rows using an
upsert (update on conflict), which will get rid of this error
completely. That way, if the insertion of Op 1 is visible to Op 2, Op 2
will simply overwrite it, rather than erroring. Individual job entries
are immutable and job entries cannot be deleted, thus this shouldn't
corrupt any data.
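
Sketched with drizzle, that upsert could look like the following. This is a possibility the PR discusses rather than something it ships, and the table and column names are illustrative:

```ts
import { sql } from 'drizzle-orm'

// Instead of delete-then-reinsert, upsert each log row: if a concurrent
// operation already inserted the same row, overwrite it rather than erroring.
// Safe here because individual job log entries are immutable.
await db
  .insert(payloadJobsLog) // illustrative table object
  .values(logRows)
  .onConflictDoUpdate({
    target: payloadJobsLog.id,
    set: { state: sql`excluded.state` }, // take the incoming row's values
  })
```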

## Mongo

In Mongo, the issue is addressed by ensuring that log row deletions caused
by differing log states in concurrent operations are not merged back into
the client job log, and by making sure the final update includes all job
logs.

There is no duplicate key error in Mongo because the array log resides
in the same document and duplicates are simply upserted. We cannot use
transactions in Mongo, as it appears to lock the document in a way that
prevents reliable parallel updates, leading to:

`MongoServerError: WriteConflict error: this operation conflicted with
another operation. Please retry your operation or multi-document
transaction`
2025-03-31 13:06:05 -06:00
Alessio Gravili
a083d47368 feat(db-*): return database name to unsanitized config (#11913)
You can access the database name from `sanitizedConfig.db.name`, but
currently it's not possible to access the db name from the unsanitized
config.

Plugins only have access to the unsanitized config. This change allows
db adapters to return the db name early, which will allow plugins to
conditionally initialize db-specific functionality.
2025-03-31 12:57:17 -06:00
Patrik
96289bf555 fix(next): block encoded and escaped open redirects in getSafeRedirect (#11907)
### What

This PR hardens the `getSafeRedirect` utility to improve security
around open redirect handling.

### How

- Normalizes and decodes the redirect path using `decodeURIComponent`
- Catches malformed encodings with a try/catch fallback
- Blocks open redirects (see the sketch below)
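
A minimal sketch of those checks, assuming behavior rather than the exact implementation:

```ts
// Reject anything that is not a plain same-origin path once decoded:
// "//evil.com", "/\evil.com" and "https://evil.com" all fall back to "/".
function getSafeRedirect(redirectTo: string, fallback = '/'): string {
  let path: string
  try {
    path = decodeURIComponent(redirectTo) // surfaces encoded tricks like %2F%2F
  } catch {
    return fallback // malformed encoding
  }
  if (
    !path.startsWith('/') || // absolute URLs like https://evil.com
    path.startsWith('//') || // protocol-relative redirects
    path.startsWith('/\\') // backslash variants some browsers normalize
  ) {
    return fallback
  }
  return path
}
```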
2025-03-31 13:11:34 -04:00
Alessio Gravili
a6f7ef837a feat(db-*): export types from main export (#11914)
In 3.0, we made the decision to export all types from the main package
export (e.g. `payload/types` => `payload`). This improves type
discoverability by IDEs and simplifies importing types.

This PR does the same for our db adapters, which still have a separate
`/types` subpath export. While those are kept for
backwards-compatibility, we can remove them in 4.0.
2025-03-31 15:45:02 +00:00
Said Akhrarov
03d4c5b2ee test: deflake versions with autosave e2e (#11919)
### What?
This PR aims to deflake the `test/versions/e2e.spec.ts:925:5 › Versions
› Collections with draft validation › - with autosave - shows a prevent
leave alert when form is submitted but invalid` e2e test.

The issue seems to be that the `fill` call followed by a `page.reload`
sometimes conflicts with autosave, which may cause the test to flake.
### Why?
To deflake this test in CI.

### How?
Adds a single `waitForAutoSaveToRunAndComplete` function call prior to
the last call to `page.reload`. In my testing on my local machine, adding
the `waitForAutoSaveToRunAndComplete` call allows the test to pass every
time. Without it, the test fails consistently.
2025-03-31 09:37:43 -03:00
Nate Schneider
af8c7868d6 docs: capitalization error (#11912)
Fixed a capitalization error at line 180
2025-03-31 10:50:36 +00:00
Alessio Gravili
d1c0989da7 perf: prefer async fs calls (#11918)
Synchronous file system operations such as `readFileSync` block the
event loop, whereas the asynchronous equivalents (like
`await fs.promises.readFile`) do not. This PR replaces certain synchronous
fs calls with their asynchronous counterparts in contexts where async
operations are already in use, improving performance by avoiding event
loop blocking.

Most of the synchronous calls were in our file upload code. Converting
them to async should theoretically free up the event loop and allow
other requests to run in parallel without delay.
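
The shape of the change, as a sketch assuming an async upload handler:

```ts
import fs from 'fs'

async function readUpload(filePath: string): Promise<Buffer> {
  // Before: fs.readFileSync(filePath) would block the event loop here,
  // stalling every other request while the file is read from disk.

  // After: the promise-based API yields to the event loop instead.
  return fs.promises.readFile(filePath)
}
```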
2025-03-29 10:58:54 -06:00
Said Akhrarov
70b9cab393 test: deflake indexed e2e (#11911)
### What?
This PR aims to deflake the indexed fields e2e test in
`test/fields/collections/Indexed/e2e.spec.ts`.

The issue is that this test is set up in a way where sometimes two toasts
will present themselves in the UI. The second toast assertion will then
fail with a strict mode violation, as the toast locator resolves to two
elements.

### Why?
To prevent this test from flaking in CI.

### How?
Adds a new `dismissAfterAssertion` flag to the `assertToastErrors`
helper function which dismisses the toasts. This way, the toasts will
not raise the aforementioned error, as they will be dismissed from the
UI.

The dismissal logic is handled in a separate loop so that the assertions
occur first. This ensures that dismissing a toast does not surface
errors due to the order of the displayed toasts changing.
2025-03-29 01:02:05 +00:00
Maxim Seshuk
4a0bc869dd fix(ui): switching languages does not update cached client config (#11725)
### What?
Fixed client config caching to properly update when switching languages
in the admin UI.

### Why?
Currently, switching languages doesn't fully update the UI because
client config stays cached with previous language translations.

### How?
Created a language-aware caching system that stores separate configs for
each language and only uses cached config when it matches the active
language.

Before:
```typescript
let cachedClientConfig: ClientConfig | null = global._payload_clientConfig

if (!cachedClientConfig) {
  cachedClientConfig = global._payload_clientConfig = null
}

export const getClientConfig = cache(
  (args: { config: SanitizedConfig; i18n: I18nClient; importMap: ImportMap }): ClientConfig => {
    if (cachedClientConfig && !global._payload_doNotCacheClientConfig) {
      return cachedClientConfig
    }
    // ... create new config ...
  }
);
```

After:
```typescript
let cachedClientConfigs: Record<string, ClientConfig> = global._payload_localizedClientConfigs

if (!cachedClientConfigs) {
  cachedClientConfigs = global._payload_localizedClientConfigs = {}
}

export const getClientConfig = cache(
  (args: { config: SanitizedConfig; i18n: I18nClient; importMap: ImportMap }): ClientConfig => {
    const { config, i18n, importMap } = args
    const currentLocale = i18n.language

    if (!global._payload_doNotCacheClientConfig && cachedClientConfigs[currentLocale]) {
      return cachedClientConfigs[currentLocale]
    }
    // ... create new config with correct translations ...
  }
);
```

Also added handling for cache clearing during HMR to ensure
compatibility with the existing system.

Fixes #11406

---------

Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
2025-03-28 17:49:28 -04:00
Jacob Fletcher
62c4e81a1f refactor(ui): replace autosave queue pattern with useQueues hook (#11884)
Replaces the queue pattern used within autosave with the `useQueues`
hook introduced in #11579. To do this, queued tasks now accept an
options object with callbacks that tie into the task lifecycle, such as
before it begins (to prevent it from running) and after it has finished
(to perform side effects).

The `useQueues` hook now also maintains an array of queued tasks as
opposed to individual refs.
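
Conceptually, a queued task with lifecycle callbacks might look like this. The names and shapes below are illustrative only, not the hook's exact API:

```ts
type QueuedTaskOptions = {
  onBeforeRun?: () => boolean // return false to prevent the task from running
  onAfterRun?: () => void // run side effects after the task finishes
}

function enqueue(task: () => Promise<void>, options: QueuedTaskOptions = {}): void {
  if (options.onBeforeRun?.() === false) {
    return // e.g. skip an autosave while another save is already in flight
  }
  void task().then(() => options.onAfterRun?.())
}
```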
2025-03-28 13:54:15 -04:00
Alessio Gravili
2b6313ed48 docs: fix invalid react-hooks docs (#11895)
Our react-hooks docs page is currently not accessible due to an MDX
parsing error caused by a recent introduction of invalid syntax. This PR
fixes it.
2025-03-28 08:39:06 +02:00
Philipp Schneider
21f7ba7b9d feat: change version view modifiedOnly default to true (#11794)
Replaces a more elaborate approach from
https://github.com/payloadcms/payload/pull/11520 with the simplest
solution, just changing the default.
2025-03-27 19:22:41 -03:00
Pranav
b863fd0915 docs: correct spelling of "it" (#11889)
Correct spelling of "it" in configuration/overview.mdx
2025-03-27 15:58:25 +00:00
Alessio Gravili
f34cc637e3 fix(richtext-lexical): incorrectly hidden fields in drawers due to incorrect permissions handling (#11883)
Lexical nested fields are currently not set up to handle access control
on the client properly. Despite that, we were passing parent permissions
to `RenderFields`, which causes certain fields to not show up if the
document does not have `create` permission.
2025-03-26 15:04:55 -06:00
Alessio Gravili
59c9feeb45 templates: pin all payload packages, improve gen-templates script (#11841)
This PR comes with a bunch of improvements to our template generation
script that make it safer and more reliable:

- Bumps all our templates
- Pins all payload packages to the latest version in the gen-templates
script, since using `latest` as the payload version in our templates has
proven unreliable
- Adds the missing `website` entry for our template variations, thus
ensuring its lockfile gets updated
- Adds importmap generation to the gen-templates script
- Adds a new `script:gen-templates:build` script to verify that all
templates still build correctly
2025-03-26 20:52:53 +00:00
Paul
1578cd2425 chore(ui): added selected option as a class to list table cell (#11750)
In the Cell component for a select field, such as our `_status` fields,
a class is now added for the selected option, e.g. `selected--published`,
so it can be easily targeted with CSS.

---------

Co-authored-by: Dan Ribbens <dan.ribbens@gmail.com>
2025-03-26 20:32:42 +00:00
Said Akhrarov
5ae5255ba3 perf(ui): download only images and optimize image selection for document edit view, prioritize best-fit size (#11844)
### What?

In the same vein as #11696, this PR optimizes how images are selected
for display in the document edit view. It ensures that only image files
are processed and selects the most appropriate size to minimize
unnecessary downloads and improve performance.

#### Previously:

- Non-image files were being processed unnecessarily, despite not
generating thumbnails.
- Images without a `thumbnailURL` defaulted to their original full size,
even when smaller, optimized versions were available.

#### Now:

- **Only images** are processed for thumbnails, avoiding redundant
requests for non-images.
- **The smallest available image within a target range** (`40px -
180px`) is prioritized for display.
- **If no images fit within this range**, the logic selects:
  - The next smallest larger image (if available).
  - The **original** image if it is smaller than the next available larger size.
  - The largest **smaller** image if no better fit exists.

### Why?

Prevents unnecessary downloads of non-image files, reduces bandwidth
usage by selecting more efficient image sizes, and improves load times
and performance in the edit view.

### How?

- **Filters out non-image files** when determining which assets to
display.
- Uses the same algorithm as in #11696 but turns it into a reusable
function used in various areas around the codebase, namely the upload
field hasOne and hasMany components (a sketch of the selection rules
follows below).
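
A sketch of those selection rules, with assumed shapes rather than the real reusable function:

```ts
type ImageSize = { url: string; width: number }

// Pick a thumbnail per the rules above. `candidates` should include the
// original file's size alongside the generated renditions, so the original
// wins whenever it is smaller than the next available larger size.
function selectBestFit(candidates: ImageSize[]): ImageSize | undefined {
  const sorted = [...candidates].sort((a, b) => a.width - b.width)
  // 1. Smallest image within the 40-180px target range
  const inRange = sorted.find((s) => s.width >= 40 && s.width <= 180)
  if (inRange) return inRange
  // 2. Otherwise, the next smallest image above the range
  const larger = sorted.find((s) => s.width > 180)
  if (larger) return larger
  // 3. Fall back to the largest smaller image
  return sorted[sorted.length - 1]
}
```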

Before (4.5mb transfer):

![edit-view-before](https://github.com/user-attachments/assets/ff3513b7-b874-48c3-bce7-8a9425243e00)

After (15.9kb transfer):

![edit-view-after](https://github.com/user-attachments/assets/fce8c463-65ae-4f1d-81b5-8781e89f06f1)
2025-03-26 16:13:52 -04:00
Alessio Gravili
98e4db07c3 fix(plugin-cloud-storage): ensure client handlers are added to import map regardless of enabled state (#11880)
There are cases where a storage plugin is disabled during development and
enabled in production. This will result in import maps that differ
depending on whether they're generated during development or production.

In a lot of cases, those import maps are generated during development
only. During production, we just re-use what was generated locally. This
will cause missing import map entries for those plugins that are disabled
during development.

This PR ensures the import map entries are added regardless of the
enabled state of those plugins. This is necessary for our
generate-templates script to not omit the vercel blob storage import map
entry.
2025-03-26 18:13:32 +00:00
Said Akhrarov
6b56343b97 docs: fix links in custom components and custom features (#11881)
### What?
Fixes a few broken links in `docs/custom-components` and
`docs/rich-text`. Also made some custom component links lowercase.

### Why?
To direct end users to the correct location in the docs.

### How?
Changes to `docs/custom-components/custom-views.mdx`,
`docs/custom-components/list-view.mdx`, and
`docs/rich-text/custom-features.mdx`.
2025-03-26 12:12:01 -06:00
Jacob Fletcher
4fc2eec301 fix(ui): query presets are available for unrelated collections (#11872)
When selecting query presets from the list drawer, all query presets are
available for selection, even if unrelated to the underlying collection.
When selecting one of these presets, the list view will crash with
client-side exceptions because the columns and filters that are applied
are incompatible.

The fix is to thread `filterOptions` through the query presets
drawer. This will ensure that only related collections are shown.
2025-03-25 23:45:03 -04:00
Jacob Fletcher
10ac9893ad fix(ui): nested custom components sometimes disappear when queued in form state (#11867)
When rendering custom fields nested within arrays or blocks, such as the
Lexical rich text editor which is treated as a custom field, these
fields will sometimes disappear when form state requests are invoked
sequentially. This is especially reproducible on slow networks.

This is because form state invocations are placed into a [task
queue](https://github.com/payloadcms/payload/pull/11579) which aborts
the currently running tasks when a new one arrives. By doing this, local
form state is never dispatched, and the second task in the queue becomes
stale.

The fix is to _not_ abort the currently running task. This will trigger
a complete rendering cycle, and when the second task is invoked, local
state will be up to date.

Fixes #11340, #11425, and #11824.
2025-03-25 20:40:16 -04:00
Elliot DeNolf
35e6cfbdfc chore(release): v3.31.0 [skip ci] 2025-03-25 14:28:01 -04:00
219 changed files with 12459 additions and 3871 deletions

View File

@@ -83,7 +83,7 @@ jobs:
echo "DATABASE_URI=postgresql://$POSTGRES_USER:$POSTGRES_PASSWORD@localhost:5432/$POSTGRES_DB" >> $GITHUB_ENV
- name: Start MongoDB
uses: supercharge/mongodb-github-action@1.11.0
uses: supercharge/mongodb-github-action@1.12.0
with:
mongodb-version: 6.0

View File

@@ -474,7 +474,7 @@ Field: '/path/to/CustomArrayManagerField',
rows={[
[
{
value: '**\\`path\\`**',
value: '**\\\`path\\\`**',
},
{
value: 'The path to the array or block field',
@@ -482,7 +482,7 @@ Field: '/path/to/CustomArrayManagerField',
],
[
{
value: '**\\`rowIndex\\`**',
value: '**\\\`rowIndex\\\`**',
},
{
value: 'The index of the row to remove',
@@ -561,7 +561,7 @@ Field: '/path/to/CustomArrayManagerField',
rows={[
[
{
value: '**\\`path\\`**',
value: '**\\\`path\\\`**',
},
{
value: 'The path to the array or block field',
@@ -569,7 +569,7 @@ Field: '/path/to/CustomArrayManagerField',
],
[
{
value: '**\\`rowIndex\\`**',
value: '**\\\`rowIndex\\\`**',
},
{
value: 'The index of the row to replace',
@@ -577,7 +577,7 @@ Field: '/path/to/CustomArrayManagerField',
],
[
{
value: '**\\`data\\`**',
value: '**\\\`data\\\`**',
},
{
value: 'The data to replace within the row',
@@ -718,7 +718,7 @@ The `useDocumentInfo` hook provides information about the current document being
| **`currentEditor`** | The user currently editing the document. |
| **`docConfig`** | Either the Collection or Global config of the document, depending on what is being edited. |
| **`docPermissions`** | The current document's permissions. Fallback to collection permissions when no id is present. |
| **`documentIsLocked`** | Whether the document is currently locked by another user. [More details](./locked-documents). |
| **`documentIsLocked`** | Whether the document is currently locked by another user. [More details](./locked-documents). |
| **`getDocPermissions`** | Method to retrieve document-level permissions. |
| **`getDocPreferences`** | Method to retrieve document-level user preferences. [More details](./preferences). |
| **`globalSlug`** | The slug of the global if editing a global document. |
@@ -730,7 +730,7 @@ The `useDocumentInfo` hook provides information about the current document being
| **`initialData`** | The initial data of the document. |
| **`isEditing`** | Whether the document is being edited (as opposed to created). |
| **`isInitializing`** | Whether the document info is still initializing. |
| **`isLocked`** | Whether the document is locked. [More details](./locked-documents). |
| **`isLocked`** | Whether the document is locked. [More details](./locked-documents). |
| **`lastUpdateTime`** | Timestamp of the last update to the document. |
| **`mostRecentVersionIsAutosaved`** | Whether the most recent version is an autosaved version. |
| **`preferencesKey`** | The `preferences` key to use when interacting with document-level user preferences. [More details](./preferences). |
@@ -739,9 +739,9 @@ The `useDocumentInfo` hook provides information about the current document being
| **`setDocumentTitle`** | Method to set the document title. |
| **`setHasPublishedDoc`** | Method to update whether the document has been published. |
| **`title`** | The title of the document. |
| **`unlockDocument`** | Method to unlock a document. [More details](./locked-documents). |
| **`unlockDocument`** | Method to unlock a document. [More details](./locked-documents). |
| **`unpublishedVersionCount`** | The number of unpublished versions of the document. |
| **`updateDocumentEditor`** | Method to update who is currently editing the document. [More details](./locked-documents). |
| **`updateDocumentEditor`** | Method to update who is currently editing the document. [More details](./locked-documents). |
| **`updateSavedDocumentData`** | Method to update the saved document data. |
| **`uploadStatus`** | Status of any uploads in progress ('idle', 'uploading', or 'failed'). |
| **`versionCount`** | The current version count of the document. |

View File

@@ -73,6 +73,7 @@ The following options are available:
| `fields` \* | Array of field types that will determine the structure and functionality of the data stored within this Collection. [More details](../fields/overview). |
| `graphQL` | Manage GraphQL-related properties for this collection. [More](#graphql) |
| `hooks` | Entry point for Hooks. [More details](../hooks/overview#collection-hooks). |
| `orderable` | If true, enables custom ordering for the collection, and documents can be reordered via drag and drop. Uses [fractional indexing](https://observablehq.com/@dgreensp/implementing-fractional-indexing) for efficient reordering. |
| `labels` | Singular and plural labels for use in identifying this Collection throughout Payload. Auto-generated from slug if not defined. |
| `enableQueryPresets` | Enable query presets for this Collection. [More details](../query-presets/overview). |
| `lockDocuments` | Enables or disables document locking. By default, document locking is enabled. Set to an object to configure, or set to `false` to disable locking. [More details](../admin/locked-documents). |
@@ -177,7 +178,7 @@ The following options are available:
```ts
import type { CollectionCOnfig } from 'payload'
export const MyCollection: CollectionCOnfig = {
export const MyCollection: CollectionConfig = {
// ...
admin: {
components: {

View File

@@ -147,7 +147,7 @@ _\* Config location detection is different between development and production en
<Banner type="warning">
**Important:** Ensure your `tsconfig.json` is properly configured for Payload
to auto-detect your config location. If if does not exist, or does not specify
to auto-detect your config location. If it does not exist, or does not specify
the proper `compilerOptions`, Payload will default to the current working
directory.
</Banner>

View File

@@ -55,7 +55,7 @@ For more granular control, pass a configuration object instead. Payload exposes
| `exact` | Boolean. When true, will only match if the path matches the `usePathname()` exactly. |
| `strict` | When true, a path that has a trailing slash will only match a `location.pathname` with a trailing slash. This has no effect when there are additional URL segments in the pathname. |
| `sensitive` | When true, will match if the path is case sensitive. |
| `meta` | Page metadata overrides to apply to this view within the Admin Panel. [More details](./metadata). |
| `meta` | Page metadata overrides to apply to this view within the Admin Panel. [More details](../admin/metadata). |
_\* An asterisk denotes that a property is required._

View File

@@ -6,13 +6,13 @@ desc:
keywords: admin, components, custom, documentation, Content Management System, cms, headless, javascript, node, react, nextjs
---
The List View is where users interact with a list of [Collection](../collections/overview) Documents within the [Admin Panel](../admin/overview). This is where they can view, sort, filter, and paginate their documents to find exactly what they're looking for. This is also where users can perform bulk operations on multiple documents at once, such as deleting, editing, or publishing many.
The List View is where users interact with a list of [Collection](../configuration/collections) Documents within the [Admin Panel](../admin/overview). This is where they can view, sort, filter, and paginate their documents to find exactly what they're looking for. This is also where users can perform bulk operations on multiple documents at once, such as deleting, editing, or publishing many.
The List View can be swapped out in its entirety for a Custom View, or it can be injected with a number of Custom Components to add additional functionality or presentational elements without replacing the entire view.
<Banner type="info">
**Note:** Only [Collections](../collections/overview) have a List View.
[Globals](../globals/overview) do not have a List View as they are single
**Note:** Only [Collections](../configuration/collections) have a List View.
[Globals](../configuration/globals) do not have a List View as they are single
documents.
</Banner>
@@ -90,11 +90,11 @@ The following options are available:
| Path | Description |
| ----------------- | ------------------------------------------------------------------------------------------------------------------------- |
| `beforeList` | An array of custom components to inject before the list of documents in the List View. [More details](#beforeList). |
| `beforeListTable` | An array of custom components to inject before the table of documents in the List View. [More details](#beforeListTable). |
| `afterList` | An array of custom components to inject after the list of documents in the List View. [More details](#afterList). |
| `afterListTable` | An array of custom components to inject after the table of documents in the List View. [More details](#afterListTable). |
| `Description` | A component to render a description of the Collection. [More details](#Description). |
| `beforeList` | An array of custom components to inject before the list of documents in the List View. [More details](#beforelist). |
| `beforeListTable` | An array of custom components to inject before the table of documents in the List View. [More details](#beforelisttable). |
| `afterList` | An array of custom components to inject after the list of documents in the List View. [More details](#afterlist). |
| `afterListTable` | An array of custom components to inject after the table of documents in the List View. [More details](#afterlisttable). |
| `Description` | A component to render a description of the Collection. [More details](#description). |
### beforeList

View File

@@ -138,6 +138,7 @@ powerful Admin UI.
| **`name`** \* | To be used as the property name when retrieved from the database. [More](./overview#field-names) |
| **`collection`** \* | The `slug`s having the relationship field or an array of collection slugs. |
| **`on`** \* | The name of the relationship or upload field that relates to the collection document. Use dot notation for nested paths, like 'myGroup.relationName'. If `collection` is an array, this field must exist for all specified collections |
| **`orderable`** | If true, enables custom ordering and joined documents can be reordered via drag and drop. Uses [fractional indexing](https://observablehq.com/@dgreensp/implementing-fractional-indexing) for efficient reordering. |
| **`where`** | A `Where` query to hide related documents from appearing. Will be merged with any `where` specified in the request. |
| **`maxDepth`** | Default is 1, Sets a maximum population depth for this field, regardless of the remaining depth when this field is reached. [Max Depth](../queries/depth#max-depth). |
| **`label`** | Text used as a field label in the Admin Panel or an object with keys for each language. |

View File

@@ -28,7 +28,7 @@ Then, you could configure two different runner strategies:
As mentioned above, you can queue jobs, but the jobs won't run unless a worker picks up your jobs and runs them. This can be done in four ways:
#### Cron jobs
### Cron jobs
You can use the `jobs.autoRun` property to configure cron jobs:
@@ -63,7 +63,7 @@ export default buildConfig({
and should not be used on serverless platforms like Vercel.
</Banner>
#### Endpoint
### Endpoint
You can execute jobs by making a fetch request to the `/api/payload-jobs/run` endpoint:
@@ -130,7 +130,7 @@ This works because Vercel automatically makes the `CRON_SECRET` environment vari
After the project is deployed to Vercel, the Vercel Cron job will automatically trigger the `/api/payload-jobs/run` endpoint in the specified schedule, running the queued jobs in the background.
#### Local API
### Local API
If you want to process jobs programmatically from your server-side code, you can use the Local API:
@@ -156,7 +156,7 @@ const results = await payload.jobs.runByID({
})
```
#### Bin script
### Bin script
Finally, you can process jobs via the bin script that comes with Payload out of the box.
@@ -169,3 +169,76 @@ In addition, the bin script allows you to pass a `--cron` flag to the `jobs:run`
```sh
npx payload jobs:run --cron "*/5 * * * *"
```
## Processing Order
By default, jobs are processed first in, first out (FIFO). This means that the first job added to the queue will be the first one processed. However, you can also configure the order in which jobs are processed.
### Jobs Configuration
You can configure the order in which jobs are processed in the jobs configuration by passing the `processingOrder` property. This mimics the Payload [sort](../queries/sort) property that's used for functionality such as `payload.find()`.
```ts
export default buildConfig({
// Other configurations...
jobs: {
tasks: [
// your tasks here
],
processingOrder: '-createdAt', // Process jobs in reverse order of creation = LIFO
},
})
```
You can also set this on a queue-by-queue basis:
```ts
export default buildConfig({
// Other configurations...
jobs: {
tasks: [
// your tasks here
],
processingOrder: {
default: 'createdAt', // FIFO
queues: {
nightly: '-createdAt', // LIFO
myQueue: '-createdAt', // LIFO
},
},
},
})
```
If you need even more control over the processing order, you can pass a function that returns the processing order - this function will be called every time a queue starts processing jobs.
```ts
export default buildConfig({
// Other configurations...
jobs: {
tasks: [
// your tasks here
],
processingOrder: ({ queue }) => {
if (queue === 'myQueue') {
return '-createdAt' // LIFO
}
return 'createdAt' // FIFO
},
},
})
```
### Local API
You can configure the order in which jobs are processed in the `payload.jobs.queue` method by passing the `processingOrder` property.
```ts
const createdJob = await payload.jobs.queue({
workflow: 'createPostAndUpdate',
input: {
title: 'my title',
},
processingOrder: '-createdAt', // Process jobs in reverse order of creation = LIFO
})
```

View File

@@ -409,7 +409,7 @@ Explore the APIs available through ClientFeature to add the specific functionali
### Adding a client feature to the server feature
Inside of your server feature, you can provide an [import path](/docs/admin/custom-components/overview#component-paths) to the client feature like this:
Inside of your server feature, you can provide an [import path](/docs/custom-components/overview#component-paths) to the client feature like this:
```ts
import { createServerFeature } from '@payloadcms/richtext-lexical'

View File

@@ -49,17 +49,21 @@ Within the Admin UI, if drafts are enabled, a document can be shown with one of
specify if you are interacting with drafts or with live documents.
</Banner>
#### Updating or creating drafts
#### Updating drafts
If you enable drafts on a collection or global, the `create` and `update` operations for REST, GraphQL, and Local APIs expose a new option called `draft` which allows you to specify if you are creating or updating a **draft**, or if you're just sending your changes straight to the published document. For example, if you pass the query parameter `?draft=true` to a REST `create` or `update` operation, your action will be treated as if you are creating a `draft` and not a published document. By default, the `draft` argument is set to `false`.
If you enable drafts on a collection or global, the `update` operation for REST, GraphQL, and Local APIs exposes a new option called `draft` which allows you to specify if you are updating a **draft**, or if you're just sending your changes straight to the published document. For example, if you pass the query parameter `?draft=true` to a REST `update` operation, your action will be treated as if you are updating a `draft` and not a published document. By default, the `draft` argument is set to `false`.
**Required fields**
If `draft` is enabled while creating or updating a document, all fields are considered as not required, so that you can save drafts that are incomplete.
If `draft` is enabled while updating a document, all fields are considered as not required, so that you can save drafts that are incomplete.
#### Creating drafts
By default, draft-enabled collections will create draft documents when you create a new document. In order to create a published document, you need to pass `_status: 'published'` to the document data.
#### Reading drafts vs. published documents
In addition to the `draft` argument within `create` and `update` operations, a `draft` argument is also exposed for `find` and `findByID` operations.
In addition to the `draft` argument within `update` operations, a `draft` argument is also exposed for `find` and `findByID` operations.
If `draft` is set to `true` while reading a document, **Payload will automatically replace returned document(s) with their newest drafts** if any newer drafts are available.

View File

@@ -1,6 +1,6 @@
{
"name": "payload-monorepo",
"version": "3.30.0",
"version": "3.32.0",
"private": true,
"type": "module",
"scripts": {
@@ -87,6 +87,7 @@
"runts": "cross-env NODE_OPTIONS=--no-deprecation node --no-deprecation --import @swc-node/register/esm-register",
"script:build-template-with-local-pkgs": "pnpm --filter scripts build-template-with-local-pkgs",
"script:gen-templates": "pnpm --filter scripts gen-templates",
"script:gen-templates:build": "pnpm --filter scripts gen-templates --build",
"script:license-check": "pnpm --filter scripts license-check",
"script:list-published": "pnpm --filter releaser list-published",
"script:pack": "pnpm --filter scripts pack-all-to-dest",

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/admin-bar",
"version": "3.30.0",
"version": "3.32.0",
"description": "An admin bar for React apps using Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "create-payload-app",
"version": "3.30.0",
"version": "3.32.0",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-mongodb",
"version": "3.30.0",
"version": "3.32.0",
"description": "The officially supported MongoDB database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -273,6 +273,7 @@ export function mongooseAdapter({
}
return {
name: 'mongoose',
allowIDOnCreate,
defaultIDType: 'text',
init: adapter,

View File

@@ -4,6 +4,7 @@ import type { BaseJob, UpdateJobs, Where } from 'payload'
import type { MongooseAdapter } from './index.js'
import { buildQuery } from './queries/buildQuery.js'
import { buildSortParam } from './queries/buildSortParam.js'
import { getCollection } from './utilities/getEntity.js'
import { getSession } from './utilities/getSession.js'
import { handleError } from './utilities/handleError.js'
@@ -11,8 +12,11 @@ import { transform } from './utilities/transform.js'
export const updateJobs: UpdateJobs = async function updateMany(
this: MongooseAdapter,
{ id, data, limit, req, returning, where: whereArg },
{ id, data, limit, req, returning, sort: sortArg, where: whereArg },
) {
if (!(data?.log as object[])?.length) {
delete data.log
}
const where = id ? { id: { equals: id } } : (whereArg as Where)
const { collectionConfig, Model } = getCollection({
@@ -20,6 +24,14 @@ export const updateJobs: UpdateJobs = async function updateMany(
collectionSlug: 'payload-jobs',
})
const sort: Record<string, unknown> | undefined = buildSortParam({
adapter: this,
config: this.payload.config,
fields: collectionConfig.flattenedFields,
sort: sortArg || collectionConfig.defaultSort,
timestamps: true,
})
const options: MongooseUpdateQueryOptions = {
lean: true,
new: true,
@@ -51,7 +63,7 @@ export const updateJobs: UpdateJobs = async function updateMany(
const documentsToUpdate = await Model.find(
query,
{},
{ ...options, limit, projection: { _id: 1 } },
{ ...options, limit, projection: { _id: 1 }, sort },
)
if (documentsToUpdate.length === 0) {
return null
@@ -66,7 +78,14 @@ export const updateJobs: UpdateJobs = async function updateMany(
return null
}
result = await Model.find(query, {}, options)
result = await Model.find(
query,
{},
{
...options,
sort,
},
)
}
} catch (error) {
handleError({ collection: collectionConfig.slug, error, req })

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-postgres",
"version": "3.30.0",
"version": "3.32.0",
"description": "The officially supported Postgres database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {
@@ -25,9 +25,9 @@
"default": "./src/index.ts"
},
"./types": {
"import": "./src/types.ts",
"types": "./src/types.ts",
"default": "./src/types.ts"
"import": "./src/exports/types-deprecated.ts",
"types": "./src/exports/types-deprecated.ts",
"default": "./src/exports/types-deprecated.ts"
},
"./migration-utils": {
"import": "./src/exports/migration-utils.ts",
@@ -56,7 +56,7 @@
}
},
"main": "./src/index.ts",
"types": "./src/types.ts",
"types": "./src/index.ts",
"files": [
"dist",
"mock.js"
@@ -102,9 +102,9 @@
"default": "./dist/index.js"
},
"./types": {
"import": "./dist/types.js",
"types": "./dist/types.d.ts",
"default": "./dist/types.js"
"import": "./dist/exports/types-deprecated.js",
"types": "./dist/exports/types-deprecated.d.ts",
"default": "./dist/exports/types-deprecated.js"
},
"./migration-utils": {
"import": "./dist/exports/migration-utils.js",

View File

@@ -0,0 +1,20 @@
import type {
Args as _Args,
GeneratedDatabaseSchema as _GeneratedDatabaseSchema,
PostgresAdapter as _PostgresAdapter,
} from '../types.js'
/**
* @deprecated - import from `@payloadcms/db-postgres` instead
*/
export type Args = _Args
/**
* @deprecated - import from `@payloadcms/db-postgres` instead
*/
export type GeneratedDatabaseSchema = _GeneratedDatabaseSchema
/**
* @deprecated - import from `@payloadcms/db-postgres` instead
*/
export type PostgresAdapter = _PostgresAdapter

View File

@@ -208,12 +208,18 @@ export function postgresAdapter(args: Args): DatabaseAdapterObj<PostgresAdapter>
}
return {
name: 'postgres',
allowIDOnCreate,
defaultIDType: payloadIDType,
init: adapter,
}
}
export type {
Args as PostgresAdapterArgs,
GeneratedDatabaseSchema,
PostgresAdapter,
} from './types.js'
export type { MigrateDownArgs, MigrateUpArgs } from '@payloadcms/drizzle/postgres'
export { geometryColumn } from '@payloadcms/drizzle/postgres'
export { sql } from 'drizzle-orm'

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-sqlite",
"version": "3.30.0",
"version": "3.32.0",
"description": "The officially supported SQLite database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {
@@ -25,9 +25,9 @@
"types": "./src/index.ts"
},
"./types": {
"import": "./src/types.ts",
"require": "./src/types.ts",
"types": "./src/types.ts"
"import": "./src/exports/types-deprecated.ts",
"require": "./src/exports/types-deprecated.ts",
"types": "./src/exports/types-deprecated.ts"
},
"./migration-utils": {
"import": "./src/exports/migration-utils.ts",
@@ -56,7 +56,7 @@
}
},
"main": "./src/index.ts",
"types": "./src/types.ts",
"types": "./src/index.ts",
"files": [
"dist",
"mock.js"
@@ -99,9 +99,9 @@
"types": "./dist/index.d.ts"
},
"./types": {
"import": "./dist/types.js",
"require": "./dist/types.js",
"types": "./dist/types.d.ts"
"import": "./dist/exports/types-deprecated.js",
"require": "./dist/exports/types-deprecated.js",
"types": "./dist/exports/types-deprecated.d.ts"
},
"./migration-utils": {
"import": "./dist/exports/migration-utils.js",

View File

@@ -1,6 +1,5 @@
import type { ChainedMethods } from '@payloadcms/drizzle/types'
import type { SQLiteSelect } from 'drizzle-orm/sqlite-core'
import { chainMethods } from '@payloadcms/drizzle'
import { count, sql } from 'drizzle-orm'
import type { CountDistinct, SQLiteAdapter } from './types.js'
@@ -20,30 +19,25 @@ export const countDistinct: CountDistinct = async function countDistinct(
return Number(countResult[0]?.count)
}
const chainedMethods: ChainedMethods = []
let query: SQLiteSelect = db
.select({
count: sql`COUNT(1) OVER()`,
})
.from(this.tables[tableName])
.where(where)
.groupBy(this.tables[tableName].id)
.limit(1)
.$dynamic()
joins.forEach(({ condition, table }) => {
chainedMethods.push({
args: [table, condition],
method: 'leftJoin',
})
query = query.leftJoin(table, condition)
})
// When we have any joins, we need to count each individual ID only once.
// COUNT(*) doesn't work for this well in this case, as it also counts joined tables.
// SELECT (COUNT DISTINCT id) has a very slow performance on large tables.
// Instead, COUNT (GROUP BY id) can be used which is still slower than COUNT(*) but acceptable.
const countResult = await chainMethods({
methods: chainedMethods,
query: db
.select({
count: sql`COUNT(1) OVER()`,
})
.from(this.tables[tableName])
.where(where)
.groupBy(this.tables[tableName].id)
.limit(1),
})
const countResult = await query
return Number(countResult[0]?.count)
}

View File

@@ -0,0 +1,79 @@
import type {
Args as _Args,
CountDistinct as _CountDistinct,
DeleteWhere as _DeleteWhere,
DropDatabase as _DropDatabase,
Execute as _Execute,
GeneratedDatabaseSchema as _GeneratedDatabaseSchema,
GenericColumns as _GenericColumns,
GenericRelation as _GenericRelation,
GenericTable as _GenericTable,
IDType as _IDType,
Insert as _Insert,
MigrateDownArgs as _MigrateDownArgs,
MigrateUpArgs as _MigrateUpArgs,
SQLiteAdapter as _SQLiteAdapter,
SQLiteSchemaHook as _SQLiteSchemaHook,
} from '../types.js'
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type SQLiteAdapter = _SQLiteAdapter
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type Args = _Args
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type CountDistinct = _CountDistinct
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type DeleteWhere = _DeleteWhere
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type DropDatabase = _DropDatabase
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type Execute<T> = _Execute<T>
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type GeneratedDatabaseSchema = _GeneratedDatabaseSchema
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type GenericColumns = _GenericColumns
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type GenericRelation = _GenericRelation
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type GenericTable = _GenericTable
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type IDType = _IDType
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type Insert = _Insert
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type MigrateDownArgs = _MigrateDownArgs
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type MigrateUpArgs = _MigrateUpArgs
/**
* @deprecated - import from `@payloadcms/db-sqlite` instead
*/
export type SQLiteSchemaHook = _SQLiteSchemaHook
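
As the notices above indicate, consumers should switch to the package root; a sketch of the replacement import:

import type { MigrateDownArgs, MigrateUpArgs, SQLiteAdapter } from '@payloadcms/db-sqlite'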

View File

@@ -58,10 +58,6 @@ import { init } from './init.js'
import { insert } from './insert.js'
import { requireDrizzleKit } from './requireDrizzleKit.js'
export type { MigrateDownArgs, MigrateUpArgs } from './types.js'
export { sql } from 'drizzle-orm'
const filename = fileURLToPath(import.meta.url)
export function sqliteAdapter(args: Args): DatabaseAdapterObj<SQLiteAdapter> {
@@ -197,8 +193,32 @@ export function sqliteAdapter(args: Args): DatabaseAdapterObj<SQLiteAdapter> {
}
return {
name: 'sqlite',
allowIDOnCreate,
defaultIDType: payloadIDType,
init: adapter,
}
}
/**
* @todo deprecate /types subpath export in 4.0
*/
export type {
Args as SQLiteAdapterArgs,
CountDistinct,
DeleteWhere,
DropDatabase,
Execute,
GeneratedDatabaseSchema,
GenericColumns,
GenericRelation,
GenericTable,
IDType,
Insert,
MigrateDownArgs,
MigrateUpArgs,
SQLiteAdapter,
SQLiteSchemaHook,
} from './types.js'
export { sql } from 'drizzle-orm'

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-vercel-postgres",
"version": "3.30.0",
"version": "3.32.0",
"description": "Vercel Postgres adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {
@@ -25,9 +25,9 @@
"default": "./src/index.ts"
},
"./types": {
"import": "./src/types.ts",
"types": "./src/types.ts",
"default": "./src/types.ts"
"import": "./src/exports/types-deprecated.ts",
"types": "./src/exports/types-deprecated.ts",
"default": "./src/exports/types-deprecated.ts"
},
"./migration-utils": {
"import": "./src/exports/migration-utils.ts",
@@ -56,7 +56,7 @@
}
},
"main": "./src/index.ts",
"types": "./src/types.ts",
"types": "./src/index.ts",
"files": [
"dist",
"mock.js"
@@ -103,9 +103,9 @@
"default": "./dist/index.js"
},
"./types": {
"import": "./dist/types.js",
"types": "./dist/types.d.ts",
"default": "./dist/types.js"
"import": "./dist/exports/types-deprecated.js",
"types": "./dist/exports/types-deprecated.d.ts",
"default": "./dist/exports/types-deprecated.js"
},
"./migration-utils": {
"import": "./dist/exports/migration-utils.js",

View File

@@ -0,0 +1,20 @@
import type {
Args as _Args,
GeneratedDatabaseSchema as _GeneratedDatabaseSchema,
VercelPostgresAdapter as _VercelPostgresAdapter,
} from '../types.js'
/**
* @deprecated - import from `@payloadcms/db-vercel-postgres` instead
*/
export type Args = _Args
/**
* @deprecated - import from `@payloadcms/db-vercel-postgres` instead
*/
export type GeneratedDatabaseSchema = _GeneratedDatabaseSchema
/**
* @deprecated - import from `@payloadcms/db-vercel-postgres` instead
*/
export type VercelPostgresAdapter = _VercelPostgresAdapter

View File

@@ -205,12 +205,21 @@ export function vercelPostgresAdapter(args: Args = {}): DatabaseAdapterObj<Verce
}
return {
name: 'postgres',
allowIDOnCreate,
defaultIDType: payloadIDType,
init: adapter,
}
}
/**
* @todo deprecate /types subpath export in 4.0
*/
export type {
Args as VercelPostgresAdapterArgs,
GeneratedDatabaseSchema,
VercelPostgresAdapter,
} from './types.js'
export type { MigrateDownArgs, MigrateUpArgs } from '@payloadcms/drizzle/postgres'
export { geometryColumn } from '@payloadcms/drizzle/postgres'
export { sql } from 'drizzle-orm'

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/drizzle",
"version": "3.30.0",
"version": "3.32.0",
"description": "A library of shared functions used by different payload database adapters",
"homepage": "https://payloadcms.com",
"repository": {
@@ -30,13 +30,13 @@
"default": "./src/exports/postgres.ts"
},
"./types": {
"import": "./src/types.ts",
"types": "./src/types.ts",
"default": "./src/types.ts"
"import": "./src/exports/types-deprecated.ts",
"types": "./src/exports/types-deprecated.ts",
"default": "./src/exports/types-deprecated.ts"
}
},
"main": "./src/index.ts",
"types": "./src/types.ts",
"types": "./src/index.ts",
"files": [
"dist",
"mock.js"
@@ -81,9 +81,9 @@
"default": "./dist/exports/postgres.js"
},
"./types": {
"import": "./dist/types.js",
"types": "./dist/types.d.ts",
"default": "./dist/types.js"
"import": "./dist/exports/types-deprecated.js",
"types": "./dist/exports/types-deprecated.d.ts",
"default": "./dist/exports/types-deprecated.js"
}
},
"main": "./dist/index.js",

View File

@@ -13,7 +13,7 @@ import { getTransaction } from './utilities/getTransaction.js'
export const deleteOne: DeleteOne = async function deleteOne(
this: DrizzleAdapter,
{ collection: collectionSlug, req, select, where: whereArg, returning },
{ collection: collectionSlug, req, returning, select, where: whereArg },
) {
const db = await getTransaction(this, req)
const collection = this.payload.collections[collectionSlug].config
@@ -32,9 +32,9 @@ export const deleteOne: DeleteOne = async function deleteOne(
const selectDistinctResult = await selectDistinct({
adapter: this,
chainedMethods: [{ args: [1], method: 'limit' }],
db,
joins,
query: ({ query }) => query.limit(1),
selectFields,
tableName,
where,

View File

@@ -0,0 +1,188 @@
import type {
BaseRawColumn as _BaseRawColumn,
BuildDrizzleTable as _BuildDrizzleTable,
BuildQueryJoinAliases as _BuildQueryJoinAliases,
ChainedMethods as _ChainedMethods,
ColumnToCodeConverter as _ColumnToCodeConverter,
CountDistinct as _CountDistinct,
CreateJSONQueryArgs as _CreateJSONQueryArgs,
DeleteWhere as _DeleteWhere,
DrizzleAdapter as _DrizzleAdapter,
DrizzleTransaction as _DrizzleTransaction,
DropDatabase as _DropDatabase,
EnumRawColumn as _EnumRawColumn,
Execute as _Execute,
GenericColumn as _GenericColumn,
GenericColumns as _GenericColumns,
GenericPgColumn as _GenericPgColumn,
GenericRelation as _GenericRelation,
GenericTable as _GenericTable,
IDType as _IDType,
Insert as _Insert,
IntegerRawColumn as _IntegerRawColumn,
Migration as _Migration,
PostgresDB as _PostgresDB,
RawColumn as _RawColumn,
RawForeignKey as _RawForeignKey,
RawIndex as _RawIndex,
RawRelation as _RawRelation,
RawTable as _RawTable,
RelationMap as _RelationMap,
RequireDrizzleKit as _RequireDrizzleKit,
SetColumnID as _SetColumnID,
SQLiteDB as _SQLiteDB,
TimestampRawColumn as _TimestampRawColumn,
TransactionPg as _TransactionPg,
TransactionSQLite as _TransactionSQLite,
UUIDRawColumn as _UUIDRawColumn,
VectorRawColumn as _VectorRawColumn,
} from '../types.js'
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type BaseRawColumn = _BaseRawColumn
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type BuildDrizzleTable = _BuildDrizzleTable
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type BuildQueryJoinAliases = _BuildQueryJoinAliases
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type ChainedMethods = _ChainedMethods
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type ColumnToCodeConverter = _ColumnToCodeConverter
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type CountDistinct = _CountDistinct
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type CreateJSONQueryArgs = _CreateJSONQueryArgs
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type DeleteWhere = _DeleteWhere
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type DrizzleAdapter = _DrizzleAdapter
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type DrizzleTransaction = _DrizzleTransaction
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type DropDatabase = _DropDatabase
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type EnumRawColumn = _EnumRawColumn
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type Execute<T> = _Execute<T>
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type GenericColumn = _GenericColumn
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type GenericColumns<T> = _GenericColumns<T>
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type GenericPgColumn = _GenericPgColumn
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type GenericRelation = _GenericRelation
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type GenericTable = _GenericTable
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type IDType = _IDType
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type Insert = _Insert
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type IntegerRawColumn = _IntegerRawColumn
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type Migration = _Migration
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type PostgresDB = _PostgresDB
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type RawColumn = _RawColumn
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type RawForeignKey = _RawForeignKey
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type RawIndex = _RawIndex
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type RawRelation = _RawRelation
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type RawTable = _RawTable
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type RelationMap = _RelationMap
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type RequireDrizzleKit = _RequireDrizzleKit
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type SetColumnID = _SetColumnID
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type SQLiteDB = _SQLiteDB
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type TimestampRawColumn = _TimestampRawColumn
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type TransactionPg = _TransactionPg
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type TransactionSQLite = _TransactionSQLite
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type UUIDRawColumn = _UUIDRawColumn
/**
* @deprecated - import from `@payloadcms/drizzle` instead
*/
export type VectorRawColumn = _VectorRawColumn
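
The same migration applies to this package; a sketch of the replacement import:

import type { DrizzleAdapter, PostgresDB, SQLiteDB } from '@payloadcms/drizzle'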

View File

@@ -1,3 +1,6 @@
/**
* @deprecated - will be removed in 4.0. Use query + $dynamic() instead: https://orm.drizzle.team/docs/dynamic-query-building
*/
export type ChainedMethods = {
args: unknown[]
method: string
@@ -7,6 +10,8 @@ export type ChainedMethods = {
* Call and returning methods that would normally be chained together but cannot be because of control logic
* @param methods
* @param query
*
* @deprecated - will be removed in 4.0. Use query + $dynamic() instead: https://orm.drizzle.team/docs/dynamic-query-building
*/
const chainMethods = <T>({ methods, query }: { methods: ChainedMethods; query: T }): T => {
return methods.reduce((query, { args, method }) => {
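
The dynamic query building pattern that replaces ChainedMethods throughout this PR, following the linked Drizzle docs; a sketch with placeholder table and values:

// Build the base query once, then append clauses conditionally
let query = db.select().from(users).$dynamic()
if (typeof limit === 'number' && limit > 0) {
  query = query.limit(limit)
}
const rows = await query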

View File

@@ -3,7 +3,6 @@ import type { FindArgs, FlattenedField, TypeWithID } from 'payload'
import { inArray } from 'drizzle-orm'
import type { DrizzleAdapter } from '../types.js'
import type { ChainedMethods } from './chainMethods.js'
import buildQuery from '../queries/buildQuery.js'
import { selectDistinct } from '../queries/selectDistinct.js'
@@ -62,15 +61,6 @@ export const findMany = async function find({
const orderedIDMap: Record<number | string, number> = {}
let orderedIDs: (number | string)[]
const selectDistinctMethods: ChainedMethods = []
if (orderBy) {
selectDistinctMethods.push({
args: [() => orderBy.map(({ column, order }) => order(column))],
method: 'orderBy',
})
}
const findManyArgs = buildFindManyArgs({
adapter,
collectionSlug,
@@ -84,15 +74,16 @@ export const findMany = async function find({
tableName,
versions,
})
selectDistinctMethods.push({ args: [offset], method: 'offset' })
selectDistinctMethods.push({ args: [limit], method: 'limit' })
const selectDistinctResult = await selectDistinct({
adapter,
chainedMethods: selectDistinctMethods,
db,
joins,
query: ({ query }) => {
if (orderBy) {
query = query.orderBy(() => orderBy.map(({ column, order }) => order(column)))
}
return query.offset(offset).limit(limit)
},
selectFields,
tableName,
where,

View File

@@ -1,5 +1,5 @@
import type { LibSQLDatabase } from 'drizzle-orm/libsql'
import type { SQLiteSelectBase } from 'drizzle-orm/sqlite-core'
import type { SQLiteSelect, SQLiteSelectBase } from 'drizzle-orm/sqlite-core'
import { and, asc, count, desc, eq, or, sql } from 'drizzle-orm'
import {
@@ -16,7 +16,7 @@ import {
import { fieldIsVirtual, fieldShouldBeLocalized } from 'payload/shared'
import toSnakeCase from 'to-snake-case'
import type { BuildQueryJoinAliases, ChainedMethods, DrizzleAdapter } from '../types.js'
import type { BuildQueryJoinAliases, DrizzleAdapter } from '../types.js'
import type { Result } from './buildFindManyArgs.js'
import buildQuery from '../queries/buildQuery.js'
@@ -25,7 +25,6 @@ import { operatorMap } from '../queries/operatorMap.js'
import { getNameFromDrizzleTable } from '../utilities/getNameFromDrizzleTable.js'
import { jsonAggBuildObject } from '../utilities/json.js'
import { rawConstraint } from '../utilities/rawConstraint.js'
import { chainMethods } from './chainMethods.js'
const flattenAllWherePaths = (where: Where, paths: string[]) => {
for (const k in where) {
@@ -612,34 +611,6 @@ export const traverseFields = ({
where: joinQueryWhere,
})
const chainedMethods: ChainedMethods = []
joins.forEach(({ type, condition, table }) => {
chainedMethods.push({
args: [table, condition],
method: type ?? 'leftJoin',
})
})
if (page && limit !== 0) {
const offset = (page - 1) * limit - 1
if (offset > 0) {
chainedMethods.push({
args: [offset],
method: 'offset',
})
}
}
if (limit !== 0) {
chainedMethods.push({
args: [limit],
method: 'limit',
})
}
const db = adapter.drizzle as LibSQLDatabase
for (let key in selectFields) {
const val = selectFields[key]
@@ -654,14 +625,29 @@ export const traverseFields = ({
selectFields.parent = newAliasTable.parent
}
const subQuery = chainMethods({
methods: chainedMethods,
query: db
.select(selectFields as any)
.from(newAliasTable)
.where(subQueryWhere)
.orderBy(() => orderBy.map(({ column, order }) => order(column))),
}).as(subQueryAlias)
let query: SQLiteSelect = db
.select(selectFields as any)
.from(newAliasTable)
.where(subQueryWhere)
.orderBy(() => orderBy.map(({ column, order }) => order(column)))
.$dynamic()
joins.forEach(({ type, condition, table }) => {
query = query[type ?? 'leftJoin'](table, condition)
})
if (page && limit !== 0) {
const offset = (page - 1) * limit - 1
if (offset > 0) {
query = query.offset(offset)
}
}
if (limit !== 0) {
query = query.limit(limit)
}
const subQuery = query.as(subQueryAlias)
if (shouldCount) {
currentArgs.extras[`${columnName}_count`] = sql`${db

View File

@@ -31,6 +31,45 @@ export { buildRawSchema } from './schema/buildRawSchema.js'
export { beginTransaction } from './transactions/beginTransaction.js'
export { commitTransaction } from './transactions/commitTransaction.js'
export { rollbackTransaction } from './transactions/rollbackTransaction.js'
export type {
BaseRawColumn,
BuildDrizzleTable,
BuildQueryJoinAliases,
ChainedMethods,
ColumnToCodeConverter,
CountDistinct,
CreateJSONQueryArgs,
DeleteWhere,
DrizzleAdapter,
DrizzleTransaction,
DropDatabase,
EnumRawColumn,
Execute,
GenericColumn,
GenericColumns,
GenericPgColumn,
GenericRelation,
GenericTable,
IDType,
Insert,
IntegerRawColumn,
Migration,
PostgresDB,
RawColumn,
RawForeignKey,
RawIndex,
RawRelation,
RawTable,
RelationMap,
RequireDrizzleKit,
SetColumnID,
SQLiteDB,
TimestampRawColumn,
TransactionPg,
TransactionSQLite,
UUIDRawColumn,
VectorRawColumn,
} from './types.js'
export { updateGlobal } from './updateGlobal.js'
export { updateGlobalVersion } from './updateGlobalVersion.js'
export { updateJobs } from './updateJobs.js'

View File

@@ -1,10 +1,9 @@
import type { PgTableWithColumns } from 'drizzle-orm/pg-core'
import { count, sql } from 'drizzle-orm'
import type { ChainedMethods } from '../types.js'
import type { BasePostgresAdapter, CountDistinct } from './types.js'
import { chainMethods } from '../find/chainMethods.js'
export const countDistinct: CountDistinct = async function countDistinct(
this: BasePostgresAdapter,
{ db, joins, tableName, where },
@@ -20,30 +19,25 @@ export const countDistinct: CountDistinct = async function countDistinct(
return Number(countResult[0].count)
}
const chainedMethods: ChainedMethods = []
let query = db
.select({
count: sql`COUNT(1) OVER()`,
})
.from(this.tables[tableName])
.where(where)
.groupBy(this.tables[tableName].id)
.limit(1)
.$dynamic()
joins.forEach(({ condition, table }) => {
chainedMethods.push({
args: [table, condition],
method: 'leftJoin',
})
query = query.leftJoin(table as PgTableWithColumns<any>, condition)
})
// When we have any joins, we need to count each individual ID only once.
// COUNT(*) doesn't work well in this case, as it also counts rows from joined tables.
// SELECT COUNT(DISTINCT id) performs very poorly on large tables.
// Instead, a windowed COUNT over GROUP BY id can be used, which is still slower than COUNT(*) but acceptable.
const countResult = await chainMethods({
methods: chainedMethods,
query: db
.select({
count: sql`COUNT(1) OVER()`,
})
.from(this.tables[tableName])
.where(where)
.groupBy(this.tables[tableName].id)
.limit(1),
})
const countResult = await query
return Number(countResult[0].count)
}

View File

@@ -1,7 +1,7 @@
import type { QueryPromise, SQL } from 'drizzle-orm'
import type { SQLiteColumn } from 'drizzle-orm/sqlite-core'
import type { PgSelect } from 'drizzle-orm/pg-core'
import type { SQLiteColumn, SQLiteSelect } from 'drizzle-orm/sqlite-core'
import type { ChainedMethods } from '../find/chainMethods.js'
import type {
DrizzleAdapter,
DrizzleTransaction,
@@ -12,13 +12,11 @@ import type {
} from '../types.js'
import type { BuildQueryJoinAliases } from './buildQuery.js'
import { chainMethods } from '../find/chainMethods.js'
type Args = {
adapter: DrizzleAdapter
chainedMethods?: ChainedMethods
db: DrizzleAdapter['drizzle'] | DrizzleTransaction
joins: BuildQueryJoinAliases
query?: (args: { query: SQLiteSelect }) => SQLiteSelect
selectFields: Record<string, GenericColumn>
tableName: string
where: SQL
@@ -29,42 +27,40 @@ type Args = {
*/
export const selectDistinct = ({
adapter,
chainedMethods = [],
db,
joins,
query: queryModifier = ({ query }) => query,
selectFields,
tableName,
where,
}: Args): QueryPromise<{ id: number | string }[] & Record<string, GenericColumn>> => {
if (Object.keys(joins).length > 0) {
if (where) {
chainedMethods.push({ args: [where], method: 'where' })
}
joins.forEach(({ condition, table }) => {
chainedMethods.push({
args: [table, condition],
method: 'leftJoin',
})
})
let query
let query: SQLiteSelect
const table = adapter.tables[tableName]
if (adapter.name === 'postgres') {
query = (db as TransactionPg)
.selectDistinct(selectFields as Record<string, GenericPgColumn>)
.from(table)
.$dynamic() as unknown as SQLiteSelect
}
if (adapter.name === 'sqlite') {
query = (db as TransactionSQLite)
.selectDistinct(selectFields as Record<string, SQLiteColumn>)
.from(table)
.$dynamic()
}
return chainMethods({
methods: chainedMethods,
query,
if (where) {
query = query.where(where)
}
joins.forEach(({ condition, table }) => {
query = query.leftJoin(table, condition)
})
return queryModifier({
query,
}) as unknown as QueryPromise<{ id: number | string }[] & Record<string, GenericColumn>>
}
}
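
Call sites now express ordering and pagination through the new `query` modifier instead of `chainedMethods`, as seen elsewhere in this PR; a sketch:

const selectDistinctResult = await selectDistinct({
  adapter,
  db,
  joins,
  // applied after the distinct select, where, and joins are built
  query: ({ query }) => query.offset(offset).limit(limit),
  selectFields,
  tableName,
  where,
})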

View File

@@ -37,11 +37,8 @@ import type { DrizzleSnapshotJSON } from 'drizzle-kit/api'
import type { SQLiteRaw } from 'drizzle-orm/sqlite-core/query-builders/raw'
import type { QueryResult } from 'pg'
import type { ChainedMethods } from './find/chainMethods.js'
import type { Operators } from './queries/operatorMap.js'
export { ChainedMethods }
export type PostgresDB = NodePgDatabase<Record<string, unknown>>
export type SQLiteDB = LibSQLDatabase<
@@ -377,3 +374,8 @@ export type RelationMap = Map<
type: 'many' | 'one'
}
>
/**
* @deprecated - will be removed in 4.0. Use query + $dynamic() instead: https://orm.drizzle.team/docs/dynamic-query-building
*/
export type { ChainedMethods } from './find/chainMethods.js'

View File

@@ -1,15 +1,10 @@
import type { LibSQLDatabase } from 'drizzle-orm/libsql'
import type { BaseJob, UpdateJobs, Where } from 'payload'
import type { UpdateJobs, Where } from 'payload'
import { inArray } from 'drizzle-orm'
import toSnakeCase from 'to-snake-case'
import type { ChainedMethods, DrizzleAdapter } from './types.js'
import type { DrizzleAdapter } from './types.js'
import { chainMethods } from './find/chainMethods.js'
import { findMany } from './find/findMany.js'
import buildQuery from './queries/buildQuery.js'
import { transform } from './transform/read/index.js'
import { upsertRow } from './upsertRow/index.js'
import { getTransaction } from './utilities/getTransaction.js'
@@ -17,6 +12,9 @@ export const updateJobs: UpdateJobs = async function updateMany(
this: DrizzleAdapter,
{ id, data, limit: limitArg, req, returning, sort: sortArg, where: whereArg },
) {
if (!(data?.log as object[])?.length) {
delete data.log
}
const whereToUse: Where = id ? { id: { equals: id } } : whereArg
const limit = id ? 1 : limitArg
@@ -25,128 +23,6 @@ export const updateJobs: UpdateJobs = async function updateMany(
const tableName = this.tableNameMap.get(toSnakeCase(collection.slug))
const sort = sortArg !== undefined && sortArg !== null ? sortArg : collection.defaultSort
const dataKeys = Object.keys(data)
// The initial update is when all jobs are being updated to processing and fetched
const isInitialUpdate = dataKeys.length === 1 && dataKeys[0] === 'processing'
if (isInitialUpdate) {
// Performance optimization for the initial update - this needs to happen as quickly as possible
const _db = db as LibSQLDatabase
const rowToInsert: {
id?: number | string
processing: boolean
} = data as { processing: boolean }
const { orderBy, where } = buildQuery({
adapter: this,
fields: collection.flattenedFields,
sort,
tableName,
where: whereToUse,
})
const table = this.tables[tableName]
const jobsLogTable = this.tables['payload_jobs_log']
let idsToUpdate: (number | string)[] = []
let docsToUpdate: BaseJob[] = []
// Fetch all jobs that should be updated. This can't be done in the update query, as
// 1) we need to join the logs table to get the logs for each job
// 2) postgres doesn't support limit on update queries
const jobsQuery = _db
.select({
id: table.id,
})
.from(table)
.where(where)
const chainedMethods: ChainedMethods = []
if (typeof limit === 'number' && limit > 0) {
chainedMethods.push({
args: [limit],
method: 'limit',
})
}
if (orderBy) {
chainedMethods.push({
args: [() => orderBy.map(({ column, order }) => order(column))],
method: 'orderBy',
})
}
docsToUpdate = (await chainMethods({
methods: chainedMethods,
query: jobsQuery,
})) as BaseJob[]
idsToUpdate = docsToUpdate?.map((job) => job.id)
// Now fetch all log entries for these jobs
if (idsToUpdate.length) {
const logsQuery = _db
.select({
id: jobsLogTable.id,
completedAt: jobsLogTable.completedAt,
error: jobsLogTable.error,
executedAt: jobsLogTable.executedAt,
input: jobsLogTable.input,
output: jobsLogTable.output,
parentID: jobsLogTable._parentID,
state: jobsLogTable.state,
taskID: jobsLogTable.taskID,
taskSlug: jobsLogTable.taskSlug,
})
.from(jobsLogTable)
.where(inArray(jobsLogTable._parentID, idsToUpdate))
const logs = await logsQuery
// Group logs by parent ID
const logsByParentId = logs.reduce(
(acc, log) => {
const parentId = log.parentID
if (!acc[parentId]) {
acc[parentId] = []
}
acc[parentId].push(log)
return acc
},
{} as Record<number | string, any[]>,
)
// Attach logs to their respective jobs
for (const job of docsToUpdate) {
job.log = logsByParentId[job.id] || []
}
}
// Perform the actual update
const query = _db
.update(table)
.set(rowToInsert)
.where(inArray(table.id, idsToUpdate))
.returning()
const updatedJobs = (await query) as BaseJob[]
return updatedJobs.map((row) => {
// Attach logs to the updated job
row.log = docsToUpdate.find((job) => job.id === row.id)?.log || []
return transform<BaseJob>({
adapter: this,
config: this.payload.config,
data: row,
fields: collection.flattenedFields,
joinQuery: false,
})
})
}
const jobs = await findMany({
adapter: this,
collectionSlug: 'payload-jobs',
@@ -162,7 +38,7 @@ export const updateJobs: UpdateJobs = async function updateMany(
return []
}
const results: BaseJob[] = []
const results = []
// TODO: We need to batch this to reduce the amount of db calls. This can get very slow if we are updating a lot of rows.
for (const job of jobs.docs) {
@@ -171,7 +47,7 @@ export const updateJobs: UpdateJobs = async function updateMany(
...data,
}
const result = await upsertRow<BaseJob>({
const result = await upsertRow({
id: job.id,
adapter: this,
data: updateData,
@@ -182,6 +58,7 @@ export const updateJobs: UpdateJobs = async function updateMany(
req,
tableName,
})
results.push(result)
}

View File

@@ -3,9 +3,8 @@ import type { UpdateMany } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { ChainedMethods, DrizzleAdapter } from './types.js'
import type { DrizzleAdapter } from './types.js'
import { chainMethods } from './find/chainMethods.js'
import buildQuery from './queries/buildQuery.js'
import { selectDistinct } from './queries/selectDistinct.js'
import { upsertRow } from './upsertRow/index.js'
@@ -45,16 +44,10 @@ export const updateMany: UpdateMany = async function updateMany(
const selectDistinctResult = await selectDistinct({
adapter: this,
chainedMethods: orderBy
? [
{
args: [() => orderBy.map(({ column, order }) => order(column))],
method: 'orderBy',
},
]
: [],
db,
joins,
query: ({ query }) =>
orderBy ? query.orderBy(() => orderBy.map(({ column, order }) => order(column))) : query,
selectFields,
tableName,
where,
@@ -69,28 +62,17 @@ export const updateMany: UpdateMany = async function updateMany(
const table = this.tables[tableName]
const query = _db.select({ id: table.id }).from(table).where(where)
const chainedMethods: ChainedMethods = []
let query = _db.select({ id: table.id }).from(table).where(where).$dynamic()
if (typeof limit === 'number' && limit > 0) {
chainedMethods.push({
args: [limit],
method: 'limit',
})
query = query.limit(limit)
}
if (orderBy) {
chainedMethods.push({
args: [() => orderBy.map(({ column, order }) => order(column))],
method: 'orderBy',
})
query = query.orderBy(() => orderBy.map(({ column, order }) => order(column)))
}
const docsToUpdate = await chainMethods({
methods: chainedMethods,
query,
})
const docsToUpdate = await query
idsToUpdate = docsToUpdate?.map((doc) => doc.id)
}

View File

@@ -41,9 +41,9 @@ export const updateOne: UpdateOne = async function updateOne(
// selectDistinct will only return if there are joins
const selectDistinctResult = await selectDistinct({
adapter: this,
chainedMethods: [{ args: [1], method: 'limit' }],
db,
joins,
query: ({ query }) => query.limit(1),
selectFields,
tableName,
where,

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/email-nodemailer",
"version": "3.30.0",
"version": "3.32.0",
"description": "Payload Nodemailer Email Adapter",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/email-resend",
"version": "3.30.0",
"version": "3.32.0",
"description": "Payload Resend Email Adapter",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/graphql",
"version": "3.30.0",
"version": "3.32.0",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",

View File

@@ -379,6 +379,8 @@ export const fieldToSchemaMap: FieldToSchemaMap = {
const { limit, page, sort, where } = args
const { req } = context
const draft = Boolean(args.draft ?? context.req.query?.draft)
const fullWhere = combineQueries(where, {
[field.on]: { equals: parent._id ?? parent.id },
})
@@ -390,6 +392,7 @@ export const fieldToSchemaMap: FieldToSchemaMap = {
return await req.payload.find({
collection,
depth: 0,
draft,
fallbackLocale: req.fallbackLocale,
limit,
locale: req.locale,

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview-react",
"version": "3.30.0",
"version": "3.32.0",
"description": "The official React SDK for Payload Live Preview",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview-vue",
"version": "3.30.0",
"version": "3.32.0",
"description": "The official Vue SDK for Payload Live Preview",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview",
"version": "3.30.0",
"version": "3.32.0",
"description": "The official live preview JavaScript SDK for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,9 +1,15 @@
import type { FieldSchemaJSON } from 'payload'
import type { LivePreviewMessageEvent } from './types.js'
import { isLivePreviewEvent } from './isLivePreviewEvent.js'
import { mergeData } from './mergeData.js'
const _payloadLivePreview = {
const _payloadLivePreview: {
fieldSchema: FieldSchemaJSON | undefined
// eslint-disable-next-line @typescript-eslint/no-explicit-any
previousData: any
} = {
/**
* For performance reasons, `fieldSchemaJSON` will only be sent once on the initial message
* We need to cache this value so that it can be used across subsequent messages
@@ -18,7 +24,7 @@ const _payloadLivePreview = {
previousData: undefined,
}
export const handleMessage = async <T>(args: {
export const handleMessage = async <T extends Record<string, any>>(args: {
apiRoute?: string
depth?: number
event: LivePreviewMessageEvent<T>

View File

@@ -4,7 +4,15 @@ import type { PopulationsByCollection } from './types.js'
import { traverseFields } from './traverseFields.js'
const defaultRequestHandler = ({ apiPath, endpoint, serverURL }) => {
const defaultRequestHandler = ({
apiPath,
endpoint,
serverURL,
}: {
apiPath: string
endpoint: string
serverURL: string
}) => {
const url = `${serverURL}${apiPath}/${endpoint}`
return fetch(url, {
credentials: 'include',
@@ -19,7 +27,7 @@ const defaultRequestHandler = ({ apiPath, endpoint, serverURL }) => {
// Instead, we keep track of the old locale ourselves and trigger a re-population when it changes
let prevLocale: string | undefined
export const mergeData = async <T>(args: {
export const mergeData = async <T extends Record<string, any>>(args: {
apiRoute?: string
collectionPopulationRequestHandler?: ({
apiPath,
@@ -86,7 +94,7 @@ export const mergeData = async <T>(args: {
if (res?.docs?.length > 0) {
res.docs.forEach((doc) => {
populationsByCollection[collection].forEach((population) => {
populationsByCollection[collection]?.forEach((population) => {
if (population.id === doc.id) {
population.ref[population.accessor] = doc
}

View File

@@ -1,6 +1,6 @@
import { handleMessage } from './handleMessage.js'
export const subscribe = <T>(args: {
export const subscribe = <T extends Record<string, any>>(args: {
apiRoute?: string
callback: (data: T) => void
depth?: number

View File

@@ -1,17 +1,16 @@
import type { DocumentEvent } from 'payload'
import type { fieldSchemaToJSON } from 'payload/shared'
import type { DocumentEvent, FieldSchemaJSON } from 'payload'
import type { PopulationsByCollection } from './types.js'
import { traverseRichText } from './traverseRichText.js'
export const traverseFields = <T>(args: {
export const traverseFields = <T extends Record<string, any>>(args: {
externallyUpdatedRelationship?: DocumentEvent
fieldSchema: ReturnType<typeof fieldSchemaToJSON>
fieldSchema: FieldSchemaJSON
incomingData: T
localeChanged: boolean
populationsByCollection: PopulationsByCollection
result: T
result: Record<string, any>
}): void => {
const {
externallyUpdatedRelationship,
@@ -48,7 +47,7 @@ export const traverseFields = <T>(args: {
traverseFields({
externallyUpdatedRelationship,
fieldSchema: fieldSchema.fields,
fieldSchema: fieldSchema.fields!,
incomingData: incomingRow,
localeChanged,
populationsByCollection,
@@ -64,7 +63,7 @@ export const traverseFields = <T>(args: {
case 'blocks':
if (Array.isArray(incomingData[fieldName])) {
result[fieldName] = incomingData[fieldName].map((incomingBlock, i) => {
const incomingBlockJSON = fieldSchema.blocks[incomingBlock.blockType]
const incomingBlockJSON = fieldSchema.blocks?.[incomingBlock.blockType]
if (!result[fieldName]) {
result[fieldName] = []
@@ -82,7 +81,7 @@ export const traverseFields = <T>(args: {
traverseFields({
externallyUpdatedRelationship,
fieldSchema: incomingBlockJSON.fields,
fieldSchema: incomingBlockJSON!.fields!,
incomingData: incomingBlock,
localeChanged,
populationsByCollection,
@@ -106,7 +105,7 @@ export const traverseFields = <T>(args: {
traverseFields({
externallyUpdatedRelationship,
fieldSchema: fieldSchema.fields,
fieldSchema: fieldSchema.fields!,
incomingData: incomingData[fieldName] || {},
localeChanged,
populationsByCollection,
@@ -166,11 +165,11 @@ export const traverseFields = <T>(args: {
incomingRelation === externallyUpdatedRelationship?.id
if (hasChanged || hasUpdated || localeChanged) {
if (!populationsByCollection[fieldSchema.relationTo]) {
populationsByCollection[fieldSchema.relationTo] = []
if (!populationsByCollection[fieldSchema.relationTo!]) {
populationsByCollection[fieldSchema.relationTo!] = []
}
populationsByCollection[fieldSchema.relationTo].push({
populationsByCollection[fieldSchema.relationTo!]?.push({
id: incomingRelation,
accessor: i,
ref: result[fieldName],
@@ -265,11 +264,11 @@ export const traverseFields = <T>(args: {
// if the new value is not empty, populate it
// otherwise set the value to null
if (newID) {
if (!populationsByCollection[fieldSchema.relationTo]) {
populationsByCollection[fieldSchema.relationTo] = []
if (!populationsByCollection[fieldSchema.relationTo!]) {
populationsByCollection[fieldSchema.relationTo!] = []
}
populationsByCollection[fieldSchema.relationTo].push({
populationsByCollection[fieldSchema.relationTo!]?.push({
id: newID,
accessor: fieldName,
ref: result as Record<string, unknown>,

View File

@@ -79,7 +79,7 @@ export const traverseRichText = ({
populationsByCollection[incomingData.relationTo] = []
}
populationsByCollection[incomingData.relationTo].push({
populationsByCollection[incomingData.relationTo]?.push({
id:
incomingData[key] && typeof incomingData[key] === 'object'
? incomingData[key].id

View File

@@ -1,9 +1,4 @@
{
"extends": "../../tsconfig.base.json",
"compilerOptions": {
/* TODO: remove the following lines */
"strict": false,
"noUncheckedIndexedAccess": false,
},
"references": [{ "path": "../payload" }]
}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/next",
"version": "3.30.0",
"version": "3.32.0",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",

View File

@@ -0,0 +1,55 @@
import { getSafeRedirect } from './getSafeRedirect'
const fallback = '/admin' // default fallback if the input is unsafe or invalid
describe('getSafeRedirect', () => {
// Valid - safe redirect paths
it.each([['/dashboard'], ['/admin/settings'], ['/projects?id=123'], ['/hello-world']])(
'should allow safe relative path: %s',
(input) => {
// If the input is a clean relative path, it should be returned as-is
expect(getSafeRedirect(input, fallback)).toBe(input)
},
)
// Invalid types or empty inputs
it.each(['', null, undefined, 123, {}, []])(
'should fallback on invalid or non-string input: %s',
(input) => {
// If the input is not a valid string, it should return the fallback
expect(getSafeRedirect(input as any, fallback)).toBe(fallback)
},
)
// Unsafe redirect patterns
it.each([
'//example.com', // protocol-relative URL
'/javascript:alert(1)', // JavaScript scheme
'/JavaScript:alert(1)', // case-insensitive JavaScript
'/http://unknown.com', // disguised external redirect
'/https://unknown.com', // disguised external redirect
'/%2Funknown.com', // encoded slash — could resolve to //
'/\\/unknown.com', // escaped slash
'/\\\\unknown.com', // double escaped slashes
'/\\unknown.com', // single escaped slash
'%2F%2Funknown.com', // fully encoded protocol-relative path
'%2Fjavascript:alert(1)', // encoded JavaScript scheme
])('should block unsafe redirect: %s', (input) => {
// All of these should return the fallback because they're unsafe
expect(getSafeRedirect(input, fallback)).toBe(fallback)
})
// Input with extra spaces should still be properly handled
it('should trim whitespace before evaluating', () => {
// A valid path with surrounding spaces should still be accepted
expect(getSafeRedirect(' /dashboard ', fallback)).toBe('/dashboard')
// An unsafe path with spaces should still be rejected
expect(getSafeRedirect(' //example.com ', fallback)).toBe(fallback)
})
// If decoding the input fails (e.g., invalid percent encoding), it should not crash
it('should return fallback on invalid encoding', () => {
expect(getSafeRedirect('%E0%A4%A', fallback)).toBe(fallback)
})
})

View File

@@ -6,14 +6,25 @@ export const getSafeRedirect = (
return fallback
}
// Ensures that any leading or trailing whitespace doesn't affect the checks
const redirectPath = redirectParam.trim()
// Normalize and decode the path
let redirectPath: string
try {
redirectPath = decodeURIComponent(redirectParam.trim())
} catch {
return fallback // invalid encoding
}
const isSafeRedirect =
// Must start with a single forward slash (e.g., "/admin")
redirectPath.startsWith('/') &&
// Prevent protocol-relative URLs (e.g., "//evil.com")
// Prevent protocol-relative URLs (e.g., "//example.com")
!redirectPath.startsWith('//') &&
// Prevent encoded slashes that could resolve to protocol-relative
!redirectPath.startsWith('/%2F') &&
// Prevent backslash-based escape attempts (e.g., "/\\/example.com", "/\\\\example.com", "/\\example.com")
!redirectPath.startsWith('/\\/') &&
!redirectPath.startsWith('/\\\\') &&
!redirectPath.startsWith('/\\') &&
// Prevent javascript-based schemes (e.g., "/javascript:alert(1)")
!redirectPath.toLowerCase().startsWith('/javascript:') &&
// Prevent attempts to redirect to full URLs using "/http:" or "/https:"
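
A usage sketch; the `redirect` query param name here is an assumption for illustration:

const redirectTo = getSafeRedirect(searchParams.redirect as string, '/admin')
// redirectTo is the sanitized relative path, or '/admin' if the input was unsafe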

View File

@@ -195,6 +195,7 @@ export const renderListView = async (
drawerSlug,
enableRowSelections,
i18n: req.i18n,
orderableFieldName: collectionConfig.orderable === true ? '_order' : undefined,
payload,
useAsTitle: collectionConfig.admin.useAsTitle,
})
@@ -259,6 +260,7 @@ export const renderListView = async (
defaultSort={sort}
listPreferences={listPreferences}
modifySearchParams={!isInDrawer}
orderableFieldName={collectionConfig.orderable === true ? '_order' : undefined}
>
{RenderServerComponent({
clientProps: {

View File

@@ -83,10 +83,10 @@ export const DefaultVersionView: React.FC<DefaultVersionsViewProps> = ({
current.set('localeCodes', JSON.stringify(selectedLocales.map((locale) => locale.value)))
}
if (!modifiedOnly) {
current.delete('modifiedOnly')
if (modifiedOnly === false) {
current.set('modifiedOnly', 'false')
} else {
current.set('modifiedOnly', 'true')
current.delete('modifiedOnly')
}
const search = current.toString()

View File

@@ -40,7 +40,7 @@ export async function VersionView(props: DocumentViewServerProps) {
const comparisonVersionIDFromParams: string = searchParams.compareValue as string
const modifiedOnly: boolean = searchParams.modifiedOnly === 'true'
const modifiedOnly: boolean = searchParams.modifiedOnly === 'false' ? false : true
const { localization } = config

View File

@@ -193,6 +193,7 @@ export async function VersionsView(props: DocumentViewServerProps) {
defaultLimit={limitToUse}
defaultSort={sort as string}
modifySearchParams
orderableFieldName={collectionConfig?.orderable === true ? '_order' : undefined}
>
<VersionsViewClient
baseClass={baseClass}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/payload-cloud",
"version": "3.30.0",
"version": "3.32.0",
"description": "The official Payload Cloud plugin",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "payload",
"version": "3.30.0",
"version": "3.32.0",
"description": "Node, React, Headless CMS and Application Framework built on Next.js",
"keywords": [
"admin panel",

View File

@@ -49,6 +49,18 @@ export type FieldState = {
passesCondition?: boolean
requiresRender?: boolean
rows?: Row[]
/**
* The `serverPropsToIgnore` object is used to prevent specific properties from being overridden across form state requests.
* This can happen when a form state request with `requiresRender: true` is queued while another is already processing.
* For example:
* 1. One "add row" action will set `requiresRender: true` and dispatch a form state request
* 2. Another "add row" action will set `requiresRender: true` and queue a form state request
* 3. The first request will return with `requiresRender: false`
* 4. The second request will be dispatched with `requiresRender: false` but should be `true`
* To fix this, only merge the `requiresRender` property if the previous state has not set it to `true`.
* See the `mergeServerFormState` function for implementation details.
*/
serverPropsToIgnore?: Array<keyof FieldState>
valid?: boolean
validate?: Validate
value?: unknown
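
A minimal sketch of the guard described in the comment above, assuming the merge simply skips any key listed in `serverPropsToIgnore` (see `mergeServerFormState` for the actual implementation):

// clientFieldState, serverFieldState, and mergedFieldState are all FieldState objects
for (const key of Object.keys(serverFieldState) as Array<keyof FieldState>) {
  if (clientFieldState.serverPropsToIgnore?.includes(key)) {
    continue // keep the client's value for ignored props, e.g. requiresRender
  }
  ;(mergedFieldState as Record<string, unknown>)[key] = serverFieldState[key]
}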

View File

@@ -60,6 +60,7 @@ export type BuildTableStateArgs = {
columns?: ColumnPreference[]
docs?: PaginatedDocs['docs']
enableRowSelections?: boolean
orderableFieldName: string
parent?: {
collectionSlug: CollectionSlug
id: number | string

View File

@@ -1,5 +1,5 @@
/* eslint-disable no-console */
import fs from 'fs'
import fs from 'fs/promises'
import process from 'node:process'
import type { PayloadComponent, SanitizedConfig } from '../../config/types.js'
@@ -147,7 +147,7 @@ ${mapKeys.join(',\n')}
if (!force) {
// Read the current import map and check the IMPORTS for any new imports. If there are none, don't write the file.
const currentImportMap = await fs.promises.readFile(importMapFilePath, 'utf-8')
const currentImportMap = await fs.readFile(importMapFilePath, 'utf-8')
if (currentImportMap?.trim() === importMapOutputFile?.trim()) {
if (log) {
@@ -161,5 +161,5 @@ ${mapKeys.join(',\n')}
console.log('Writing import map to', importMapFilePath)
}
await fs.promises.writeFile(importMapFilePath, importMapOutputFile)
await fs.writeFile(importMapFilePath, importMapOutputFile)
}

View File

@@ -95,8 +95,7 @@ export function iterateConfig({
}
if (config?.admin?.dependencies) {
for (const key in config.admin.dependencies) {
const dependency = config.admin.dependencies[key]
for (const dependency of Object.values(config.admin.dependencies)) {
addToImportMap(dependency.path)
}
}

View File

@@ -1,7 +1,7 @@
import type { AcceptedLanguages } from '@payloadcms/translations'
import { initI18n } from '@payloadcms/translations'
import fs from 'fs'
import fs from 'fs/promises'
import { compile } from 'json-schema-to-typescript'
import type { SanitizedConfig } from '../config/types.js'
@@ -58,7 +58,7 @@ export async function generateTypes(
// Diff the compiled types against the existing types file
try {
const existingTypes = fs.readFileSync(outputFile, 'utf-8')
const existingTypes = await fs.readFile(outputFile, 'utf-8')
if (compiled === existingTypes) {
return
@@ -67,7 +67,7 @@ export async function generateTypes(
// swallow err
}
fs.writeFileSync(outputFile, compiled)
await fs.writeFile(outputFile, compiled)
if (shouldLog) {
logger.info(`Types written to ${outputFile}`)
}

View File

@@ -1,4 +1,5 @@
// @ts-strict-ignore
import type { Config, SanitizedConfig } from '../../config/types.js'
import type {
CollectionConfig,

View File

@@ -507,6 +507,17 @@ export type CollectionConfig<TSlug extends CollectionSlug = any> = {
duration: number
}
| false
/**
* If true, enables custom ordering for the collection, and documents in the List View can be reordered via drag and drop.
* New documents are inserted at the end of the list by default.
*
* Under the hood, a field with {@link https://observablehq.com/@dgreensp/implementing-fractional-indexing|fractional indexing} is used to optimize inserts and reorderings.
*
* @default false
*
* @experimental There may be frequent breaking changes to this API
*/
orderable?: boolean
slug: string
/**
* Add `createdAt` and `updatedAt` fields
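
Enabling the option on a collection; a sketch where 'posts' is a placeholder slug:

import type { CollectionConfig } from 'payload'

export const Posts: CollectionConfig = {
  slug: 'posts',
  orderable: true, // adds the hidden _order field and drag-and-drop reordering
  fields: [{ name: 'title', type: 'text' }],
}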

View File

@@ -56,7 +56,8 @@ export type Options<TSlug extends CollectionSlug, TSelect extends SelectType> =
*/
disableVerificationEmail?: boolean
/**
* Create a **draft** document. [More](https://payloadcms.com/docs/versions/drafts#draft-api)
* @deprecated This property has no effect on the published status of the created document; it only controls whether validation runs. To control the draft status of the document, pass `_status: 'draft'` or `_status: 'published'` in the data object.
* By default, draft-enabled collections will create documents with `_status: 'draft'`.
*/
draft?: boolean
/**
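
Per the deprecation note, controlling publish status explicitly through `_status`; a sketch with a placeholder collection:

await payload.create({
  collection: 'posts',
  data: {
    title: 'Hello',
    _status: 'published', // or 'draft'; draft-enabled collections default to 'draft'
  },
})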

View File

@@ -0,0 +1,318 @@
// @ts-check
/**
* THIS FILE IS COPIED FROM:
* https://github.com/rocicorp/fractional-indexing/blob/main/src/index.js
*
* I AM NOT INSTALLING THAT LIBRARY BECAUSE JEST COMPLAINS ABOUT THE ESM MODULE AND THE TESTS FAIL.
* DO NOT MODIFY IT
*/
// License: CC0 (no rights reserved).
// This is based on https://observablehq.com/@dgreensp/implementing-fractional-indexing
export const BASE_62_DIGITS = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz'
// `a` may be empty string, `b` is null or non-empty string.
// `a < b` lexicographically if `b` is non-null.
// no trailing zeros allowed.
// digits is a string such as '0123456789' for base 10. Digits must be in
// ascending character code order!
/**
* @param {string} a
* @param {string | null | undefined} b
* @param {string} digits
* @returns {string}
*/
function midpoint(a, b, digits) {
const zero = digits[0]
if (b != null && a >= b) {
throw new Error(a + ' >= ' + b)
}
if (a.slice(-1) === zero || (b && b.slice(-1) === zero)) {
throw new Error('trailing zero')
}
if (b) {
// remove longest common prefix. pad `a` with 0s as we
// go. note that we don't need to pad `b`, because it can't
// end before `a` while traversing the common prefix.
let n = 0
while ((a[n] || zero) === b[n]) {
n++
}
if (n > 0) {
return b.slice(0, n) + midpoint(a.slice(n), b.slice(n), digits)
}
}
// first digits (or lack of digit) are different
const digitA = a ? digits.indexOf(a[0]) : 0
const digitB = b != null ? digits.indexOf(b[0]) : digits.length
if (digitB - digitA > 1) {
const midDigit = Math.round(0.5 * (digitA + digitB))
return digits[midDigit]
} else {
// first digits are consecutive
if (b && b.length > 1) {
return b.slice(0, 1)
} else {
// `b` is null or has length 1 (a single digit).
// the first digit of `a` is the previous digit to `b`,
// or 9 if `b` is null.
// given, for example, midpoint('49', '5'), return
// '4' + midpoint('9', null), which will become
// '4' + '9' + midpoint('', null), which is '495'
return digits[digitA] + midpoint(a.slice(1), null, digits)
}
}
}
/**
* @param {string} int
* @return {void}
*/
function validateInteger(int) {
if (int.length !== getIntegerLength(int[0])) {
throw new Error('invalid integer part of order key: ' + int)
}
}
/**
* @param {string} head
* @return {number}
*/
function getIntegerLength(head) {
if (head >= 'a' && head <= 'z') {
return head.charCodeAt(0) - 'a'.charCodeAt(0) + 2
} else if (head >= 'A' && head <= 'Z') {
return 'Z'.charCodeAt(0) - head.charCodeAt(0) + 2
} else {
throw new Error('invalid order key head: ' + head)
}
}
/**
* @param {string} key
* @return {string}
*/
function getIntegerPart(key) {
const integerPartLength = getIntegerLength(key[0])
if (integerPartLength > key.length) {
throw new Error('invalid order key: ' + key)
}
return key.slice(0, integerPartLength)
}
/**
* @param {string} key
* @param {string} digits
* @return {void}
*/
function validateOrderKey(key, digits) {
if (key === 'A' + digits[0].repeat(26)) {
throw new Error('invalid order key: ' + key)
}
// getIntegerPart will throw if the first character is bad,
// or the key is too short. we'd call it to check these things
// even if we didn't need the result
const i = getIntegerPart(key)
const f = key.slice(i.length)
if (f.slice(-1) === digits[0]) {
throw new Error('invalid order key: ' + key)
}
}
// note that this may return null, as there is a largest integer
/**
* @param {string} x
* @param {string} digits
* @return {string | null}
*/
function incrementInteger(x, digits) {
validateInteger(x)
const [head, ...digs] = x.split('')
let carry = true
for (let i = digs.length - 1; carry && i >= 0; i--) {
const d = digits.indexOf(digs[i]) + 1
if (d === digits.length) {
digs[i] = digits[0]
} else {
digs[i] = digits[d]
carry = false
}
}
if (carry) {
if (head === 'Z') {
return 'a' + digits[0]
}
if (head === 'z') {
return null
}
const h = String.fromCharCode(head.charCodeAt(0) + 1)
if (h > 'a') {
digs.push(digits[0])
} else {
digs.pop()
}
return h + digs.join('')
} else {
return head + digs.join('')
}
}
// note that this may return null, as there is a smallest integer
/**
* @param {string} x
* @param {string} digits
* @return {string | null}
*/
function decrementInteger(x, digits) {
validateInteger(x)
const [head, ...digs] = x.split('')
let borrow = true
for (let i = digs.length - 1; borrow && i >= 0; i--) {
const d = digits.indexOf(digs[i]) - 1
if (d === -1) {
digs[i] = digits.slice(-1)
} else {
digs[i] = digits[d]
borrow = false
}
}
if (borrow) {
if (head === 'a') {
return 'Z' + digits.slice(-1)
}
if (head === 'A') {
return null
}
const h = String.fromCharCode(head.charCodeAt(0) - 1)
if (h < 'Z') {
digs.push(digits.slice(-1))
} else {
digs.pop()
}
return h + digs.join('')
} else {
return head + digs.join('')
}
}
// `a` is an order key or null (START).
// `b` is an order key or null (END).
// `a < b` lexicographically if both are non-null.
// digits is a string such as '0123456789' for base 10. Digits must be in
// ascending character code order!
/**
* @param {string | null | undefined} a
* @param {string | null | undefined} b
* @param {string=} digits
* @return {string}
*/
export function generateKeyBetween(a, b, digits = BASE_62_DIGITS) {
if (a != null) {
validateOrderKey(a, digits)
}
if (b != null) {
validateOrderKey(b, digits)
}
if (a != null && b != null && a >= b) {
throw new Error(a + ' >= ' + b)
}
if (a == null) {
if (b == null) {
return 'a' + digits[0]
}
const ib = getIntegerPart(b)
const fb = b.slice(ib.length)
if (ib === 'A' + digits[0].repeat(26)) {
return ib + midpoint('', fb, digits)
}
if (ib < b) {
return ib
}
const res = decrementInteger(ib, digits)
if (res == null) {
throw new Error('cannot decrement any more')
}
return res
}
if (b == null) {
const ia = getIntegerPart(a)
const fa = a.slice(ia.length)
const i = incrementInteger(ia, digits)
return i == null ? ia + midpoint(fa, null, digits) : i
}
const ia = getIntegerPart(a)
const fa = a.slice(ia.length)
const ib = getIntegerPart(b)
const fb = b.slice(ib.length)
if (ia === ib) {
return ia + midpoint(fa, fb, digits)
}
const i = incrementInteger(ia, digits)
if (i == null) {
throw new Error('cannot increment any more')
}
if (i < b) {
return i
}
return ia + midpoint(fa, null, digits)
}
/**
* same preconditions as generateKeyBetween.
* n >= 0.
* Returns an array of n distinct keys in sorted order.
* If a and b are both null, returns [a0, a1, ...]
* If one or the other is null, returns consecutive "integer"
* keys. Otherwise, returns relatively short keys between
* a and b.
* @param {string | null | undefined} a
* @param {string | null | undefined} b
* @param {number} n
* @param {string} digits
* @return {string[]}
*/
export function generateNKeysBetween(a, b, n, digits = BASE_62_DIGITS) {
if (n === 0) {
return []
}
if (n === 1) {
return [generateKeyBetween(a, b, digits)]
}
if (b == null) {
let c = generateKeyBetween(a, b, digits)
const result = [c]
for (let i = 0; i < n - 1; i++) {
c = generateKeyBetween(c, b, digits)
result.push(c)
}
return result
}
if (a == null) {
let c = generateKeyBetween(a, b, digits)
const result = [c]
for (let i = 0; i < n - 1; i++) {
c = generateKeyBetween(a, c, digits)
result.push(c)
}
result.reverse()
return result
}
const mid = Math.floor(n / 2)
const c = generateKeyBetween(a, b, digits)
return [
...generateNKeysBetween(a, c, mid, digits),
c,
...generateNKeysBetween(c, b, n - mid - 1, digits),
]
}
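
Example outputs, matching the upstream fractional-indexing README:

generateKeyBetween(null, null) // 'a0' (first key)
generateKeyBetween('a0', null) // 'a1' (a key after 'a0')
generateKeyBetween(null, 'a0') // 'Zz' (a key before 'a0')
generateKeyBetween('a0', 'a1') // 'a0V' (a key between 'a0' and 'a1')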

View File

@@ -0,0 +1,278 @@
import type { BeforeChangeHook, CollectionConfig } from '../../collections/config/types.js'
import type { Field } from '../../fields/config/types.js'
import type { Endpoint, PayloadHandler, SanitizedConfig } from '../types.js'
import executeAccess from '../../auth/executeAccess.js'
import { traverseFields } from '../../utilities/traverseFields.js'
import { generateKeyBetween, generateNKeysBetween } from './fractional-indexing.js'
/**
* This function creates:
* - N fields per collection, named `_order` or `_<collection>_<joinField>_order`
* - 1 hook per collection
* - 1 endpoint per app
*
* Also, if collection.defaultSort or joinField.defaultSort is not set, it will be set to the orderable field.
*/
export const setupOrderable = (config: SanitizedConfig) => {
const fieldsToAdd = new Map<CollectionConfig, string[]>()
config.collections.forEach((collection) => {
if (collection.orderable) {
const currentFields = fieldsToAdd.get(collection) || []
fieldsToAdd.set(collection, [...currentFields, '_order'])
collection.defaultSort = collection.defaultSort ?? '_order'
}
traverseFields({
callback: ({ field, parentRef, ref }) => {
if (field.type === 'array' || field.type === 'blocks') {
return false
}
if (field.type === 'group' || field.type === 'tab') {
// @ts-expect-error ref is untyped
const parentPrefix = parentRef?.prefix ? `${parentRef.prefix}_` : ''
// @ts-expect-error ref is untyped
ref.prefix = `${parentPrefix}${field.name}`
}
if (field.type === 'join' && field.orderable === true) {
if (Array.isArray(field.collection)) {
throw new Error('Orderable joins must target a single collection')
}
const relationshipCollection = config.collections.find((c) => c.slug === field.collection)
if (!relationshipCollection) {
return false
}
field.defaultSort = field.defaultSort ?? `_${field.collection}_${field.name}_order`
const currentFields = fieldsToAdd.get(relationshipCollection) || []
// @ts-expect-error ref is untyped
const prefix = parentRef?.prefix ? `${parentRef.prefix}_` : ''
fieldsToAdd.set(relationshipCollection, [
...currentFields,
`_${field.collection}_${prefix}${field.name}_order`,
])
}
},
fields: collection.fields,
})
})
Array.from(fieldsToAdd.entries()).forEach(([collection, orderableFields]) => {
addOrderableFieldsAndHook(collection, orderableFields)
})
if (fieldsToAdd.size > 0) {
addOrderableEndpoint(config)
}
}
export const addOrderableFieldsAndHook = (
collection: CollectionConfig,
orderableFieldNames: string[],
) => {
// 1. Add field
orderableFieldNames.forEach((orderableFieldName) => {
const orderField: Field = {
name: orderableFieldName,
type: 'text',
admin: {
disableBulkEdit: true,
disabled: true,
disableListColumn: true,
disableListFilter: true,
hidden: true,
readOnly: true,
},
index: true,
required: true,
// override the schema to make order fields optional for payload.create()
typescriptSchema: [
() => ({
type: 'string',
required: false,
}),
],
unique: true,
}
collection.fields.unshift(orderField)
})
// 2. Add hook
if (!collection.hooks) {
collection.hooks = {}
}
if (!collection.hooks.beforeChange) {
collection.hooks.beforeChange = []
}
const orderBeforeChangeHook: BeforeChangeHook = async ({ data, operation, req }) => {
// Only set _order on create, not on update (unless explicitly provided)
if (operation === 'create') {
for (const orderableFieldName of orderableFieldNames) {
if (!data[orderableFieldName]) {
const lastDoc = await req.payload.find({
collection: collection.slug,
depth: 0,
limit: 1,
pagination: false,
req,
select: { [orderableFieldName]: true },
sort: `-${orderableFieldName}`,
})
const lastOrderValue = lastDoc.docs[0]?.[orderableFieldName] || null
data[orderableFieldName] = generateKeyBetween(lastOrderValue, null)
}
}
}
return data
}
collection.hooks.beforeChange.push(orderBeforeChangeHook)
}
/**
* The body of the reorder endpoint.
* @internal
*/
export type OrderableEndpointBody = {
collectionSlug: string
docsToMove: string[]
newKeyWillBe: 'greater' | 'less'
orderableFieldName: string
target: {
id: string
key: string
}
}
export const addOrderableEndpoint = (config: SanitizedConfig) => {
  // 3. Add endpoint
  const reorderHandler: PayloadHandler = async (req) => {
    const body = (await req.json?.()) as OrderableEndpointBody
    const { collectionSlug, docsToMove, newKeyWillBe, orderableFieldName, target } = body

    if (!Array.isArray(docsToMove) || docsToMove.length === 0) {
      return new Response(JSON.stringify({ error: 'docsToMove must be a non-empty array' }), {
        headers: { 'Content-Type': 'application/json' },
        status: 400,
      })
    }

    if (
      typeof target !== 'object' ||
      typeof target.id !== 'string' ||
      typeof target.key !== 'string'
    ) {
      return new Response(JSON.stringify({ error: 'target must be an object with id and key' }), {
        headers: { 'Content-Type': 'application/json' },
        status: 400,
      })
    }

    if (newKeyWillBe !== 'greater' && newKeyWillBe !== 'less') {
      return new Response(JSON.stringify({ error: 'newKeyWillBe must be "greater" or "less"' }), {
        headers: { 'Content-Type': 'application/json' },
        status: 400,
      })
    }

    const collection = config.collections.find((c) => c.slug === collectionSlug)
    if (!collection) {
      return new Response(JSON.stringify({ error: `Collection ${collectionSlug} not found` }), {
        headers: { 'Content-Type': 'application/json' },
        status: 400,
      })
    }

    if (typeof orderableFieldName !== 'string') {
      return new Response(JSON.stringify({ error: 'orderableFieldName must be a string' }), {
        headers: { 'Content-Type': 'application/json' },
        status: 400,
      })
    }

    // Prevent reordering if the user doesn't have editing permissions
    if (collection.access?.update) {
      await executeAccess(
        {
          // Currently only one doc can be moved at a time. We should review this if we want to
          // allow multiple docs to be moved at once in the future.
          id: docsToMove[0],
          data: {},
          req,
        },
        collection.access.update,
      )
    }

    const targetId = target.id
    let targetKey = target.key

    // If targetKey is 'pending', we need to look up its current key. This can only happen
    // if the user reorders rows quickly over a slow connection.
    if (targetKey === 'pending') {
      const beforeDoc = await req.payload.findByID({
        id: targetId,
        collection: collection.slug,
        depth: 0,
        select: { [orderableFieldName]: true },
      })
      targetKey = beforeDoc?.[orderableFieldName] || null
    }

    // The endpoint does not receive the adjacent document's ID as an argument because there
    // are situations where the client cannot see or know it: access control restrictions,
    // the doc before the target being the last one on the page, etc.
    const adjacentDoc = await req.payload.find({
      collection: collection.slug,
      depth: 0,
      limit: 1,
      pagination: false,
      select: { [orderableFieldName]: true },
      sort: newKeyWillBe === 'greater' ? orderableFieldName : `-${orderableFieldName}`,
      where: {
        [orderableFieldName]: {
          [newKeyWillBe === 'greater' ? 'greater_than' : 'less_than']: targetKey,
        },
      },
    })

    const adjacentDocKey = adjacentDoc.docs?.[0]?.[orderableFieldName] || null

    // Currently N (= docsToMove.length) is always 1. Maybe in the future we will
    // allow dragging and reordering multiple documents at once via the UI.
    const orderValues =
      newKeyWillBe === 'greater'
        ? generateNKeysBetween(targetKey, adjacentDocKey, docsToMove.length)
        : generateNKeysBetween(adjacentDocKey, targetKey, docsToMove.length)

    // Update each document with its new order value
    for (const [index, id] of docsToMove.entries()) {
      await req.payload.update({
        id,
        collection: collection.slug,
        data: {
          [orderableFieldName]: orderValues[index],
        },
        depth: 0,
        req,
        select: { id: true },
      })
    }

    return new Response(JSON.stringify({ orderValues, success: true }), {
      headers: { 'Content-Type': 'application/json' },
      status: 200,
    })
  }

  const reorderEndpoint: Endpoint = {
    handler: reorderHandler,
    method: 'post',
    path: '/reorder',
  }

  if (!config.endpoints) {
    config.endpoints = []
  }
  config.endpoints.push(reorderEndpoint)
}

View File

@@ -36,6 +36,7 @@ import { getDefaultJobsCollection, jobsCollectionSlug } from '../queues/config/i
 import { flattenBlock } from '../utilities/flattenAllFields.js'
 import { getSchedulePublishTask } from '../versions/schedule/job.js'
 import { addDefaultsToConfig } from './defaults.js'
+import { setupOrderable } from './orderable/index.js'

 const sanitizeAdminConfig = (configToSanitize: Config): Partial<SanitizedConfig> => {
   const sanitizedConfig = { ...configToSanitize }
@@ -108,6 +109,9 @@ export const sanitizeConfig = async (incomingConfig: Config): Promise<SanitizedC
   const config: Partial<SanitizedConfig> = sanitizeAdminConfig(configWithDefaults)

+  // Add orderable fields
+  setupOrderable(config as SanitizedConfig)
+
   if (!config.endpoints) {
     config.endpoints = []
   }

View File

@@ -793,9 +793,7 @@ export type Config = {
   /** Global date format that will be used for all dates in the Admin panel. Any valid date-fns format pattern can be used. */
   dateFormat?: string
   /**
-   * Each entry in this map generates an entry in the importMap,
-   * as well as an entry in the componentMap if the type of the
-   * dependency is 'component'
+   * Each entry in this map generates an entry in the importMap.
    */
   dependencies?: AdminDependencies
   /**

View File

@@ -2,7 +2,6 @@ import type { BaseJob, DatabaseAdapter } from '../index.js'
 import type { UpdateJobs } from './types.js'

 import { jobsCollectionSlug } from '../queues/config/index.js'
-import { sanitizeUpdateData } from '../queues/utilities/sanitizeUpdateData.js'

 export const defaultUpdateJobs: UpdateJobs = async function updateMany(
   this: DatabaseAdapter,
@@ -42,7 +41,7 @@ export const defaultUpdateJobs: UpdateJobs = async function updateMany(
       const updatedJob = await this.updateOne({
         id: job.id,
         collection: jobsCollectionSlug,
-        data: sanitizeUpdateData({ data: updateData }),
+        data: updateData,
         req,
         returning,
       })

View File

@@ -642,6 +642,12 @@ export type DatabaseAdapterResult<T = BaseDatabaseAdapter> = {
   allowIDOnCreate?: boolean
   defaultIDType: 'number' | 'text'
   init: (args: { payload: Payload }) => T
+  /**
+   * The name of the database adapter. For example, "postgres" or "mongoose".
+   *
+   * @todo make required in 4.0
+   */
+  name?: string
 }

 export type DBIdentifierName =
export type DBIdentifierName =

View File

@@ -1549,6 +1549,17 @@ export type JoinField = {
    * A string for the field in the collection being joined to.
    */
   on: string
+  /**
+   * If true, enables custom ordering for the joined collection, and joined documents can be reordered via drag and drop.
+   * New documents are inserted at the end of the list.
+   *
+   * Under the hood, a field with {@link https://observablehq.com/@dgreensp/implementing-fractional-indexing|fractional indexing} is used to optimize inserts and reorderings.
+   *
+   * @default false
+   *
+   * @experimental There may be frequent breaking changes to this API
+   */
+  orderable?: boolean
   sanitizedMany?: JoinField[]
   type: 'join'
   validate?: never
@@ -1562,7 +1573,15 @@ export type JoinFieldClient = {
 } & { targetField: Pick<RelationshipFieldClient, 'relationTo'> } & FieldBaseClient &
   Pick<
     JoinField,
-    'collection' | 'defaultLimit' | 'defaultSort' | 'index' | 'maxDepth' | 'on' | 'type' | 'where'
+    | 'collection'
+    | 'defaultLimit'
+    | 'defaultSort'
+    | 'index'
+    | 'maxDepth'
+    | 'on'
+    | 'orderable'
+    | 'type'
+    | 'where'
   >

 export type FlattenedBlock = {
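For illustration, a hedged sketch of enabling this on a join field (the slugs and field names are hypothetical). Per `setupOrderable` above, this would add a hidden `_posts_relatedPosts_order` field to the joined `posts` collection:

import type { CollectionConfig } from 'payload'

export const Categories: CollectionConfig = {
  slug: 'categories', // hypothetical
  fields: [
    {
      name: 'relatedPosts',
      type: 'join',
      collection: 'posts', // must be a single collection, not an array
      on: 'category',
      orderable: true, // joined docs become drag-and-drop reorderable
    },
  ],
}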

View File

@@ -65,8 +65,11 @@ import type {
 } from './types/index.js'
 import type { TraverseFieldsCallback } from './utilities/traverseFields.js'

 export type * from './admin/types.js'

+import type { SupportedLanguages } from '@payloadcms/translations'
 import { Cron } from 'croner'

+import type { ClientConfig } from './config/client.js'
 import type { TypeWithVersion } from './versions/types.js'
 import { decrypt, encrypt } from './auth/crypto.js'
@@ -865,10 +868,12 @@ export const reload = async (
   }

   await payload.db.init()

   if (payload.db.connect) {
     await payload.db.connect({ hotReload: true })
   }

   global._payload_clientConfig = null
+  global._payload_clientConfigs = {} as Record<keyof SupportedLanguages, ClientConfig>
   global._payload_schemaMap = null
   global._payload_clientSchemaMap = null
   global._payload_doNotCacheClientConfig = true // This will help refreshing the client config cache more reliably. If you remove this, please test HMR + client config refreshing (do new fields appear in the document?)
@@ -1085,6 +1090,7 @@ export {
 } from './config/client.js'
 export { defaults } from './config/defaults.js'
+export { type OrderableEndpointBody } from './config/orderable/index.js'
 export { sanitizeConfig } from './config/sanitize.js'
 export type * from './config/types.js'
 export { combineQueries } from './database/combineQueries.js'

View File

@@ -1,5 +1,6 @@
 import type { CollectionConfig } from '../../../index.js'
-import type { Payload, PayloadRequest } from '../../../types/index.js'
+import type { Payload, PayloadRequest, Sort } from '../../../types/index.js'
+import type { RunJobsArgs } from '../../operations/runJobs/index.js'
 import type { TaskConfig } from './taskTypes.js'
 import type { WorkflowConfig } from './workflowTypes.js'
@@ -80,6 +81,22 @@ export type JobsConfig = {
    * a new collection.
    */
   jobsCollectionOverrides?: (args: { defaultJobsCollection: CollectionConfig }) => CollectionConfig
+  /**
+   * Adjust the job processing order using a Payload sort string. This can be set globally or per queue.
+   *
+   * FIFO would equal `createdAt` and LIFO would equal `-createdAt`.
+   *
+   * @default all jobs for all queues will be executed in FIFO order.
+   */
+  processingOrder?:
+    | ((args: RunJobsArgs) => Promise<Sort> | Sort)
+    | {
+        default?: Sort
+        queues: {
+          [queue: string]: Sort
+        }
+      }
+    | Sort
   /**
    * By default, the job system uses direct database calls for optimal performance.
    * If you added custom hooks to your jobs collection, you can set this to true to
View File

@@ -18,7 +18,7 @@ export type JobLog = {
   /**
    * ID added by the array field when the log is saved in the database
    */
-  id?: string
+  id: string
   input?: Record<string, any>
   output?: Record<string, any>
   /**

View File

@@ -5,6 +5,7 @@ import {
   type Payload,
   type PayloadRequest,
   type RunningJob,
+  type Sort,
   type TypedJobs,
   type Where,
 } from '../index.js'
@@ -99,8 +100,19 @@ export const getJobsLocalAPI = (payload: Payload) => ({
   run: async (args?: {
     limit?: number
     overrideAccess?: boolean
+    /**
+     * Adjust the job processing order using a Payload sort string.
+     *
+     * FIFO would equal `createdAt` and LIFO would equal `-createdAt`.
+     */
+    processingOrder?: Sort
     queue?: string
     req?: PayloadRequest
+    /**
+     * By default, jobs are run in parallel.
+     * If you want to run them in sequence, set this to true.
+     */
+    sequential?: boolean
     where?: Where
   }): Promise<ReturnType<typeof runJobs>> => {
     const newReq: PayloadRequest = args?.req ?? (await createLocalReq({}, payload))
@@ -108,8 +120,10 @@ export const getJobsLocalAPI = (payload: Payload) => ({
     return await runJobs({
       limit: args?.limit,
       overrideAccess: args?.overrideAccess !== false,
+      processingOrder: args?.processingOrder,
      queue: args?.queue,
       req: newReq,
+      sequential: args?.sequential,
       where: args?.where,
     })
   },
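Putting the new options together, a hypothetical Local API invocation (assuming an initialized `payload` instance; the queue name is made up):

await payload.jobs.run({
  queue: 'nightly', // hypothetical queue
  processingOrder: '-createdAt', // LIFO for this run only
  sequential: true, // run matched jobs one at a time instead of in parallel
})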

View File

@@ -1,6 +1,5 @@
 // @ts-strict-ignore
-import type { PaginatedDocs } from '../../../database/types.js'
-import type { PayloadRequest, Where } from '../../../types/index.js'
+import type { PayloadRequest, Sort, Where } from '../../../types/index.js'
 import type { WorkflowJSON } from '../../config/types/workflowJSONTypes.js'
 import type {
   BaseJob,
@@ -26,8 +25,21 @@ export type RunJobsArgs = {
   id?: number | string
   limit?: number
   overrideAccess?: boolean
+  /**
+   * Adjust the job processing order
+   *
+   * FIFO would equal `createdAt` and LIFO would equal `-createdAt`.
+   *
+   * @default all jobs for all queues will be executed in FIFO order.
+   */
+  processingOrder?: Sort
   queue?: string
   req: PayloadRequest
+  /**
+   * By default, jobs are run in parallel.
+   * If you want to run them in sequence, set this to true.
+   */
+  sequential?: boolean
   where?: Where
 }
@@ -43,14 +55,18 @@ export type RunJobsResult = {
   remainingJobsFromQueried: number
 }

-export const runJobs = async ({
-  id,
-  limit = 10,
-  overrideAccess,
-  queue,
-  req,
-  where: whereFromProps,
-}: RunJobsArgs): Promise<RunJobsResult> => {
+export const runJobs = async (args: RunJobsArgs): Promise<RunJobsResult> => {
+  const {
+    id,
+    limit = 10,
+    overrideAccess,
+    processingOrder,
+    queue,
+    req,
+    sequential,
+    where: whereFromProps,
+  } = args
+
   if (!overrideAccess) {
     const hasAccess = await req.payload.config.jobs.access.run({ req })
     if (!hasAccess) {
@@ -124,6 +140,21 @@ export const runJobs = async ({
       }),
     ]
   } else {
+    let defaultProcessingOrder: Sort =
+      req.payload.collections[jobsCollectionSlug].config.defaultSort ?? 'createdAt'
+
+    const processingOrderConfig = req.payload.config.jobs?.processingOrder
+    if (typeof processingOrderConfig === 'function') {
+      defaultProcessingOrder = await processingOrderConfig(args)
+    } else if (typeof processingOrderConfig === 'object' && !Array.isArray(processingOrderConfig)) {
+      if (queue && processingOrderConfig.queues && processingOrderConfig.queues[queue]) {
+        defaultProcessingOrder = processingOrderConfig.queues[queue]
+      } else if (processingOrderConfig.default) {
+        defaultProcessingOrder = processingOrderConfig.default
+      }
+    } else if (typeof processingOrderConfig === 'string') {
+      defaultProcessingOrder = processingOrderConfig
+    }
+
     const updatedDocs = await updateJobs({
       data: {
         processing: true,
@@ -133,6 +164,7 @@ export const runJobs = async ({
       limit,
       req,
       returning: true,
+      sort: processingOrder ?? defaultProcessingOrder,
       where,
     })
@@ -175,7 +207,7 @@ export const runJobs = async ({
         ? []
         : undefined

-  const jobPromises = jobsQuery.docs.map(async (job) => {
+  const runSingleJob = async (job) => {
     if (!job.workflowSlug && !job.taskSlug) {
       throw new Error('Job must have either a workflowSlug or a taskSlug')
     }
@@ -257,9 +289,20 @@ export const runJobs = async ({
       return { id: job.id, result }
     }
-  })
+  }

-  const resultsArray = await Promise.all(jobPromises)
+  let resultsArray: { id: number | string; result: RunJobResult }[] = []
+  if (sequential) {
+    for (const job of jobsQuery.docs) {
+      const result = await runSingleJob(job)
+      if (result !== null) {
+        resultsArray.push(result)
+      }
+    }
+  } else {
+    const jobPromises = jobsQuery.docs.map(runSingleJob)
+    resultsArray = await Promise.all(jobPromises)
+  }

   if (jobsToDelete && jobsToDelete.length > 0) {
     try {

View File

@@ -1,3 +1,5 @@
+import ObjectIdImport from 'bson-objectid'
+
 // @ts-strict-ignore
 import type { PayloadRequest } from '../../../../types/index.js'
 import type {
@@ -22,6 +24,9 @@ import type { UpdateJobFunction } from './getUpdateJobFunction.js'
 import { calculateBackoffWaitUntil } from './calculateBackoffWaitUntil.js'
 import { importHandlerPath } from './importHandlerPath.js'

+const ObjectId = (ObjectIdImport.default ||
+  ObjectIdImport) as unknown as typeof ObjectIdImport.default
+
 // Helper object type to force being passed by reference
 export type RunTaskFunctionState = {
   reachedMaxRetries: boolean
@@ -96,6 +101,7 @@ export async function handleTaskFailed({
   }

   job.log.push({
+    id: new ObjectId().toHexString(),
     completedAt: new Date().toISOString(),
     error: errorJSON,
     executedAt: executedAt.toISOString(),
@@ -252,6 +258,7 @@ export const getRunTaskFunction = <TIsInline extends boolean>(
     log: [
       ...job.log,
       {
+        id: new ObjectId().toHexString(),
         completedAt: new Date().toISOString(),
         error: errorMessage,
         executedAt: executedAt.toISOString(),
@@ -350,6 +357,7 @@ export const getRunTaskFunction = <TIsInline extends boolean>(
     job.log = []
   }
   job.log.push({
+    id: new ObjectId().toHexString(),
     completedAt: new Date().toISOString(),
     executedAt: executedAt.toISOString(),
     input,

View File

@@ -18,7 +18,20 @@ export function getUpdateJobFunction(job: BaseJob, req: PayloadRequest): UpdateJ
     // Update job object like this to modify the original object - that way, incoming changes (e.g. taskStatus field that will be re-generated through the hook) will be reflected in the calling function
     for (const key in updatedJob) {
-      job[key] = updatedJob[key]
+      if (key === 'log') {
+        if (!job.log) {
+          job.log = []
+        }
+        // Add all new log entries to the original job.log object. Do not delete any existing log entries.
+        // Do not update existing log entries, as existing log entries should be immutable.
+        for (const logEntry of updatedJob.log) {
+          if (!job.log.some((entry) => entry.id === logEntry.id)) {
+            job.log.push(logEntry)
+          }
+        }
+      } else {
+        job[key] = updatedJob[key]
+      }
     }

     if ((updatedJob.error as Record<string, unknown>)?.cancelled) {

View File

@@ -70,6 +70,7 @@ export const runJob = async ({
     await updateJob({
       error: errorJSON,
       hasError: hasFinalError, // If reached max retries => final error. If hasError is true this job will not be retried
+      log: job.log,
       processing: false,
       totalTried: (job.totalTried ?? 0) + 1,
     })
@@ -82,6 +83,7 @@ export const runJob = async ({
   // Workflow has completed
   await updateJob({
     completedAt: new Date().toISOString(),
+    log: job.log,
     processing: false,
    totalTried: (job.totalTried ?? 0) + 1,
   })

View File

@@ -1,28 +0,0 @@
-import ObjectIdImport from 'bson-objectid'
-
-import type { BaseJob } from '../config/types/workflowTypes.js'
-
-const ObjectId = (ObjectIdImport.default ||
-  ObjectIdImport) as unknown as typeof ObjectIdImport.default
-
-/**
- * Our payload operations sanitize the input data to, for example, add missing IDs to array rows.
- * This function is used to manually sanitize the data for direct db adapter operations
- */
-export function sanitizeUpdateData({ data }: { data: Partial<BaseJob> }): Partial<BaseJob> {
-  if (data.log) {
-    const sanitizedData = { ...data }
-    sanitizedData.log = sanitizedData?.log?.map((log) => {
-      if (log.id) {
-        return log
-      }
-      return {
-        ...log,
-        id: new ObjectId().toHexString(),
-      }
-    })
-    return sanitizedData
-  }
-  return data
-}

View File

@@ -4,7 +4,6 @@ import type { PayloadRequest, Sort, Where } from '../../types/index.js'
 import type { BaseJob } from '../config/types/workflowTypes.js'

 import { jobAfterRead, jobsCollectionSlug } from '../config/index.js'
-import { sanitizeUpdateData } from './sanitizeUpdateData.js'

 type BaseArgs = {
   data: Partial<BaseJob>
@@ -72,17 +71,24 @@ export async function updateJobs({
     return result.docs as BaseJob[]
   }

+  const jobReq = {
+    transactionID:
+      req.payload.db.name !== 'mongoose'
+        ? ((await req.payload.db.beginTransaction()) as string)
+        : undefined,
+  }
+
   const args: UpdateJobsArgs = id
     ? {
         id,
-        data: sanitizeUpdateData({ data }),
-        req: disableTransaction === true ? undefined : req,
+        data,
+        req: jobReq,
         returning,
       }
     : {
-        data: sanitizeUpdateData({ data }),
+        data,
         limit,
-        req: disableTransaction === true ? undefined : req,
+        req: jobReq,
         returning,
         sort,
         // eslint-disable-next-line @typescript-eslint/no-unnecessary-type-assertion
@@ -91,6 +97,10 @@ export async function updateJobs({
   const updatedJobs: BaseJob[] | null = await req.payload.db.updateJobs(args)

+  if (req.payload.db.name !== 'mongoose' && jobReq.transactionID) {
+    await req.payload.db.commitTransaction(jobReq.transactionID)
+  }
+
   if (returning === false || !updatedJobs?.length) {
     return null
   }

View File

@@ -1,4 +1,4 @@
-import fs from 'fs'
+import fs from 'fs/promises'

 import type { SanitizedCollectionConfig } from '../collections/config/types.js'
 import type { SanitizedConfig } from '../config/types.js'
@@ -34,7 +34,7 @@ export const deleteAssociatedFiles: (args: Args) => Promise<void> = async ({
     try {
       if (await fileExists(fileToDelete)) {
-        fs.unlinkSync(fileToDelete)
+        await fs.unlink(fileToDelete)
       }
     } catch (err) {
       throw new ErrorDeletingFile(req.t)
@@ -50,7 +50,7 @@ export const deleteAssociatedFiles: (args: Args) => Promise<void> = async ({
       const sizeToDelete = `${staticPath}/${size.filename}`
       try {
         if (await fileExists(sizeToDelete)) {
-          fs.unlinkSync(sizeToDelete)
+          await fs.unlink(sizeToDelete)
         }
       } catch (err) {
         throw new ErrorDeletingFile(req.t)

View File

@@ -1,8 +1,8 @@
-import fs from 'fs'
+import fs from 'fs/promises'

 const fileExists = async (filename: string): Promise<boolean> => {
   try {
-    await fs.promises.stat(filename)
+    await fs.stat(filename)

     return true
   } catch (err) {

View File

@@ -2,8 +2,7 @@
 import type { OutputInfo, Sharp, SharpOptions } from 'sharp'

 import { fileTypeFromBuffer } from 'file-type'
-import fs from 'fs'
-import { mkdirSync } from 'node:fs'
+import fs from 'fs/promises'
 import sanitize from 'sanitize-filename'

 import type { Collection } from '../collections/config/types.js'
@@ -121,7 +120,7 @@ export const generateFileData = async <T>({
   }

   if (!disableLocalStorage) {
-    mkdirSync(staticPath, { recursive: true })
+    await fs.mkdir(staticPath, { recursive: true })
   }

   let newData = data
@@ -291,7 +290,7 @@ export const generateFileData = async <T>({
       }

       if (file.tempFilePath) {
-        await fs.promises.writeFile(file.tempFilePath, croppedImage) // write fileBuffer to the temp path
+        await fs.writeFile(file.tempFilePath, croppedImage) // write fileBuffer to the temp path
       } else {
         req.file = fileForResize
       }
@@ -304,7 +303,7 @@ export const generateFileData = async <T>({
     // If using temp files and the image is being resized, write the file to the temp path
     if (fileBuffer?.data || file.data.length > 0) {
       if (file.tempFilePath) {
-        await fs.promises.writeFile(file.tempFilePath, fileBuffer?.data || file.data) // write fileBuffer to the temp path
+        await fs.writeFile(file.tempFilePath, fileBuffer?.data || file.data) // write fileBuffer to the temp path
       } else {
         // Assign the _possibly modified_ file to the request object
         req.file = {

View File

@@ -1,6 +1,6 @@
 // @ts-strict-ignore
 import { fileTypeFromFile } from 'file-type'
-import fs from 'fs'
+import fs from 'fs/promises'
 import path from 'path'

 import type { PayloadRequest } from '../types/index.js'
@@ -11,9 +11,9 @@ const mimeTypeEstimate = {
 export const getFileByPath = async (filePath: string): Promise<PayloadRequest['file']> => {
   if (typeof filePath === 'string') {
-    const data = fs.readFileSync(filePath)
+    const data = await fs.readFile(filePath)
     const mimetype = fileTypeFromFile(filePath)
-    const { size } = fs.statSync(filePath)
+    const { size } = await fs.stat(filePath)

     const name = path.basename(filePath)
     const ext = path.extname(filePath).slice(1)

View File

@@ -1,5 +1,5 @@
 // @ts-strict-ignore
-import fs from 'fs'
+import fs from 'fs/promises'
 import sizeOfImport from 'image-size'
 import { promisify } from 'util'
@@ -21,7 +21,7 @@ export async function getImageSize(file: PayloadRequest['file']): Promise<Probed
   if (file.mimetype === 'image/tiff') {
     const dimensions = await temporaryFileTask(
       async (filepath: string) => {
-        fs.writeFileSync(filepath, file.data)
+        await fs.writeFile(filepath, file.data)

         return imageSizePromise(filepath)
       },
       { extension: 'tiff' },

View File

@@ -2,7 +2,7 @@
 import type { Sharp, Metadata as SharpMetadata, SharpOptions } from 'sharp'

 import { fileTypeFromBuffer } from 'file-type'
-import fs from 'fs'
+import fs from 'fs/promises'
 import sanitize from 'sanitize-filename'

 import type { SanitizedCollectionConfig } from '../collections/config/types.js'
@@ -478,7 +478,7 @@ export async function resizeAndTransformImageSizes({
       if (await fileExists(imagePath)) {
         try {
-          fs.unlinkSync(imagePath)
+          await fs.unlink(imagePath)
         } catch {
           // Ignore unlink errors
         }

View File

@@ -1,5 +1,5 @@
 // @ts-strict-ignore
-import fs from 'fs'
+import fs from 'fs/promises'
 import { Readable } from 'stream'

 /**
@@ -16,7 +16,7 @@ const saveBufferToFile = async (buffer: Buffer, filePath: string): Promise<void>
     streamData = null
   }

   // Setup file system writable stream.
-  return fs.writeFileSync(filePath, buffer)
+  return await fs.writeFile(filePath, buffer)
 }

 export default saveBufferToFile

View File

@@ -1,5 +1,5 @@
 // @ts-strict-ignore
-import { promises as fsPromises } from 'fs'
+import fs from 'fs/promises'
 import os from 'node:os'
 import path from 'node:path'
 import { v4 as uuid } from 'uuid'
@@ -8,7 +8,7 @@ async function runTask(temporaryPath: string, callback) {
   try {
     return await callback(temporaryPath)
   } finally {
-    await fsPromises.rm(temporaryPath, { force: true, maxRetries: 2, recursive: true })
+    await fs.rm(temporaryPath, { force: true, maxRetries: 2, recursive: true })
   }
 }
@@ -41,11 +41,11 @@ async function temporaryFile(options: Options) {
 async function temporaryDirectory({ prefix = '' } = {}) {
   const directory = await getPath(prefix)
-  await fsPromises.mkdir(directory)
+  await fs.mkdir(directory)
   return directory
 }

 async function getPath(prefix = ''): Promise<string> {
-  const temporaryDirectory = await fsPromises.realpath(os.tmpdir())
+  const temporaryDirectory = await fs.realpath(os.tmpdir())
   return path.join(temporaryDirectory, prefix + uuid())
 }

View File

@@ -1,5 +1,4 @@
-import fs from 'fs'
-import { promisify } from 'util'
+import fs from 'fs/promises'

 import type { SanitizedCollectionConfig } from '../collections/config/types.js'
 import type { SanitizedConfig } from '../config/types.js'
@@ -7,8 +6,6 @@ import type { PayloadRequest } from '../types/index.js'
 import { mapAsync } from '../utilities/mapAsync.js'

-const unlinkFile = promisify(fs.unlink)
-
 type Args = {
   collectionConfig: SanitizedCollectionConfig
   config: SanitizedConfig
@@ -28,7 +25,7 @@ export const unlinkTempFiles: (args: Args) => Promise<void> = async ({
   await mapAsync(fileArray, async ({ file }) => {
     // Still need this check because this will not be populated if using local API
     if (file?.tempFilePath) {
-      await unlinkFile(file.tempFilePath)
+      await fs.unlink(file.tempFilePath)
     }
   })
 }
}

View File

@@ -411,4 +411,36 @@ describe('configToJSONSchema', () => {
     expect(schema?.definitions?.SharedBlock).toBeDefined()
   })

+  it('should allow overriding required to false', async () => {
+    // @ts-expect-error
+    const config: Config = {
+      collections: [
+        {
+          slug: 'test',
+          fields: [
+            {
+              name: 'title',
+              type: 'text',
+              required: true,
+              defaultValue: 'test',
+              typescriptSchema: [
+                () => ({
+                  type: 'string',
+                  required: false,
+                }),
+              ],
+            },
+          ],
+          timestamps: false,
+        },
+      ],
+    }
+
+    const sanitizedConfig = await sanitizeConfig(config)
+    const schema = configToJSONSchema(sanitizedConfig, 'text')
+
+    // @ts-expect-error
+    expect(schema.definitions.test.properties.title.required).toStrictEqual(false)
+  })
 })

View File

@@ -258,9 +258,6 @@ export function fieldsToJSONSchema(
     properties: Object.fromEntries(
       fields.reduce((fieldSchemas, field, index) => {
         const isRequired = fieldAffectsData(field) && fieldIsRequired(field)
-        if (isRequired) {
-          requiredFieldNames.add(field.name)
-        }

         const fieldDescription = entityOrFieldToJsDocs({ entity: field, i18n })
         const baseFieldSchema: JSONSchema4 = {}
@@ -706,6 +703,9 @@ export function fieldsToJSONSchema(
         }

         if (fieldSchema && fieldAffectsData(field)) {
+          if (isRequired && fieldSchema.required !== false) {
+            requiredFieldNames.add(field.name)
+          }
           fieldSchemas.set(field.name, fieldSchema)
         }

View File

@@ -18,6 +18,11 @@ export const flattenBlock = ({ block }: { block: Block }): FlattenedBlock => {

 const flattenedFieldsCache = new Map<Field[], FlattenedField[]>()

+/**
+ * Flattens all fields in a collection, preserving the nested field structure.
+ * @param cache
+ * @param fields
+ */
 export const flattenAllFields = ({
   cache,
   fields,

View File

@@ -1,6 +1,6 @@
 {
   "name": "@payloadcms/plugin-cloud-storage",
-  "version": "3.30.0",
+  "version": "3.32.0",
   "description": "The official cloud storage plugin for Payload CMS",
   "homepage": "https://payloadcms.com",
   "repository": {

View File

@@ -53,6 +53,16 @@ export const initClientUploads = <ExtraProps extends Record<string, unknown>, T>
     config.admin = {}
   }

+  if (!config.admin.dependencies) {
+    config.admin.dependencies = {}
+  }
+
+  // Ensure client handler is always part of the import map, to avoid
+  // import map discrepancies between dev and prod
+  config.admin.dependencies[clientHandler] = {
+    type: 'function',
+    path: clientHandler,
+  }
+
   if (!config.admin.components) {
     config.admin.components = {}
   }

Some files were not shown because too many files have changed in this diff.