Compare commits

..

54 Commits

Author SHA1 Message Date
Sasha
cbee9c4c4a forgot upsertRow/insertArrays 2025-06-03 01:21:13 +03:00
Sasha
ba011b8933 even more (upsertRow/*) 2025-06-03 01:19:33 +03:00
Sasha
193c051c67 Merge branch 'main' of github.com:payloadcms/payload into chore/strict-drizzle 2025-06-03 01:11:44 +03:00
Sasha
e7749468c2 even even more 2025-06-03 01:09:59 +03:00
Alessio Gravili
319d3355de feat: improve turbopack compatibility (#11376)
This PR introduces a few changes to improve turbopack compatibility and
ensure e2e tests pass with turbopack enabled

## Changes to improve turbopack compatibility
- Use correct sideEffects configuration to fix scss issues
- Import scss directly instead of duplicating our scss rules
- Fix some scss rules that are not supported by turbopack
- Bump Next.js and all other dependencies used to build payload

## Changes to get tests to pass

For an unknown reason, flaky tests flake a lot more often in turbopack.
This PR does the following to get them to pass:
- add more `wait`s
- fix actual flakes by ensuring previous operations are properly awaited

## Blocking turbopack bugs
- [X] https://github.com/vercel/next.js/issues/76464
  - Fix PR: https://github.com/vercel/next.js/pull/76545
  - Once fixed: change `"sideEffectsDisabled":` back to `"sideEffects":`
  
## Non-blocking turbopack bugs
- [ ] https://github.com/vercel/next.js/issues/76956

## Related PRs

https://github.com/payloadcms/payload/pull/12653
https://github.com/payloadcms/payload/pull/12652
2025-06-02 22:01:07 +00:00
Sasha
115af04406 even more 2025-06-03 00:59:53 +03:00
Sasha
2b40e0f21f feat: polymorphic join querying by fields that don't exist in every collection (#12648)
This PR makes it possible to do polymorphic join querying by fields that
don't exist in all collections specified in `field.collection`, for
example:
```ts
const result = await payload.find({
  collection: 'payload-folders',
  joins: {
    documentsAndFolders: {
      where: {
        and: [
          {
            relationTo: {
              in: ['folderPoly1', 'folderPoly2'],
            },
          },
          {
            folderPoly2Title: { // this field exists only in the folderPoly2 collection; previously this would throw a query error
              equals: 'Poly 2 Title',
            },
          },
        ],
      },
    },
  },
})
```

---------

Co-authored-by: Jarrod Flesch <jarrodmflesch@gmail.com>
2025-06-03 00:48:07 +03:00
Alessio Gravili
30dd9a23a3 refactor(ui): improve relationship field option loading reliability using queues (#12653)
This PR uses the new `useQueue` hook in the relationship react-select field
for loading options. This will reduce flakiness in our CI and ensure the
following:
- the most recently triggered options-loading request will not have its
result overwritten by a previous, delayed request
- unnecessary parallel requests are reduced: outdated requests are
discarded from the queue if a newer request exists (see the sketch below)
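
For context, a minimal sketch of the "latest request wins" behavior described
above; `useQueue` itself is internal to `@payloadcms/ui`, so the function and
parameter names here are hypothetical:

```ts
// Illustrative only: track the most recent request and ignore stale results.
let latestRequestID = 0

export async function loadOptionsLatestWins(
  search: string,
  fetchOptions: (search: string) => Promise<string[]>,
  applyOptions: (options: string[]) => void,
): Promise<void> {
  const requestID = ++latestRequestID

  const options = await fetchOptions(search)

  // A newer request was triggered while this one was in flight:
  // discard this outdated result instead of overwriting the newer one.
  if (requestID !== latestRequestID) {
    return
  }

  applyOptions(options)
}
```
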
2025-06-02 21:33:41 +00:00
Sasha
ec115c6eca more 2025-06-03 00:23:38 +03:00
Jacob
c639c5f278 fix(next): cannot override tab of default views (#11789)
### What?

TypeScript suggests that it is possible to modify the tab of a default
view; however, when you specify the path to the custom component, nothing
happens. This PR fixes that.

### How?

If a Component for the tab of a default view is defined in the config,
that Component is returned instead of the default DocumentTab.

### Example Configuration

config.ts
```ts
export const MenuGlobal: GlobalConfig = {
  slug: menuSlug,
  fields: [
    {
      name: 'globalText',
      type: 'text',
    },
  ],
  admin: {
    components: {
      views: {
        edit: {
          api: {
            tab: {
              Component: './TestComponent.tsx',
            },
          },
        },
      },
    },
  },
}
```
./TestComponent.tsx
```tsx
const TestComponent = () => 'example'

export default TestComponent
```


### Before
![Screenshot 2025-03-20 at 08 42
06](https://github.com/user-attachments/assets/2acc0950-847f-44c5-bedf-660c5c3747a0)

### After
![Screenshot 2025-03-20 at 08 43
06](https://github.com/user-attachments/assets/c3917d02-abfb-4f80-9235-cc1ba784586d)

---------

Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
2025-06-02 14:51:33 -04:00
Patrik
05eeddba7c fix: correctly detect glb & gltf mimetypes during upload (#12623)
### What?

The browser was incorrectly setting the mimetype for `.glb` and `.gltf`
files to `application/octet-stream` on upload, when they should instead
receive the proper MIME types for `glb` and `gltf`.

This patch adds logic to infer the correct MIME type for `.glb` files
(`model/gltf-binary`) and `.gltf` files (`model/gltf+json`) based on the
file extension during multipart processing, ensuring consistent MIME type
detection regardless of browser behavior.

Fixes #12620
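
For context, a minimal sketch (not the actual patch) of inferring the MIME
type from the file extension when the browser reports the generic
`application/octet-stream` type:

```ts
// Hypothetical helper: fall back to extension-based detection
// when the browser-supplied MIME type is application/octet-stream.
const mimeTypesByExtension: Record<string, string> = {
  glb: 'model/gltf-binary',
  gltf: 'model/gltf+json',
}

export function resolveMimeType(filename: string, reportedMimeType: string): string {
  if (reportedMimeType !== 'application/octet-stream') {
    return reportedMimeType
  }

  const extension = filename.split('.').pop()?.toLowerCase() ?? ''
  return mimeTypesByExtension[extension] ?? reportedMimeType
}

// resolveMimeType('scene.glb', 'application/octet-stream') // => 'model/gltf-binary'
```
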
2025-06-02 11:26:26 -07:00
Tobias Odendahl
08a6f88a4b fix(ui): reset columns state throwing errors (#11903)
### What?
Fixes `resetColumnsState` in `useTableColumns` react hook.

### Why?
`resetColumnsState` threw errors when being executed, e.g. `Uncaught (in
promise) TypeError: Cannot read properties of undefined (reading
'findIndex')`

### How?
Removes unnecessary parsing of URL query parameters in
`setActiveColumns` when resetting columns.

---------

Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
2025-06-02 14:24:00 -04:00
Sasha
4b3f1b9c92 Merge branch 'main' of github.com:payloadcms/payload into chore/strict-drizzle 2025-06-02 17:40:24 +03:00
Said Akhrarov
ede5c671b8 fix(plugin-seo): thread allowCreate to meta image component (#12624)
### What?
This PR fixes an issue with `plugin-seo` where the `MetaImageComponent`
would not allow creating a new upload document from the field.

### Why?
To allow users to upload new media documents for use as a meta image.

### How?
Threads `allowCreate` through to the underlying upload input.

Fixes #12616

Before:

![image](https://github.com/user-attachments/assets/44ec32c7-1912-4fc3-9b8a-f5deb167320b)

After:

![image](https://github.com/user-attachments/assets/0dba1f75-78b6-4472-af38-6178f2ab26ea)
2025-06-02 14:11:24 +00:00
Sasha
684c43604a docs: missing dash (#12644) 2025-06-02 14:39:44 +03:00
Germán Jabloñski
8199a7d32a fix(richtext-lexical): export defaultColors for use in client components (#12627)
Fixes #12621

Should be imported from: 
`@payloadcms/richtext-lexical/client`
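
A minimal usage sketch based on the import path above; how the palette is
consumed afterwards is illustrative only:

```ts
'use client'

// `defaultColors` can now be imported from the client entry point, per this fix.
import { defaultColors } from '@payloadcms/richtext-lexical/client'

// Illustrative only: inspect the palette before wiring it into a custom component.
console.log(defaultColors)
```
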
2025-05-31 00:47:41 +00:00
Anyu Jiang
6f8cff7764 refactor(translations): correct i18n translation for Mandarin (#12561)
Corrects translations for Mandarin, mainly for terms like locale,
document, item, etc.
2025-05-30 14:37:03 -07:00
Germán Jabloñski
89ced5ec6b fix(richtext-lexical): enable select inputs with ctrl+a or cmd+a (#12453)
Fixes #6871

To review this PR, use `pnpm dev lexical` and the auto-created document
in the `lexical fields` collection. Select any input within the blocks
and press `cmd+a`. The selection should contain the entire input.

I made sure that `cmd+a` still works fine inside the editor but outside
of inputs.
2025-05-30 18:28:51 -03:00
Jacob Fletcher
836fd86090 fix(cpa): generate .env when using the --example flag (#12572)
When cloning a new project from the examples dir via create-payload-app,
the corresponding `.env` file is not being generated. This is because
the `--example` flag does not prompt for database credentials, which
ultimately skips this step.

For example:

```bash
npx create-payload-app --example live-preview
```

The result will include the provided `.env.example`, but lacks a `.env`.

We were previously writing to the `.env.example` file, which is
unexpected. We should only be writing to the `.env` file itself. To do
this, we only write the `.env.example` to memory as needed, instead of
the file system.

This PR also simplifies the logic needed to set default vars, and
improves the testing coverage overall.
2025-05-30 14:26:57 -04:00
Sasha
7c094dc572 docs: building without a db connection (#12607)
Closes https://github.com/payloadcms/payload/issues/12605

Adds documentation for one of the most common problems: building a site
without a database connection (and why a build may need one at all).
2025-05-30 11:57:02 -04:00
Jacob Fletcher
c83e791014 fix(live-preview): correct type inference (#12619)
Type inference broke as a result of migrating to TS strict mode in
#12298. This leads to compile-time errors that may prevent builds.

Here is an example:

```ts
export interface Page {
  id: string;
  slug: string;
  title: string;
  // ...
}

/** 
* Type 'Page' does not satisfy the constraint 'Record<string, unknown>'.
* Index signature for type 'string' is missing in type 'Page'.
*/
const { data } = useLivePreview<Page>({
  depth: 2,
  initialData: initialPage,
  serverURL: PAYLOAD_SERVER_URL,
})
```

The problem is that Payload generated type _interfaces_ do not satisfy
the `Record<string, unknown>` type. This is because interfaces are a
possible target for declaration merging, so their properties are not
fully known. More details on this
[here](https://github.com/microsoft/TypeScript/issues/42825).

This PR also cleans up the JSDocs.
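
A minimal TypeScript sketch (not from the PR) of the underlying distinction:
object types declared with `type` get an implicit index signature for this
check, while `interface` declarations do not:

```ts
interface PageInterface {
  id: string
  title: string
}

type PageType = {
  id: string
  title: string
}

// OK: object types declared with `type` are assignable to Record<string, unknown>.
const fromTypeAlias: Record<string, unknown> = {} as PageType

// Error: "Index signature for type 'string' is missing in type 'PageInterface'".
// Interfaces stay open to declaration merging, so TypeScript will not infer one.
// const fromInterface: Record<string, unknown> = {} as PageInterface

console.log(fromTypeAlias)
```
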
2025-05-30 15:40:15 +00:00
Patrik
6119d89fa5 fix(ui): upload action button styles (#12592)
### What

The upload action buttons had extra top and bottom `margin` applied to
them from the `.btn` class, which caused the upload-actions container to
be larger than the thumbnail image.
Can be seen below:
![Screenshot 2025-05-28 at 1 04
57 PM](https://github.com/user-attachments/assets/d1a9ff8a-ff69-4c62-bbde-9deda6721ad3)

### Fix

To fix this issue, we've removed the bottom margin to allow the
thumbnail image to control the height of the component.

#### Before
![Screenshot 2025-05-28 at 1 04
46 PM](https://github.com/user-attachments/assets/61f6dc9a-bf9d-411e-8d66-d50d27a328e9)

#### After
![Screenshot 2025-05-28 at 1 05
29 PM](https://github.com/user-attachments/assets/7687f3e8-e699-4a16-964d-20072e63d10f)
2025-05-30 15:08:55 +00:00
Paul
d5611953a7 fix: allow unnamed group fields to not set a label at all (#12580)
Technically you could already set `label: undefined` and it would be
supported by group fields but the types didn't reflect this.

So now you can create an unnamed group field like this:

```ts
{
  type: 'group',
  fields: [
    {
      type: 'text',
      name: 'insideGroupWithNoLabel',
    },
  ],
},
```

This will remove the label while still visually grouping the fields.

![image](https://github.com/user-attachments/assets/ecb0b364-9cff-4d71-bf9f-86961915aecd)

---------

Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
2025-05-29 22:03:39 +00:00
Elliot DeNolf
71df378fb0 templates: bump for v3.40.0 (#12613)
🤖 Automated bump of templates for v3.40.0

Triggered by user: @denolfe

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2025-05-29 16:40:47 -04:00
Germán Jabloñski
5e3a94bbc9 chore: update CONTRIBUTING.md (#12511)
- I corrected the use of `yarn` to `pnpm`
- I corrected the URL for previewing the documentation (it was missing
`/local`).
- I removed the incorrect section about which types of commits fall into
the release notes.

---------

Co-authored-by: Jacob Fletcher <jacobsfletch@gmail.com>
2025-05-29 20:11:27 +00:00
Elliot DeNolf
3670886bee chore(release): v3.40.0 [skip ci] 2025-05-29 15:43:10 -04:00
Sasha
6888f13f27 fix(db-postgres): properly escape the ' character (#12590)
Fixes an issue where, if `defaultValue` contained `'`, the number of `'`
characters in the `DEFAULT` statement of the generated migration was doubled.
2025-05-29 15:27:07 -04:00
Sasha
12395e497b fix(db-*): disable DB connection for payload migrate:create (#12596)
Fixes https://github.com/payloadcms/payload/issues/12582
2025-05-29 15:25:54 -04:00
Jarrod Flesch
a17d84e570 fix(ui): reduces pill sizing in autosave cells (#12606) 2025-05-29 10:08:41 -04:00
James Mikrut
ca6f849b53 feat: adds new canSetHeaders prop to auth strategies (#12591)
Exposes a new argument to authentication strategies that allows the
author to determine whether the auth strategy is able to set response
headers.

This is useful because some auth strategies may want to set headers, but
in Next.js server components (AKA the admin panel), it's not possible to
set headers. It is, however, possible to set headers within API
responses and similar contexts.

So, an author might decide to run operations that require setting
headers (e.g. refreshing an access token) only when the auth strategy is
being executed in a context where setting headers is possible.
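
A hedged sketch of how a custom strategy might use the new argument; it
assumes the `AuthStrategy` type export and the `responseHeaders` return field
from Payload's custom-strategy docs, and the user lookup and cookie value are
placeholders:

```ts
import type { AuthStrategy } from 'payload'

export const exampleStrategy: AuthStrategy = {
  name: 'example-strategy',
  authenticate: async ({ canSetHeaders, headers, payload }) => {
    // Placeholder: resolve the user from `headers` / `payload` however your strategy requires.
    const user = null

    // Only do work that needs to write response headers (e.g. refreshing an access
    // token) when the execution context is actually able to set them.
    if (canSetHeaders) {
      return {
        responseHeaders: new Headers({ 'Set-Cookie': 'refreshed-token=example' }),
        user,
      }
    }

    return { user }
  },
}
```
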
2025-05-29 09:58:58 -04:00
Jarrod Flesch
7e873a9d63 feat: moves getSafeRedirect into payload package (#12593) 2025-05-29 09:36:09 -04:00
Alessio Gravili
d85909e5ae refactor: use parseCookies method from Next.js (#12599)
Following up on https://github.com/payloadcms/payload/pull/12515, we
could instead use the same `parseCookies` method that Next.js uses. This
handles a few edge cases differently:
- correctly strips whitespace
- parses attributes without explicit values

I think it's a good idea to match the behavior of Next.js as closely as
possible here. [This](https://github.com/vercel/edge-runtime/pull/374)
is a good example of how the Next.js behavior differs.

## Example

Input: `'my_value=true; Secure; HttpOnly'`

Previous Output:
```
Map(3) {
  'my_value' => 'true',
  'Secure' => '',
  'HttpOnly' => '',
}
```

New Output:
```
Map(3) {
  'my_value' => 'true',
  'Secure' => 'true',
  'HttpOnly' => 'true'
}
```
2025-05-29 04:56:24 +00:00
Jacob Fletcher
699af8dc5b feat(ui): export FieldAction type (#12589)
Closes #12356.
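
A minimal import sketch for the newly exported type; the usage shown is
illustrative only:

```ts
// The `FieldAction` type is now available from the UI package's public exports.
import type { FieldAction } from '@payloadcms/ui'

// Illustrative only: a typed array to collect custom field actions.
const fieldActions: FieldAction[] = []

console.log(fieldActions.length)
```
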
2025-05-28 23:00:26 +00:00
Jordy
0c0b0fe0f8 docs: autoLogin codeblock was not nested under 'admin' (#12573)
Additionally changed `process.env.NEXT_PUBLIC_ENABLE_AUTOLOGIN` to `NODE_ENV` since this is a more standard practice.
2025-05-28 22:50:56 +00:00
Alessandro Stoppato
bfdcb51793 fix: parseCookies ignore invalid encoded values (#12515)
This has already been reported here:
https://github.com/payloadcms/payload/issues/10591

`parseCookies.ts` tries to decode cookie values using `decodeURI()`
and throws an error when it fails.

Since it does that for all cookies set on the current domain, there is no
control over which cookies get evaluated; for instance, ad networks,
analytics providers, external fonts, etc. all set cookies with
different encodings.

### Taking in consideration:

- The HTTP specs don't define a standard way to encode cookie values, but
simply provide recommendations:
[RFC6265](https://httpwg.org/specs/rfc6265.html#sane-set-cookie)
> To maximize compatibility with user agents, servers that wish to store
arbitrary data in a cookie-value SHOULD encode that data, for example,
using Base64

- Next.js does pretty similar parsing and ignores invalidly encoded values

https://github.com/vercel/edge-runtime/blob/main/packages/cookies/src/serialize.ts
`function parseCookie(cookie: string)`
```typescript
try {
  map.set(key, decodeURIComponent(value ?? 'true'))
} catch {
  // ignore invalid encoded values
}
```

### With the current implementation:
- it's impossible to log in because `parseCookies.ts` gets called and, if
it fails to parse, throws an error
- requests to `/api/users/me` fail for the same reason

### Fix
The pull request addresses these issues by simply ignoring decoding
errors:
CURRENT:
```typescript
try {
  const decodedValue = decodeURI(encodedValue)
  list.set(key, decodedValue)
} catch (e) {
  throw new APIError(`Error decoding cookie value for key ${key}: ${e.message}`)
}
```
AFTER THIS PULL REQUEST
```typescript
try {
  const decodedValue = decodeURI(encodedValue)
  list.set(key, decodedValue)
} catch {
  // ignore invalid encoded values
}
```
2025-05-28 15:42:50 -07:00
Alessio Gravili
ca26402377 fix(richtext-lexical): respect disableBlockName (#12597)
Fixes #12588 

Previously, the new `disableBlockName` was not respected for lexical
blocks. This PR adds a new e2e test and does some clean-up of previous
e2e tests
2025-05-28 22:41:05 +00:00
Alessio Gravili
3022cab8ac fix(ui): oversized column selector pills (#12583)
#10030 adjusted the default `Pill` component size but forgot to set the
column selector pill sizes to small

## Before

![Screenshot 2025-05-27 at 14 34
31@2x](https://github.com/user-attachments/assets/0f7d44e7-343a-4542-9bc5-830f4bd2bd96)

## After

![Screenshot 2025-05-27 at 14 34
25@2x](https://github.com/user-attachments/assets/33f65fb7-130a-405b-820f-e31259b4f950)
2025-05-28 21:13:22 +00:00
Germán Jabloñski
8a7ac784c4 fix(translations): improve Spanish translations (#12555)
There are still things to improve.

- We're inconsistent with our use of capital letters. There are still
sentences where every word starts with a capital letter, and it looks
ugly (this also happens in English, but to a lesser extent).
- We're inconsistent with the use of punctuation at the end.
- Sentences with variables like {{count}} can result in inconsistencies
if it's 1 and the noun is plural.
- The same thing happens in Spanish, but also with gender. It's
impossible to know without the context in which it's used.

---------

Co-authored-by: Paul Popus <paul@payloadcms.com>
2025-05-28 16:50:47 -03:00
Jarrod Flesch
54a04840c7 feat: adds calling of before and after operation hooks to resetPassword (#12581) 2025-05-28 15:18:49 -04:00
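
For context on the resetPassword change above, a hedged sketch of standard
collection `beforeOperation` / `afterOperation` hooks that now also fire for
that operation; the collection shape and logging are placeholders:

```ts
import type { CollectionConfig } from 'payload'

export const Users: CollectionConfig = {
  slug: 'users',
  auth: true,
  fields: [],
  hooks: {
    beforeOperation: [
      ({ args, operation }) => {
        // With this change, the resetPassword operation also flows through here.
        if (operation === 'resetPassword') {
          console.log('about to reset a password')
        }
        return args
      },
    ],
    afterOperation: [
      ({ operation, result }) => {
        if (operation === 'resetPassword') {
          console.log('finished resetting a password')
        }
        return result
      },
    ],
  },
}
```
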
Germán Jabloñski
f2b54b5b43 fix(richtext-lexical, ui): opening relationship field with appearance: "drawer" inside rich text inline block (#12529)
To reproduce this bug, insert the following feature into the richtext
editor:


```ts
BlocksFeature({
  inlineBlocks: [
    {
      slug: 'inline-media',
      fields: [
        {
          name: 'media',
          type: 'relationship',
          relationTo: ['media'],
          admin: {
            appearance: 'drawer',
          },
        },
      ],
    },
  ],
}),
```

Then try opening the relationship field drawer. The inline block drawer
will close.

Note: Interestingly, at least in Chrome, this only happens with DevTools
closed. It worked fine with DevTools open. It probably has to do with
capturing events like focus.
The current solution is a 50ms delay. I couldn't test it with CPU
throttling, because throttling only applies while DevTools is open and
the bug disappears in that case. If you encounter this bug, please open
an issue so we can increase the delay or, better yet, find a more
elegant solution.
2025-05-28 11:31:28 -03:00
Jarrod Flesch
4a41369a00 chore: updates bug template (#12587) 2025-05-28 10:18:25 -04:00
Sean Zubrickas
7fa879c3a0 docs: typos and links (#12484)
- update `SantizeCollection` to `SanitizedCollection` in the TypeScript section

- fix the new-issue link in the Multi-Tenant plugin section

- correct "You can disabled" to "You can disable" in Core Features

- fix duplicate section links for MongoDB and Postgres in the Migrations
section
2025-05-28 06:47:51 -07:00
Jarrod Flesch
166dafe05e fix(ui): filtering on hasMany fields (#12579) 2025-05-28 09:45:22 -04:00
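
For context on the hasMany filtering fix above, a hedged sketch of the kind
of query it affects; `payload` is an initialized Local API instance and the
collection and field names are hypothetical:

```ts
// Filtering the list by a field defined with `hasMany: true` (e.g. a text field).
const results = await payload.find({
  collection: 'posts',
  where: {
    tags: {
      // matches documents whose `tags` values include 'news'
      equals: 'news',
    },
  },
})

console.log(results.totalDocs)
```
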
Jessica Rynkar
68ba24d91f fix(templates): update template/plugin and fix import map issue (#12305)
### What?
1. Adds logic to automatically update the `importMap.js` file with the
project name provided by the user.
2. Adds an updated version of the `README.md` file that we had when this
template existed outside of the monorepo
([here](https://github.com/payloadcms/plugin-template/blob/main/README.md))
to provide clear instructions of required steps.

### Why?
1. The plugin template, when installed via `npx create-payload-app`, asks
the user for a project name; however, the exports from `importMap.js` do
not get updated to the provided name. This throws errors when running
the project and prevents it from building.

2. The `/dev` folder requires the `.env.example` to be copied and
renamed to `.env` - the project will not run until this is done. The
template lacks instructions that this is a required step.

### How?
1. Updates
`packages/create-payload-app/src/lib/configure-plugin-project.ts` to
read the `importMap.js` file and replace the placeholder plugin name
with the name provided by the user. Adds a test to
`packages/create-payload-app/src/lib/create-project.spec.ts` to verify
that this file gets updated correctly.
2. Adds instructions on using this template to the `README.md` file,
ensuring key steps (like adding the `.env` file) are clearly stated.

Additional housekeeping updates:
- Removed Jest and replaced it with Vitest for testing
- Updated the base test approach to use Vitest instead of Jest
- Removed `NextRESTClient` in favor of directly creating Request objects
- Abstracted `getCustomEndpointHandler` function
- Added `ensureIndexes: true` to the `mongooseAdapter` configuration
- Removed the custom server from the dev folder
- Updated the pnpm dev script to `"dev": "next dev dev --turbo"`
- Removed `admin.autoLogin`

Fixes #12198
2025-05-27 21:33:23 +00:00
Patrik
20f7017758 feat: show nested fields in named tabs as separate columns in the list view (#12530)
### What

Continuation of #7355 by extending the functionality to named tabs.

Updates `flattenFields` to hoist nested fields inside named tabs to the
top-level field array when `moveSubFieldsToTop` is enabled.

Also fixes an issue where group fields with custom cells were being
flattened out.

Now, group fields with custom cell components remain available as
top-level columns.

Fixes #12563
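
For context, a hedged sketch of a collection with a named tab; with this
change, nested fields such as `meta.description` can surface as their own
columns in the list view (the collection and field names are hypothetical):

```ts
import type { CollectionConfig } from 'payload'

export const Pages: CollectionConfig = {
  slug: 'pages',
  fields: [
    { name: 'title', type: 'text' },
    {
      type: 'tabs',
      tabs: [
        {
          // a named tab: its data is stored under the `meta` key
          name: 'meta',
          fields: [
            { name: 'description', type: 'textarea' },
            { name: 'keywords', type: 'text' },
          ],
        },
      ],
    },
  ],
}
```
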
2025-05-27 14:15:47 -07:00
Jacob Fletcher
0204f0dcbc feat: filter query preset constraints (#12485)
You can now specify exactly who can change the constraints within a
query preset.

For example, you may want to ensure that only "admins" are allowed to set
a preset to "everyone".

To do this, you can use the new `queryPresets.filterConstraints`
property. When a user lacks the permission to change a constraint, the
option will either be hidden from them or disabled if it is already set.

```ts
import { buildConfig } from 'payload'

const config = buildConfig({
  // ...
  queryPresets: {
    // ...
    filterConstraints: ({ req, options }) =>
      !req.user?.roles?.includes('admin')
        ? options.filter(
            (option) =>
              (typeof option === 'string' ? option : option.value) !==
              'everyone',
          )
        : options,
  },
})
```

The `filterConstraints` function takes the same arguments as the
`reduceOptions` property on select fields introduced in #12487.
2025-05-27 16:55:37 -04:00
Said Akhrarov
032375b016 fix(ui): prevent textarea description overlapping fields and not honoring rows attribute (#12406) 2025-05-27 16:27:00 -04:00
Jarrod Flesch
8448e5b6b6 fix(ui): cloudfront removing X-HTTP-Method-Override header (#12571) 2025-05-27 15:48:51 -04:00
Jacob Fletcher
d6f6b05d77 fix(examples): update live-preview example to ESM (#12570)
Partial fix for #12551.

The Live Preview example was unable to boot because it was running
CommonJS instead of ESM.
2025-05-27 14:30:39 -04:00
Jessica Rynkar
feb7e082af chore(ui): finish adding folders e2e tests (#12524) 2025-05-27 13:00:56 -04:00
Jarrod Flesch
dfa0974894 fix(ui): live-preview-tab should show beforeDocumentControls (#12568) 2025-05-27 11:30:02 -04:00
Jarrod Flesch
f2b6c4a707 fix(db-mongodb): exists query on checkbox fields (#12567) 2025-05-27 11:19:09 -04:00
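
For context on the checkbox `exists` fix above, a hedged sketch of the
affected query shape; `payload` is an initialized Local API instance and the
collection and field names are hypothetical:

```ts
// `featured` is a checkbox field; `exists: true` matches documents where it has a value.
const docs = await payload.find({
  collection: 'posts',
  where: {
    featured: {
      exists: true,
    },
  },
})

console.log(docs.totalDocs)
```
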
Sasha
180ef3a49d more 2025-03-11 19:08:24 +02:00
Sasha
1066b434c3 chore(drizzle): enable strict true 2025-03-11 18:12:47 +02:00
391 changed files with 7885 additions and 8366 deletions

View File

@@ -43,6 +43,7 @@ body:
- 'plugin: cloud'
- 'plugin: cloud-storage'
- 'plugin: form-builder'
- 'plugin: multi-tenant'
- 'plugin: nested-docs'
- 'plugin: richtext-lexical'
- 'plugin: richtext-slate'
@@ -59,10 +60,7 @@ body:
label: Environment Info
description: Paste output from `pnpm payload info` _or_ Payload, Node.js, and Next.js versions. Please avoid using "latest"—specific version numbers help us accurately diagnose and resolve issues.
render: text
placeholder: |
Payload:
Node.js:
Next.js:
placeholder: Run `pnpm payload info` in your terminal and paste the output here.
validations:
required: true

View File

@@ -87,41 +87,43 @@ You can run the entire test suite using `pnpm test`. If you wish to only run e2e
By default, `pnpm test:int` will only run int test against MongoDB. To run int tests against postgres, you can use `pnpm test:int:postgres`. You will have to have postgres installed on your system for this to work.
### Commits
### Pull Requests
We use [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/) for our commit messages. Please follow this format when creating commits. Here are some examples:
For all Pull Requests, you should be extremely descriptive about both your problem and proposed solution. If there are any affected open or closed issues, please leave the issue number in your PR description.
- `feat: adds new feature`
- `fix: fixes bug`
- `docs: adds documentation`
- `chore: does chore`
All commits within a PR are squashed when merged, using the PR title as the commit message. For that reason, please use [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/) for your PR titles.
Here's a breakdown of the format. At the top-level, we use the following types to categorize our commits:
Here are some examples:
- `feat`: new feature that adds functionality. These are automatically added to the changelog when creating new releases.
- `fix`: a fix to an existing feature. These are automatically added to the changelog when creating new releases.
- `docs`: changes to [docs](./docs) only. These do not appear in the changelog.
- `chore`: changes to code that is neither a fix nor a feature (e.g. refactoring, adding tests, etc.). These do not appear in the changelog.
- `feat: add new feature`
- `fix: fix bug`
- `docs: add documentation`
- `test: add/fix tests`
- `refactor: refactor code`
- `chore: anything that does not fit into the above categories`
If applicable, you must indicate the affected packages in parentheses to "scope" the changes. Changes to the payload chore package do not require scoping.
Here are some examples:
- `feat(ui): add new feature`
- `fix(richtext-lexical): fix bug`
If you are committing to [templates](./templates) or [examples](./examples), use the `chore` type with the proper scope, like this:
- `chore(templates): adds feature to template`
- `chore(examples): fixes bug in example`
## Pull Requests
For all Pull Requests, you should be extremely descriptive about both your problem and proposed solution. If there are any affected open or closed issues, please leave the issue number in your PR message.
## Previewing docs
This is how you can preview changes you made locally to the docs:
1. Clone our [website repository](https://github.com/payloadcms/website)
2. Run `yarn install`
2. Run `pnpm install`
3. Duplicate the `.env.example` file and rename it to `.env`
4. Add a `DOCS_DIR` environment variable to the `.env` file which points to the absolute path of your modified docs folder. For example `DOCS_DIR=/Users/yourname/Documents/GitHub/payload/docs`
5. Run `yarn run fetchDocs:local`. If this was successful, you should see no error messages and the following output: _Docs successfully written to /.../website/src/app/docs.json_. There could be error messages if you have incorrect markdown in your local docs folder. In this case, it will tell you how you can fix it
6. You're done! Now you can start the website locally using `yarn run dev` and preview the docs under [http://localhost:3000/docs/](http://localhost:3000/docs/)
5. Run `pnpm fetchDocs:local`. If this was successful, you should see no error messages and the following output: _Docs successfully written to /.../website/src/app/docs.json_. There could be error messages if you have incorrect markdown in your local docs folder. In this case, it will tell you how you can fix it
6. You're done! Now you can start the website locally using `pnpm dev` and preview the docs under [http://localhost:3000/docs/local](http://localhost:3000/docs/local)
## Internationalization (i18n)

View File

@@ -981,7 +981,15 @@ const MyComponent: React.FC = () => {
## useTableColumns
Returns methods to manipulate table columns
Returns properties and methods to manipulate table columns:
| Property | Description |
| ------------------------ | ------------------------------------------------------------------------------------------ |
| **`columns`** | The current state of columns including their active status and configuration |
| **`LinkedCellOverride`** | A component override for linked cells in the table |
| **`moveColumn`** | A method to reorder columns. Accepts `{ fromIndex: number, toIndex: number }` as arguments |
| **`resetColumnsState`** | A method to reset columns back to their default configuration as defined in the collection config |
| **`setActiveColumns`** | A method to set specific columns to active state while preserving the existing column order. Accepts an array of column names to activate |
| **`toggleColumn`** | A method to toggle a single column's visibility. Accepts a column name as string |
```tsx
'use client'
@@ -989,17 +997,30 @@ import { useTableColumns } from '@payloadcms/ui'
const MyComponent: React.FC = () => {
// highlight-start
const { setActiveColumns } = useTableColumns()
const { setActiveColumns, resetColumnsState } = useTableColumns()
const resetColumns = () => {
setActiveColumns(['id', 'createdAt', 'updatedAt'])
const activateSpecificColumns = () => {
// Only activates the id and createdAt columns
// Other columns retain their current active/inactive state
// The original column order is preserved
setActiveColumns(['id', 'createdAt'])
}
const resetToDefaults = () => {
// Resets to the default columns defined in the collection config
resetColumnsState()
}
// highlight-end
return (
<button type="button" onClick={resetColumns}>
Reset columns
</button>
<div>
<button type="button" onClick={activateSpecificColumns}>
Activate Specific Columns
</button>
<button type="button" onClick={resetToDefaults}>
Reset To Defaults
</button>
</div>
)
}
```

View File

@@ -25,11 +25,12 @@ A strategy is made up of the following:
The `authenticate` function is passed the following arguments:
| Argument | Description |
| ---------------- | ------------------------------------------------------------------------------------------------- |
| **`headers`** \* | The headers on the incoming request. Useful for retrieving identifiable information on a request. |
| **`payload`** \* | The Payload class. Useful for authenticating the identifiable information against Payload. |
| **`isGraphQL`** | Whether or not the request was made from a GraphQL endpoint. Default is `false`. |
| Argument | Description |
| ---------------------- | ------------------------------------------------------------------------------------------------------------------- |
| **`canSetHeaders`** \* | Whether or not the strategy is being executed from a context where response headers can be set. Default is `false`. |
| **`headers`** \* | The headers on the incoming request. Useful for retrieving identifiable information on a request. |
| **`payload`** \* | The Payload class. Useful for authenticating the identifiable information against Payload. |
| **`isGraphQL`** | Whether or not the strategy is being executed within the GraphQL endpoint. Default is `false`. |
### Example Strategy

View File

@@ -142,14 +142,17 @@ import { buildConfig } from 'payload'
export default buildConfig({
// ...
// highlight-start
autoLogin:
process.env.NEXT_PUBLIC_ENABLE_AUTOLOGIN === 'true'
? {
email: 'test@example.com',
password: 'test',
prefillOnly: true,
}
: false,
admin: {
autoLogin:
process.env.NODE_ENV === 'development'
? {
email: 'test@example.com',
password: 'test',
prefillOnly: true,
}
: false,
},
// highlight-end
})
```

View File

@@ -183,13 +183,13 @@ Depending on which Database Adapter you use, your migration workflow might diffe
In relational databases, migrations will be **required** for non-development database environments. But with MongoDB, you might only need to run migrations once in a while (or never even need them).
#### MongoDB
#### MongoDB#mongodb-migrations
In MongoDB, you'll only ever really need to run migrations for times where you change your database shape, and you have lots of existing data that you'd like to transform from Shape A to Shape B.
In this case, you can create a migration by running `pnpm payload migrate:create`, and then write the logic that you need to perform to migrate your documents to their new shape. You can then either run your migrations in CI before you build / deploy, or you can run them locally, against your production database, by using your production database connection string on your local computer and running the `pnpm payload migrate` command.
#### Postgres
#### Postgres#postgres-migrations
In relational databases like Postgres, migrations are a bit more important, because each time you add a new field or a new collection, you'll need to update the shape of your database to match your Payload Config (otherwise you'll see errors upon trying to read / write your data).

View File

@@ -37,7 +37,7 @@ export const MyGroupField: Field = {
| ---------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| **`name`** | To be used as the property name when stored and retrieved from the database. [More](/docs/fields/overview#field-names) |
| **`fields`** \* | Array of field types to nest within this Group. |
| **`label`** | Used as a heading in the Admin Panel and to name the generated GraphQL type. Required when name is undefined, defaults to name converted to words. |
| **`label`** | Used as a heading in the Admin Panel and to name the generated GraphQL type. Defaults to the field name, if defined. |
| **`validate`** | Provide a custom validation function that will be executed on both the Admin Panel and the backend. [More](/docs/fields/overview#validation) |
| **`saveToJWT`** | If this field is top-level and nested in a config supporting [Authentication](/docs/authentication/overview), include its data in the user JWT. |
| **`hooks`** | Provide Field Hooks to control logic for this field. [More details](../hooks/fields). |
@@ -113,8 +113,7 @@ export const ExampleCollection: CollectionConfig = {
## Presentational group fields
You can also use the Group field to create a presentational group of fields. This is useful when you want to group fields together visually without affecting the data structure.
The label will be required when a `name` is not provided.
You can also use the Group field to only visually group fields without affecting the data structure. Not defining a label will render just the grouped fields.
```ts
import type { CollectionConfig } from 'payload'

View File

@@ -70,7 +70,7 @@ _\* An asterisk denotes that a property is required._
### filterOptions
Used to dynamically filter which options are available based on the user, data, etc.
Used to dynamically filter which options are available based on the current user, document data, or other criteria.
Some examples of this might include:

View File

@@ -329,7 +329,7 @@ available:
// responseHeaders: { ... } // returned headers from the response
// }
const result = await payload.auth({ headers })
const result = await payload.auth({ headers, canSetHeaders: false })
```
### Login

View File

@@ -16,8 +16,8 @@ This plugin sets up multi-tenancy for your application from within your [Admin P
If you need help, check out our [Community
Help](https://payloadcms.com/community-help). If you think you've found a bug,
please [open a new
issue](https://github.com/payloadcms/payload/issues/new?assignees=&labels=plugin%3A%multi-tenant&template=bug_report.md&title=plugin-multi-tenant%3A)
with as much detail as possible.
issue](https://github.com/payloadcms/payload/issues/new/choose) with as much
detail as possible.
</Banner>
## Core features
@@ -35,7 +35,7 @@ This plugin sets up multi-tenancy for your application from within your [Admin P
By default this plugin cleans up documents when a tenant is deleted. You should ensure you have
strong access control on your tenants collection to prevent deletions by unauthorized users.
You can disabled this behavior by setting `cleanupAfterTenantDelete` to `false` in the plugin options.
You can disable this behavior by setting `cleanupAfterTenantDelete` to `false` in the plugin options.
</Banner>

View File

@@ -0,0 +1,32 @@
---
title: Building without a DB connection
label: Building without a DB connection
order: 10
desc: You don't want to have a DB connection while building your Docker container? Learn how to prevent that!
keywords: deployment, production, config, configuration, documentation, Content Management System, cms, headless, javascript, node, react, nextjs
---
# Building without a DB connection
One of the most common problems when building a site for production, especially with Docker - is the DB connection requirement.
The important note is that Payload by itself does not have this requirement, But [Next.js' SSG ](https://nextjs.org/docs/pages/building-your-application/rendering/static-site-generation) does if any of your route segments have SSG enabled (which is default, unless you opted out or used a [Dynamic API](https://nextjs.org/docs/app/deep-dive/caching#dynamic-apis)) and use the Payload Local API.
Solutions:
## Using the experimental-build-mode Next.js build flag
You can run Next.js build using the `pnpx next build --experimental-build-mode compile` command to only compile the code without static generation, which does not require a DB connection. In that case, your pages will be rendered dynamically, but after that, you can still generate static pages using the `pnpx next build --experimental-build-mode generate` command when you have a DB connection.
[Next.js documentation](https://nextjs.org/docs/pages/api-reference/cli/next#next-build-options)
## Opting-out of SSG
You can opt out of SSG by adding this all the route segment files:
```ts
export const dynamic = 'force-dynamic'
```
**Note that it will disable static optimization and your site will be slower**.
More on [Next.js documentation](https://nextjs.org/docs/app/deep-dive/caching#opting-out-2)

View File

@@ -150,7 +150,7 @@ Follow the docs to configure any one of these storage providers. For local devel
## Docker
This is an example of a multi-stage docker build of Payload for production. Ensure you are setting your environment
variables on deployment, like `PAYLOAD_SECRET`, `PAYLOAD_CONFIG_PATH`, and `DATABASE_URI` if needed.
variables on deployment, like `PAYLOAD_SECRET`, `PAYLOAD_CONFIG_PATH`, and `DATABASE_URI` if needed. If you don't want to have a DB connection and your build requires that, learn [here](./building-without-a-db-connection) how to prevent that.
In your Next.js config, set the `output` property `standalone`.

View File

@@ -46,11 +46,12 @@ const config = buildConfig({
The following options are available for Query Presets:
| Option | Description |
| ------------- | ------------------------------------------------------------------------------------------------------------------------------- |
| `access` | Used to define custom collection-level access control that applies to all presets. [More details](#access-control). |
| `constraints` | Used to define custom document-level access control that apply to individual presets. [More details](#document-access-control). |
| `labels` | Custom labels to use for the Query Presets collection. |
| Option | Description |
| ------------------- | ------------------------------------------------------------------------------------------------------------------------------- |
| `access` | Used to define custom collection-level access control that applies to all presets. [More details](#access-control). |
| `filterConstraints` | Used to define which constraints are available to users when managing presets. [More details](#constraint-access-control). |
| `constraints` | Used to define custom document-level access control that apply to individual presets. [More details](#document-access-control). |
| `labels` | Custom labels to use for the Query Presets collection. |
## Access Control
@@ -59,7 +60,7 @@ Query Presets are subject to the same [Access Control](../access-control/overvie
Access Control for Query Presets can be customized in two ways:
1. [Collection Access Control](#collection-access-control): Applies to all presets. These rules are not controllable by the user and are statically defined in the config.
2. [Document Access Control](#document-access-control): Applies to each individual preset. These rules are controllable by the user and are saved to the document.
2. [Document Access Control](#document-access-control): Applies to each individual preset. These rules are controllable by the user and are dynamically defined on each record in the database.
### Collection Access Control
@@ -97,7 +98,7 @@ This example restricts all Query Presets to users with the role of `admin`.
### Document Access Control
You can also define access control rules that apply to each specific preset. Users have the ability to define and modify these rules on the fly as they manage presets. These are saved dynamically in the database on each document.
You can also define access control rules that apply to each specific preset. Users have the ability to define and modify these rules on the fly as they manage presets. These are saved dynamically in the database on each record.
When a user manages a preset, document-level access control options will be available to them in the Admin Panel for each operation.
@@ -150,8 +151,8 @@ const config = buildConfig({
}),
},
],
// highlight-end
},
// highlight-end
},
})
```
@@ -171,3 +172,39 @@ The following options are available for each constraint:
| `value` | The value to store in the database when this constraint is selected. |
| `fields` | An array of fields to render when this constraint is selected. |
| `access` | A function that determines the access control rules for this constraint. |
### Constraint Access Control
Used to dynamically filter which constraints are available based on the current user, document data, or other criteria.
Some examples of this might include:
- Ensuring that only "admins" are allowed to make a preset available to "everyone"
- Preventing the "onlyMe" option from being selected based on a hypothetical "disablePrivatePresets" checkbox
When a user lacks the permission to set a constraint, the option will either be hidden from them, or disabled if it is already saved to that preset.
To do this, you can use the `filterConstraints` property in your [Payload Config](../configuration/overview):
```ts
import { buildConfig } from 'payload'
const config = buildConfig({
// ...
queryPresets: {
// ...
// highlight-start
filterConstraints: ({ req, options }) =>
!req.user?.roles?.includes('admin')
? options.filter(
(option) =>
(typeof option === 'string' ? option : option.value) !==
'everyone',
)
: options,
// highlight-end
},
})
```
The `filterConstraints` function receives the same arguments as [`filterOptions`](../fields/select#filterOptions) in the [Select field](../fields/select).

View File

@@ -738,7 +738,7 @@ Payload supports a method override feature that allows you to send GET requests
### How to Use
To use this feature, include the `X-HTTP-Method-Override` header set to `GET` in your POST request. The parameters should be sent in the body of the request with the `Content-Type` set to `application/x-www-form-urlencoded`.
To use this feature, include the `X-Payload-HTTP-Method-Override` header set to `GET` in your POST request. The parameters should be sent in the body of the request with the `Content-Type` set to `application/x-www-form-urlencoded`.
### Example
@@ -753,7 +753,7 @@ const res = await fetch(`${api}/${collectionSlug}`, {
headers: {
'Accept-Language': i18n.language,
'Content-Type': 'application/x-www-form-urlencoded',
'X-HTTP-Method-Override': 'GET',
'X-Payload-HTTP-Method-Override': 'GET',
},
body: qs.stringify({
depth: 1,

View File

@@ -3,14 +3,8 @@
width: var(--base);
.stroke {
stroke-width: 1px;
stroke-width: 2px;
fill: none;
stroke: currentColor;
}
&:local() {
.stroke {
stroke-width: 2px;
}
}
}

View File

@@ -2,4 +2,4 @@
/// <reference types="next/image-types/global" />
// NOTE: This file should not be edited
// see https://nextjs.org/docs/app/building-your-application/configuring/typescript for more information.
// see https://nextjs.org/docs/app/api-reference/config/typescript for more information.

View File

@@ -3,6 +3,7 @@
"version": "1.0.0",
"description": "Payload Live Preview example.",
"license": "MIT",
"type": "module",
"main": "dist/server.js",
"scripts": {
"build": "cross-env NODE_OPTIONS=--no-deprecation next build",

File diff suppressed because it is too large

View File

@@ -1,8 +1,9 @@
import type { MigrateUpArgs } from '@payloadcms/db-mongodb'
import type { Page } from '../payload-types'
import { DefaultDocumentIDType } from 'payload'
export const home: Partial<Page> = {
export const home = (id: DefaultDocumentIDType): Partial<Page> => ({
slug: 'home',
richText: [
{
@@ -41,11 +42,26 @@ export const home: Partial<Page> = {
{
text: ' you can edit this page in the admin panel and see the changes reflected here in real time.',
},
...(id
? [
{
text: ' To get started, visit ',
},
{
type: 'link',
children: [{ text: 'this page' }],
linkType: 'custom',
newTab: true,
url: `/admin/collections/pages/${id}/preview`,
},
{ text: '.' },
]
: []),
],
},
],
title: 'Home',
}
})
export const examplePage: Partial<Page> = {
slug: 'example-page',
@@ -83,11 +99,18 @@ export async function up({ payload }: MigrateUpArgs): Promise<void> {
data: examplePage as any, // eslint-disable-line
})
const homepageJSON = JSON.parse(JSON.stringify(home))
const { id: homePageID } = await payload.create({
const { id: ogHomePageID } = await payload.create({
collection: 'pages',
data: homepageJSON,
data: {
title: 'Home',
richText: [],
},
})
const { id: homePageID } = await payload.update({
id: ogHomePageID,
collection: 'pages',
data: home(ogHomePageID),
})
await payload.updateGlobal({
@@ -121,7 +144,7 @@ export async function up({ payload }: MigrateUpArgs): Promise<void> {
type: 'custom',
label: 'Dashboard',
reference: undefined,
url: 'http://localhost:3000/admin',
url: '/admin',
},
},
],

View File

@@ -6,10 +6,66 @@
* and re-run `payload generate:types` to regenerate this file.
*/
/**
* Supported timezones in IANA format.
*
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "supportedTimezones".
*/
export type SupportedTimezones =
| 'Pacific/Midway'
| 'Pacific/Niue'
| 'Pacific/Honolulu'
| 'Pacific/Rarotonga'
| 'America/Anchorage'
| 'Pacific/Gambier'
| 'America/Los_Angeles'
| 'America/Tijuana'
| 'America/Denver'
| 'America/Phoenix'
| 'America/Chicago'
| 'America/Guatemala'
| 'America/New_York'
| 'America/Bogota'
| 'America/Caracas'
| 'America/Santiago'
| 'America/Buenos_Aires'
| 'America/Sao_Paulo'
| 'Atlantic/South_Georgia'
| 'Atlantic/Azores'
| 'Atlantic/Cape_Verde'
| 'Europe/London'
| 'Europe/Berlin'
| 'Africa/Lagos'
| 'Europe/Athens'
| 'Africa/Cairo'
| 'Europe/Moscow'
| 'Asia/Riyadh'
| 'Asia/Dubai'
| 'Asia/Baku'
| 'Asia/Karachi'
| 'Asia/Tashkent'
| 'Asia/Calcutta'
| 'Asia/Dhaka'
| 'Asia/Almaty'
| 'Asia/Jakarta'
| 'Asia/Bangkok'
| 'Asia/Shanghai'
| 'Asia/Singapore'
| 'Asia/Tokyo'
| 'Asia/Seoul'
| 'Australia/Brisbane'
| 'Australia/Sydney'
| 'Pacific/Guam'
| 'Pacific/Noumea'
| 'Pacific/Auckland'
| 'Pacific/Fiji';
export interface Config {
auth: {
users: UserAuthOperations;
};
blocks: {};
collections: {
pages: Page;
users: User;

View File

@@ -1,6 +1,6 @@
{
"name": "payload-monorepo",
"version": "3.39.1",
"version": "3.40.0",
"private": true,
"type": "module",
"scripts": {
@@ -84,7 +84,7 @@
"publish-prerelease": "pnpm --filter releaser publish-prerelease",
"reinstall": "pnpm clean:all && pnpm install",
"release": "pnpm --filter releaser release --tag latest",
"runts": "cross-env NODE_OPTIONS=--no-deprecation node --no-deprecation --import @swc-node/register/esm-register",
"runts": "cross-env NODE_OPTIONS=\"--no-deprecation --no-experimental-strip-types\" node --no-deprecation --no-experimental-strip-types --import @swc-node/register/esm-register",
"script:build-template-with-local-pkgs": "pnpm --filter scripts build-template-with-local-pkgs",
"script:gen-templates": "pnpm --filter scripts gen-templates",
"script:gen-templates:build": "pnpm --filter scripts gen-templates --build",
@@ -93,17 +93,19 @@
"script:pack": "pnpm --filter scripts pack-all-to-dest",
"pretest": "pnpm build",
"test": "pnpm test:int && pnpm test:components && pnpm test:e2e",
"test:components": "cross-env NODE_OPTIONS=\" --no-deprecation\" jest --config=jest.components.config.js",
"test:components": "cross-env NODE_OPTIONS=\" --no-deprecation --no-experimental-strip-types\" jest --config=jest.components.config.js",
"test:e2e": "pnpm runts ./test/runE2E.ts",
"test:e2e:debug": "cross-env NODE_OPTIONS=--no-deprecation NODE_NO_WARNINGS=1 PWDEBUG=1 DISABLE_LOGGING=true playwright test",
"test:e2e:headed": "cross-env NODE_OPTIONS=--no-deprecation NODE_NO_WARNINGS=1 DISABLE_LOGGING=true playwright test --headed",
"test:e2e:debug": "cross-env NODE_OPTIONS=\"--no-deprecation --no-experimental-strip-types\" NODE_NO_WARNINGS=1 PWDEBUG=1 DISABLE_LOGGING=true playwright test",
"test:e2e:headed": "cross-env NODE_OPTIONS=\"--no-deprecation --no-experimental-strip-types\" NODE_NO_WARNINGS=1 DISABLE_LOGGING=true playwright test --headed",
"test:e2e:prod": "pnpm prepare-run-test-against-prod && pnpm runts ./test/runE2E.ts --prod",
"test:e2e:prod:ci": "pnpm prepare-run-test-against-prod:ci && pnpm runts ./test/runE2E.ts --prod",
"test:int": "cross-env NODE_OPTIONS=\"--no-deprecation\" NODE_NO_WARNINGS=1 DISABLE_LOGGING=true jest --forceExit --detectOpenHandles --config=test/jest.config.js --runInBand",
"test:int:postgres": "cross-env NODE_OPTIONS=\"--no-deprecation\" NODE_NO_WARNINGS=1 PAYLOAD_DATABASE=postgres DISABLE_LOGGING=true jest --forceExit --detectOpenHandles --config=test/jest.config.js --runInBand",
"test:int:sqlite": "cross-env NODE_OPTIONS=\"--no-deprecation\" NODE_NO_WARNINGS=1 PAYLOAD_DATABASE=sqlite DISABLE_LOGGING=true jest --forceExit --detectOpenHandles --config=test/jest.config.js --runInBand",
"test:e2e:prod:ci:turbo": "pnpm prepare-run-test-against-prod:ci && pnpm runts ./test/runE2E.ts --prod --turbo",
"test:e2e:turbo": "pnpm runts ./test/runE2E.ts --turbo",
"test:int": "cross-env NODE_OPTIONS=\"--no-deprecation --no-experimental-strip-types\" NODE_NO_WARNINGS=1 DISABLE_LOGGING=true jest --forceExit --detectOpenHandles --config=test/jest.config.js --runInBand",
"test:int:postgres": "cross-env NODE_OPTIONS=\"--no-deprecation --no-experimental-strip-types\" NODE_NO_WARNINGS=1 PAYLOAD_DATABASE=postgres DISABLE_LOGGING=true jest --forceExit --detectOpenHandles --config=test/jest.config.js --runInBand",
"test:int:sqlite": "cross-env NODE_OPTIONS=\"--no-deprecation --no-experimental-strip-types\" NODE_NO_WARNINGS=1 PAYLOAD_DATABASE=sqlite DISABLE_LOGGING=true jest --forceExit --detectOpenHandles --config=test/jest.config.js --runInBand",
"test:types": "tstyche",
"test:unit": "cross-env NODE_OPTIONS=\"--no-deprecation\" NODE_NO_WARNINGS=1 DISABLE_LOGGING=true jest --forceExit --detectOpenHandles --config=jest.config.js --runInBand",
"test:unit": "cross-env NODE_OPTIONS=\"--no-deprecation --no-experimental-strip-types\" NODE_NO_WARNINGS=1 DISABLE_LOGGING=true jest --forceExit --detectOpenHandles --config=jest.config.js --runInBand",
"translateNewKeys": "pnpm --filter translations run translateNewKeys"
},
"lint-staged": {
@@ -120,7 +122,7 @@
"devDependencies": {
"@jest/globals": "29.7.0",
"@libsql/client": "0.14.0",
"@next/bundle-analyzer": "15.3.0",
"@next/bundle-analyzer": "15.3.2",
"@payloadcms/db-postgres": "workspace:*",
"@payloadcms/eslint-config": "workspace:*",
"@payloadcms/eslint-plugin": "workspace:*",
@@ -128,9 +130,9 @@
"@playwright/test": "1.50.0",
"@sentry/nextjs": "^8.33.1",
"@sentry/node": "^8.33.1",
"@swc-node/register": "1.10.9",
"@swc/cli": "0.6.0",
"@swc/jest": "0.2.37",
"@swc-node/register": "1.10.10",
"@swc/cli": "0.7.7",
"@swc/jest": "0.2.38",
"@types/fs-extra": "^11.0.2",
"@types/jest": "29.5.12",
"@types/minimist": "1.2.5",
@@ -145,8 +147,6 @@
"cross-env": "7.0.3",
"dotenv": "16.4.7",
"drizzle-kit": "0.28.0",
"drizzle-orm": "0.36.1",
"escape-html": "^1.0.3",
"execa": "5.1.1",
"form-data": "3.0.1",
"fs-extra": "10.1.0",
@@ -156,7 +156,7 @@
"lint-staged": "15.2.7",
"minimist": "1.2.8",
"mongodb-memory-server": "^10",
"next": "15.3.0",
"next": "15.3.2",
"open": "^10.1.0",
"p-limit": "^5.0.0",
"playwright": "1.50.0",
@@ -169,7 +169,7 @@
"shelljs": "0.8.5",
"slash": "3.0.0",
"sort-package-json": "^2.10.0",
"swc-plugin-transform-remove-imports": "3.1.0",
"swc-plugin-transform-remove-imports": "4.0.4",
"tempy": "1.0.1",
"tstyche": "^3.1.1",
"tsx": "4.19.2",
@@ -186,7 +186,6 @@
"copyfiles": "$copyfiles",
"cross-env": "$cross-env",
"dotenv": "$dotenv",
"drizzle-orm": "$drizzle-orm",
"graphql": "^16.8.1",
"mongodb-memory-server": "$mongodb-memory-server",
"react": "$react",

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/admin-bar",
"version": "3.39.1",
"version": "3.40.0",
"description": "An admin bar for React apps using Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "create-payload-app",
"version": "3.39.1",
"version": "3.40.0",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",
@@ -16,6 +16,7 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": false,
"type": "module",
"exports": {
"./types": {
@@ -60,7 +61,7 @@
"dependencies": {
"@clack/prompts": "^0.7.0",
"@sindresorhus/slugify": "^1.1.0",
"@swc/core": "1.10.12",
"@swc/core": "1.11.29",
"arg": "^5.0.0",
"chalk": "^4.1.0",
"comment-json": "^4.2.3",

View File

@@ -16,12 +16,15 @@ export const configurePluginProject = ({
const devPayloadConfigPath = path.resolve(projectDirPath, './dev/payload.config.ts')
const devTsConfigPath = path.resolve(projectDirPath, './dev/tsconfig.json')
const indexTsPath = path.resolve(projectDirPath, './src/index.ts')
const devImportMapPath = path.resolve(projectDirPath, './dev/app/(payload)/admin/importMap.js')
const devPayloadConfig = fse.readFileSync(devPayloadConfigPath, 'utf8')
const devTsConfig = fse.readFileSync(devTsConfigPath, 'utf8')
const indexTs = fse.readFileSync(indexTsPath, 'utf-8')
const devImportMap = fse.readFileSync(devImportMapPath, 'utf-8')
const updatedTsConfig = devTsConfig.replaceAll('plugin-package-name-placeholder', projectName)
const updatedImportMap = devImportMap.replaceAll('plugin-package-name-placeholder', projectName)
let updatedIndexTs = indexTs.replaceAll('plugin-package-name-placeholder', projectName)
const pluginExportVariableName = toCamelCase(projectName)
@@ -43,4 +46,5 @@ export const configurePluginProject = ({
fse.writeFileSync(devPayloadConfigPath, updatedPayloadConfig)
fse.writeFileSync(devTsConfigPath, updatedTsConfig)
fse.writeFileSync(indexTsPath, updatedIndexTs)
fse.writeFileSync(devImportMapPath, updatedImportMap)
}

View File

@@ -10,10 +10,10 @@ import type { CliArgs, DbType, ProjectExample, ProjectTemplate } from '../types.
import { createProject } from './create-project.js'
import { dbReplacements } from './replacements.js'
import { getValidTemplates } from './templates.js'
import { manageEnvFiles } from './manage-env-files.js'
describe('createProject', () => {
let projectDir: string
beforeAll(() => {
// eslint-disable-next-line no-console
console.log = jest.fn()
@@ -63,6 +63,30 @@ describe('createProject', () => {
expect(packageJson.name).toStrictEqual(projectName)
})
it('updates project name in plugin template importMap file', async () => {
const projectName = 'my-custom-plugin'
const template: ProjectTemplate = {
name: 'plugin',
type: 'plugin',
description: 'Template for creating a Payload plugin',
url: 'https://github.com/payloadcms/payload/templates/plugin',
}
await createProject({
cliArgs: { ...args, '--local-template': 'plugin' } as CliArgs,
packageManager,
projectDir,
projectName,
template,
})
const importMapPath = path.resolve(projectDir, './dev/app/(payload)/admin/importMap.js')
const importMapFile = fse.readFileSync(importMapPath, 'utf-8')
expect(importMapFile).not.toContain('plugin-package-name-placeholder')
expect(importMapFile).toContain('my-custom-plugin')
})
it('creates example', async () => {
const projectName = 'custom-server-example'
const example: ProjectExample = {
@@ -155,75 +179,5 @@ describe('createProject', () => {
expect(content).toContain(dbReplacement.configReplacement().join('\n'))
})
})
describe('managing env files', () => {
it('updates .env files without overwriting existing data', async () => {
const envFilePath = path.join(projectDir, '.env')
const envExampleFilePath = path.join(projectDir, '.env.example')
fse.ensureDirSync(projectDir)
fse.ensureFileSync(envFilePath)
fse.ensureFileSync(envExampleFilePath)
const initialEnvContent = `CUSTOM_VAR=custom-value\nDATABASE_URI=old-connection\n`
const initialEnvExampleContent = `CUSTOM_VAR=custom-value\nDATABASE_URI=old-connection\nPAYLOAD_SECRET=YOUR_SECRET_HERE\n`
fse.writeFileSync(envFilePath, initialEnvContent)
fse.writeFileSync(envExampleFilePath, initialEnvExampleContent)
await manageEnvFiles({
cliArgs: {
'--debug': true,
} as CliArgs,
databaseType: 'mongodb',
databaseUri: 'mongodb://localhost:27017/test',
payloadSecret: 'test-secret',
projectDir,
template: undefined,
})
const updatedEnvContent = fse.readFileSync(envFilePath, 'utf-8')
expect(updatedEnvContent).toContain('CUSTOM_VAR=custom-value')
expect(updatedEnvContent).toContain('DATABASE_URI=mongodb://localhost:27017/test')
expect(updatedEnvContent).toContain('PAYLOAD_SECRET=test-secret')
const updatedEnvExampleContent = fse.readFileSync(envExampleFilePath, 'utf-8')
expect(updatedEnvExampleContent).toContain('CUSTOM_VAR=custom-value')
expect(updatedEnvContent).toContain('DATABASE_URI=mongodb://localhost:27017/test')
expect(updatedEnvContent).toContain('PAYLOAD_SECRET=test-secret')
})
it('creates .env and .env.example if they do not exist', async () => {
const envFilePath = path.join(projectDir, '.env')
const envExampleFilePath = path.join(projectDir, '.env.example')
fse.ensureDirSync(projectDir)
if (fse.existsSync(envFilePath)) fse.removeSync(envFilePath)
if (fse.existsSync(envExampleFilePath)) fse.removeSync(envExampleFilePath)
await manageEnvFiles({
cliArgs: {
'--debug': true,
} as CliArgs,
databaseUri: '',
payloadSecret: '',
projectDir,
template: undefined,
})
expect(fse.existsSync(envFilePath)).toBe(true)
expect(fse.existsSync(envExampleFilePath)).toBe(true)
const updatedEnvContent = fse.readFileSync(envFilePath, 'utf-8')
expect(updatedEnvContent).toContain('DATABASE_URI=your-connection-string-here')
expect(updatedEnvContent).toContain('PAYLOAD_SECRET=YOUR_SECRET_HERE')
const updatedEnvExampleContent = fse.readFileSync(envExampleFilePath, 'utf-8')
expect(updatedEnvExampleContent).toContain('DATABASE_URI=your-connection-string-here')
expect(updatedEnvExampleContent).toContain('PAYLOAD_SECRET=YOUR_SECRET_HERE')
})
})
})
})

View File

@@ -144,17 +144,14 @@ export async function createProject(
}
}
// Call manageEnvFiles before initializing Git
if (dbDetails) {
await manageEnvFiles({
cliArgs,
databaseType: dbDetails.type,
databaseUri: dbDetails.dbUri,
payloadSecret: generateSecret(),
projectDir,
template: 'template' in args ? args.template : undefined,
})
}
await manageEnvFiles({
cliArgs,
databaseType: dbDetails?.type,
databaseUri: dbDetails?.dbUri,
payloadSecret: generateSecret(),
projectDir,
template: 'template' in args ? args.template : undefined,
})
// Remove yarn.lock file. This is only desired in Payload Cloud.
const lockPath = path.resolve(projectDir, 'pnpm-lock.yaml')

View File

@@ -0,0 +1,165 @@
import { jest } from '@jest/globals'
import fs from 'fs'
import fse from 'fs-extra'
import * as os from 'node:os'
import path from 'path'
import type { CliArgs } from '../types.js'
import { manageEnvFiles } from './manage-env-files.js'
describe('createProject', () => {
let projectDir: string
let envFilePath = ''
let envExampleFilePath = ''
beforeAll(() => {
// eslint-disable-next-line no-console
console.log = jest.fn()
})
beforeEach(() => {
const tempDirectory = fs.realpathSync(os.tmpdir())
projectDir = `${tempDirectory}/${Math.random().toString(36).substring(7)}`
envFilePath = path.join(projectDir, '.env')
envExampleFilePath = path.join(projectDir, '.env.example')
if (fse.existsSync(envFilePath)) {
fse.removeSync(envFilePath)
}
fse.ensureDirSync(projectDir)
})
afterEach(() => {
if (fse.existsSync(projectDir)) {
fse.rmSync(projectDir, { recursive: true })
}
})
it('generates .env using defaults (not from .env.example)', async () => {
// ensure no .env.example exists so that the default values are used
// the `manageEnvFiles` function will look for .env.example in the file system
if (fse.existsSync(envExampleFilePath)) {
fse.removeSync(envExampleFilePath)
}
await manageEnvFiles({
cliArgs: {
'--debug': true,
} as CliArgs,
databaseUri: '', // omitting this will ensure the default vars are used
payloadSecret: '', // omitting this will ensure the default vars are used
projectDir,
template: undefined,
})
expect(fse.existsSync(envFilePath)).toBe(true)
const updatedEnvContent = fse.readFileSync(envFilePath, 'utf-8')
expect(updatedEnvContent).toBe(
`# Added by Payload\nPAYLOAD_SECRET=YOUR_SECRET_HERE\nDATABASE_URI=your-connection-string-here`,
)
})
it('generates .env from .env.example', async () => {
// create or override the .env.example file with a connection string that will NOT be overridden
fse.ensureFileSync(envExampleFilePath)
fse.writeFileSync(
envExampleFilePath,
`DATABASE_URI=example-connection-string\nCUSTOM_VAR=custom-value\n`,
)
await manageEnvFiles({
cliArgs: {
'--debug': true,
} as CliArgs,
databaseUri: '', // omitting this will ensure the `.env.example` vars are used
payloadSecret: '', // omitting this will ensure the `.env.example` vars are used
projectDir,
template: undefined,
})
expect(fse.existsSync(envFilePath)).toBe(true)
const updatedEnvContent = fse.readFileSync(envFilePath, 'utf-8')
expect(updatedEnvContent).toBe(
`DATABASE_URI=example-connection-string\nCUSTOM_VAR=custom-value\nPAYLOAD_SECRET=YOUR_SECRET_HERE\n# Added by Payload`,
)
})
it('updates existing .env without overriding vars', async () => {
// create an existing .env file with some custom variables that should NOT be overridden
fse.ensureFileSync(envFilePath)
fse.writeFileSync(
envFilePath,
`CUSTOM_VAR=custom-value\nDATABASE_URI=example-connection-string\n`,
)
// create an .env.example file to ensure that its contents DO NOT override existing .env vars
fse.ensureFileSync(envExampleFilePath)
fse.writeFileSync(
envExampleFilePath,
`CUSTOM_VAR=custom-value-2\nDATABASE_URI=example-connection-string-2\n`,
)
await manageEnvFiles({
cliArgs: {
'--debug': true,
} as CliArgs,
databaseUri: '', // omitting this will ensure the `.env` vars are kept
payloadSecret: '', // omitting this will ensure the `.env` vars are kept
projectDir,
template: undefined,
})
expect(fse.existsSync(envFilePath)).toBe(true)
const updatedEnvContent = fse.readFileSync(envFilePath, 'utf-8')
expect(updatedEnvContent).toBe(
`# Added by Payload\nPAYLOAD_SECRET=YOUR_SECRET_HERE\nDATABASE_URI=example-connection-string\nCUSTOM_VAR=custom-value`,
)
})
it('sanitizes .env based on selected database type', async () => {
await manageEnvFiles({
cliArgs: {
'--debug': true,
} as CliArgs,
databaseType: 'mongodb', // this mimics the CLI selection and will be used as the DATABASE_URI
databaseUri: 'mongodb://localhost:27017/test', // this mimics the CLI selection and will be used as the DATABASE_URI
payloadSecret: 'test-secret', // this mimics the CLI selection and will be used as the PAYLOAD_SECRET
projectDir,
template: undefined,
})
const updatedEnvContent = fse.readFileSync(envFilePath, 'utf-8')
expect(updatedEnvContent).toBe(
`# Added by Payload\nPAYLOAD_SECRET=test-secret\nDATABASE_URI=mongodb://localhost:27017/test`,
)
// delete the generated .env file and do it again, but this time, omit the databaseUri to ensure the default is generated
fse.removeSync(envFilePath)
await manageEnvFiles({
cliArgs: {
'--debug': true,
} as CliArgs,
databaseType: 'mongodb', // this mimics the CLI selection and will be used as the DATABASE_URI
databaseUri: '', // omit this to ensure the default is generated based on the selected database type
payloadSecret: 'test-secret',
projectDir,
template: undefined,
})
const updatedEnvContentWithDefault = fse.readFileSync(envFilePath, 'utf-8')
expect(updatedEnvContentWithDefault).toBe(
`# Added by Payload\nPAYLOAD_SECRET=test-secret\nDATABASE_URI=mongodb://127.0.0.1/your-database-name`,
)
})
})

View File

@@ -6,21 +6,42 @@ import type { CliArgs, DbType, ProjectTemplate } from '../types.js'
import { debug, error } from '../utils/log.js'
import { dbChoiceRecord } from './select-db.js'
const updateEnvExampleVariables = (
contents: string,
databaseType: DbType | undefined,
payloadSecret?: string,
databaseUri?: string,
): string => {
const sanitizeEnv = ({
contents,
databaseType,
databaseUri,
payloadSecret,
}: {
contents: string
databaseType: DbType | undefined
databaseUri?: string
payloadSecret?: string
}): string => {
const seenKeys = new Set<string>()
const updatedEnv = contents
// add defaults
let withDefaults = contents
if (
!contents.includes('DATABASE_URI') &&
!contents.includes('POSTGRES_URL') &&
!contents.includes('MONGODB_URI')
) {
withDefaults += '\nDATABASE_URI=your-connection-string-here'
}
if (!contents.includes('PAYLOAD_SECRET')) {
withDefaults += '\nPAYLOAD_SECRET=YOUR_SECRET_HERE'
}
let updatedEnv = withDefaults
.split('\n')
.map((line) => {
if (line.startsWith('#') || !line.includes('=')) {
return line
}
const [key] = line.split('=')
const [key, value] = line.split('=')
if (!key) {
return
@@ -28,6 +49,7 @@ const updateEnvExampleVariables = (
if (key === 'DATABASE_URI' || key === 'POSTGRES_URL' || key === 'MONGODB_URI') {
const dbChoice = databaseType ? dbChoiceRecord[databaseType] : null
if (dbChoice) {
const placeholderUri = databaseUri
? databaseUri
@@ -36,6 +58,8 @@ const updateEnvExampleVariables = (
databaseType === 'vercel-postgres'
? `POSTGRES_URL=${placeholderUri}`
: `DATABASE_URI=${placeholderUri}`
} else {
line = `${key}=${value}`
}
}
@@ -56,6 +80,10 @@ const updateEnvExampleVariables = (
.reverse()
.join('\n')
if (!updatedEnv.includes('# Added by Payload')) {
updatedEnv = `# Added by Payload\n${updatedEnv}`
}
return updatedEnv
}
@@ -63,7 +91,7 @@ const updateEnvExampleVariables = (
export async function manageEnvFiles(args: {
cliArgs: CliArgs
databaseType?: DbType
databaseUri: string
databaseUri?: string
payloadSecret: string
projectDir: string
template?: ProjectTemplate
@@ -77,70 +105,63 @@ export async function manageEnvFiles(args: {
return
}
const envExamplePath = path.join(projectDir, '.env.example')
const pathToEnvExample = path.join(projectDir, '.env.example')
const envPath = path.join(projectDir, '.env')
const emptyEnvContent = `# Added by Payload\nDATABASE_URI=your-connection-string-here\nPAYLOAD_SECRET=YOUR_SECRET_HERE\n`
try {
let updatedExampleContents: string
let exampleEnv: null | string = ''
try {
if (template?.type === 'plugin') {
if (debugFlag) {
debug(`plugin template detected - no .env or .env.example added`)
}
return
}
if (!fs.existsSync(envExamplePath)) {
updatedExampleContents = updateEnvExampleVariables(
emptyEnvContent,
databaseType,
payloadSecret,
databaseUri,
)
// If there's a .env.example file, use it to create or update the .env file
if (fs.existsSync(pathToEnvExample)) {
const envExampleContents = await fs.readFile(pathToEnvExample, 'utf8')
await fs.writeFile(envExamplePath, updatedExampleContents)
if (debugFlag) {
debug(`.env.example file successfully created`)
}
} else {
const envExampleContents = await fs.readFile(envExamplePath, 'utf8')
const mergedEnvs = envExampleContents + '\n' + emptyEnvContent
updatedExampleContents = updateEnvExampleVariables(
mergedEnvs,
exampleEnv = sanitizeEnv({
contents: envExampleContents,
databaseType,
payloadSecret,
databaseUri,
)
payloadSecret,
})
await fs.writeFile(envExamplePath, updatedExampleContents)
if (debugFlag) {
debug(`.env.example file successfully updated`)
debug(`.env.example file successfully read`)
}
}
// If there's no .env file, create it using the .env.example content (if it exists)
if (!fs.existsSync(envPath)) {
const envContent = updateEnvExampleVariables(
emptyEnvContent,
const envContent = sanitizeEnv({
contents: exampleEnv,
databaseType,
payloadSecret,
databaseUri,
)
payloadSecret,
})
await fs.writeFile(envPath, envContent)
if (debugFlag) {
debug(`.env file successfully created`)
}
} else {
// If the .env file already exists, sanitize it as-is
const envContents = await fs.readFile(envPath, 'utf8')
const mergedEnvs = envContents + '\n' + emptyEnvContent
const updatedEnvContents = updateEnvExampleVariables(
mergedEnvs,
const updatedEnvContents = sanitizeEnv({
contents: envContents,
databaseType,
payloadSecret,
databaseUri,
)
payloadSecret,
})
await fs.writeFile(envPath, updatedEnvContents)
if (debugFlag) {
debug(`.env file successfully updated`)
}
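
To make the refactor concrete, here is a usage sketch of the consolidated `sanitizeEnv` helper. It is module-private in the file above, so the import is hypothetical; the input and expected output mirror the "updates existing .env without overriding vars" case in `manage-env-files.spec.ts` shown earlier.

```ts
// Illustrative only — sanitizeEnv is not exported in the diff above, so this import
// is hypothetical. Values mirror the spec expectations.
import { sanitizeEnv } from './manage-env-files.js'

const env = sanitizeEnv({
  contents: 'CUSTOM_VAR=custom-value\nDATABASE_URI=example-connection-string',
  databaseType: undefined,
  databaseUri: '', // empty values keep whatever is already in the file
  payloadSecret: '',
})

// Existing keys are preserved, a PAYLOAD_SECRET default is appended, and the
// "# Added by Payload" banner is prepended. Per the spec above, `env` is:
// # Added by Payload
// PAYLOAD_SECRET=YOUR_SECRET_HERE
// DATABASE_URI=example-connection-string
// CUSTOM_VAR=custom-value
```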

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-mongodb",
"version": "3.39.1",
"version": "3.40.0",
"description": "The officially supported MongoDB database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {
@@ -17,6 +17,7 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": false,
"type": "module",
"exports": {
".": {

View File

@@ -417,7 +417,7 @@ export const sanitizeQueryValue = ({
return buildExistsQuery(
formattedValue,
path,
!['relationship', 'upload'].includes(field.type),
!['checkbox', 'relationship', 'upload'].includes(field.type),
)
}
}
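
In the change above, the third argument passed to `buildExistsQuery` is now `false` for checkbox fields as well, matching relationship and upload fields. A hypothetical Local API query of the kind this affects (the collection slug and checkbox field name are assumptions for illustration):

```ts
const result = await payload.find({
  collection: 'posts', // assumed collection with a checkbox field
  where: {
    featured: {
      exists: true, // exists queries on checkboxes now follow the same path as relationship/upload
    },
  },
})
```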

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-postgres",
"version": "3.39.1",
"version": "3.40.0",
"description": "The officially supported Postgres database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {
@@ -17,6 +17,7 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": false,
"type": "module",
"exports": {
".": {
@@ -85,10 +86,10 @@
"uuid": "10.0.0"
},
"devDependencies": {
"@hyrious/esbuild-plugin-commonjs": "^0.2.4",
"@hyrious/esbuild-plugin-commonjs": "0.2.6",
"@payloadcms/eslint-config": "workspace:*",
"@types/to-snake-case": "1.0.0",
"esbuild": "0.24.2",
"esbuild": "0.25.5",
"payload": "workspace:*"
},
"peerDependencies": {

View File

@@ -51,13 +51,6 @@ export const connect: Connect = async function connect(
) {
const { hotReload } = options
this.schema = {
pgSchema: this.pgSchema,
...this.tables,
...this.relations,
...this.enums,
}
try {
if (!this.pool) {
this.pool = new this.pg.Pool(this.poolOptions)

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-sqlite",
"version": "3.39.1",
"version": "3.40.0",
"description": "The officially supported SQLite database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {
@@ -17,6 +17,7 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": false,
"type": "module",
"exports": {
".": {

View File

@@ -15,11 +15,6 @@ export const connect: Connect = async function connect(
) {
const { hotReload } = options
this.schema = {
...this.tables,
...this.relations,
}
try {
if (!this.client) {
this.client = createClient(this.clientConfig)

View File

@@ -36,4 +36,9 @@ export const init: Init = async function init(this: SQLiteAdapter) {
})
await executeSchemaHooks({ type: 'afterSchemaInit', adapter: this })
this.schema = {
...this.tables,
...this.relations,
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-vercel-postgres",
"version": "3.39.1",
"version": "3.40.0",
"description": "Vercel Postgres adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {
@@ -17,6 +17,7 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": false,
"type": "module",
"exports": {
".": {
@@ -85,11 +86,11 @@
"uuid": "10.0.0"
},
"devDependencies": {
"@hyrious/esbuild-plugin-commonjs": "^0.2.4",
"@hyrious/esbuild-plugin-commonjs": "0.2.6",
"@payloadcms/eslint-config": "workspace:*",
"@types/pg": "8.10.2",
"@types/to-snake-case": "1.0.0",
"esbuild": "0.24.2",
"esbuild": "0.25.5",
"payload": "workspace:*"
},
"peerDependencies": {

View File

@@ -16,13 +16,6 @@ export const connect: Connect = async function connect(
) {
const { hotReload } = options
this.schema = {
pgSchema: this.pgSchema,
...this.tables,
...this.relations,
...this.enums,
}
try {
const logger = this.logger || false

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/drizzle",
"version": "3.39.1",
"version": "3.40.0",
"description": "A library of shared functions used by different payload database adapters",
"homepage": "https://payloadcms.com",
"repository": {
@@ -17,6 +17,7 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": false,
"type": "module",
"exports": {
".": {
@@ -63,6 +64,7 @@
"@libsql/client": "0.14.0",
"@payloadcms/eslint-config": "workspace:*",
"@types/pg": "8.10.2",
"@types/prompts": "^2.4.5",
"@types/to-snake-case": "1.0.0",
"payload": "workspace:*"
},

View File

@@ -1,19 +1,16 @@
import type { Count, SanitizedCollectionConfig } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { Count } from 'payload'
import type { DrizzleAdapter } from './types.js'
import { buildQuery } from './queries/buildQuery.js'
import { getCollection } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export const count: Count = async function count(
this: DrizzleAdapter,
{ collection, locale, req, where: whereArg },
{ collection: collectionSlug, locale, req, where: whereArg = {} },
) {
const collectionConfig: SanitizedCollectionConfig = this.payload.collections[collection].config
const tableName = this.tableNameMap.get(toSnakeCase(collectionConfig.slug))
const { collectionConfig, tableName } = getCollection({ adapter: this, collectionSlug })
const db = await getTransaction(this, req)
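
The `getCollection({ adapter, collectionSlug, versions })`, `getGlobal(...)`, and `getTableQuery(...)` calls introduced in this and the following diffs replace the inline `toSnakeCase` + `tableNameMap` lookups. The actual `utilities/getEntity.ts` is not part of this excerpt, so the following is only a minimal sketch of what `getCollection` plausibly does, reconstructed from the removed lines; the error messages and exact typings are assumptions.

```ts
// Sketch of utilities/getEntity.ts (assumed shape, inferred from the removed lookups).
import toSnakeCase from 'to-snake-case'

import type { DrizzleAdapter } from '../types.js'

export const getCollection = ({
  adapter,
  collectionSlug,
  versions = false,
}: {
  adapter: DrizzleAdapter
  collectionSlug: string
  versions?: boolean
}) => {
  const collectionConfig = adapter.payload.collections[collectionSlug]?.config

  if (!collectionConfig) {
    // assumed error message — the real helper may differ
    throw new Error(`Unknown collection slug: ${collectionSlug}`)
  }

  const baseName = toSnakeCase(collectionConfig.slug)
  const tableName = adapter.tableNameMap.get(
    versions ? `_${baseName}${adapter.versionsSuffix}` : baseName,
  )

  if (!tableName) {
    // assumed error message — the real helper may differ
    throw new Error(`Table for collection "${collectionSlug}" was not found`)
  }

  return { collectionConfig, tableName }
}
```

`getGlobal` and `getTableQuery` presumably follow the same pattern of resolving an entity up front and failing loudly rather than returning `undefined`, which is what lets the call sites drop their repeated lookups under strict type checking.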

View File

@@ -1,24 +1,18 @@
import type { CountGlobalVersions, SanitizedGlobalConfig } from 'payload'
import type { CountGlobalVersions } from 'payload'
import { buildVersionGlobalFields } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { buildQuery } from './queries/buildQuery.js'
import { getGlobal } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export const countGlobalVersions: CountGlobalVersions = async function countGlobalVersions(
this: DrizzleAdapter,
{ global, locale, req, where: whereArg },
{ global: globalSlug, locale, req, where: whereArg = {} },
) {
const globalConfig: SanitizedGlobalConfig = this.payload.globals.config.find(
({ slug }) => slug === global,
)
const tableName = this.tableNameMap.get(
`_${toSnakeCase(globalConfig.slug)}${this.versionsSuffix}`,
)
const { globalConfig, tableName } = getGlobal({ adapter: this, globalSlug, versions: true })
const db = await getTransaction(this, req)

View File

@@ -1,22 +1,22 @@
import type { CountVersions, SanitizedCollectionConfig } from 'payload'
import type { CountVersions } from 'payload'
import { buildVersionCollectionFields } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { buildQuery } from './queries/buildQuery.js'
import { getCollection } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export const countVersions: CountVersions = async function countVersions(
this: DrizzleAdapter,
{ collection, locale, req, where: whereArg },
{ collection: collectionSlug, locale, req, where: whereArg = {} },
) {
const collectionConfig: SanitizedCollectionConfig = this.payload.collections[collection].config
const tableName = this.tableNameMap.get(
`_${toSnakeCase(collectionConfig.slug)}${this.versionsSuffix}`,
)
const { collectionConfig, tableName } = getCollection({
adapter: this,
collectionSlug,
versions: true,
})
const db = await getTransaction(this, req)

View File

@@ -1,10 +1,9 @@
import type { Create } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { upsertRow } from './upsertRow/index.js'
import { getCollection } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export const create: Create = async function create(
@@ -12,15 +11,13 @@ export const create: Create = async function create(
{ collection: collectionSlug, data, req, returning, select },
) {
const db = await getTransaction(this, req)
const collection = this.payload.collections[collectionSlug].config
const tableName = this.tableNameMap.get(toSnakeCase(collection.slug))
const { collectionConfig, tableName } = getCollection({ adapter: this, collectionSlug })
const result = await upsertRow({
adapter: this,
data,
db,
fields: collection.flattenedFields,
fields: collectionConfig.flattenedFields,
ignoreResult: returning === false,
operation: 'create',
req,

View File

@@ -1,20 +1,17 @@
import type { CreateGlobalArgs } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { upsertRow } from './upsertRow/index.js'
import { getGlobal } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export async function createGlobal<T extends Record<string, unknown>>(
this: DrizzleAdapter,
{ slug, data, req, returning }: CreateGlobalArgs,
{ slug: globalSlug, data, req, returning }: CreateGlobalArgs,
): Promise<T> {
const db = await getTransaction(this, req)
const globalConfig = this.payload.globals.config.find((config) => config.slug === slug)
const tableName = this.tableNameMap.get(toSnakeCase(globalConfig.slug))
const { globalConfig, tableName } = getGlobal({ adapter: this, globalSlug })
data.createdAt = new Date().toISOString()
@@ -30,10 +27,10 @@ export async function createGlobal<T extends Record<string, unknown>>(
})
if (returning === false) {
return null
return null as unknown as T
}
result.globalType = slug
result.globalType = globalSlug
return result
}

View File

@@ -2,11 +2,11 @@ import type { CreateGlobalVersionArgs, TypeWithID, TypeWithVersion } from 'paylo
import { sql } from 'drizzle-orm'
import { buildVersionGlobalFields } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { upsertRow } from './upsertRow/index.js'
import { getGlobal } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export async function createGlobalVersion<T extends TypeWithID>(
@@ -25,9 +25,7 @@ export async function createGlobalVersion<T extends TypeWithID>(
}: CreateGlobalVersionArgs,
) {
const db = await getTransaction(this, req)
const global = this.payload.globals.config.find(({ slug }) => slug === globalSlug)
const tableName = this.tableNameMap.get(`_${toSnakeCase(global.slug)}${this.versionsSuffix}`)
const { globalConfig, tableName } = getGlobal({ adapter: this, globalSlug, versions: true })
const result = await upsertRow<TypeWithVersion<T>>({
adapter: this,
@@ -41,7 +39,7 @@ export async function createGlobalVersion<T extends TypeWithID>(
version: versionData,
},
db,
fields: buildVersionGlobalFields(this.payload.config, global, true),
fields: buildVersionGlobalFields(this.payload.config, globalConfig, true),
ignoreResult: returning === false ? 'idOnly' : false,
operation: 'create',
req,
@@ -50,7 +48,7 @@ export async function createGlobalVersion<T extends TypeWithID>(
})
const table = this.tables[tableName]
if (global.versions.drafts) {
if (globalConfig.versions.drafts) {
await this.execute({
db,
sql: sql`

View File

@@ -2,11 +2,11 @@ import type { CreateVersionArgs, TypeWithID, TypeWithVersion } from 'payload'
import { sql } from 'drizzle-orm'
import { buildVersionCollectionFields } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { upsertRow } from './upsertRow/index.js'
import { getCollection } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export async function createVersion<T extends TypeWithID>(
@@ -26,12 +26,13 @@ export async function createVersion<T extends TypeWithID>(
}: CreateVersionArgs<T>,
) {
const db = await getTransaction(this, req)
const collection = this.payload.collections[collectionSlug].config
const defaultTableName = toSnakeCase(collection.slug)
const { collectionConfig, tableName } = getCollection({
adapter: this,
collectionSlug,
versions: true,
})
const tableName = this.tableNameMap.get(`_${defaultTableName}${this.versionsSuffix}`)
const version = { ...versionData }
const version: Partial<TypeWithID> = { ...versionData }
if (version.id) {
delete version.id
}
@@ -51,7 +52,7 @@ export async function createVersion<T extends TypeWithID>(
adapter: this,
data,
db,
fields: buildVersionCollectionFields(this.payload.config, collection, true),
fields: buildVersionCollectionFields(this.payload.config, collectionConfig, true),
operation: 'create',
req,
select,
@@ -60,7 +61,7 @@ export async function createVersion<T extends TypeWithID>(
const table = this.tables[tableName]
if (collection.versions.drafts) {
if (collectionConfig.versions.drafts) {
await this.execute({
db,
sql: sql`

View File

@@ -1,28 +1,26 @@
import type { DeleteMany } from 'payload'
import { inArray } from 'drizzle-orm'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { findMany } from './find/findMany.js'
import { getCollection } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export const deleteMany: DeleteMany = async function deleteMany(
this: DrizzleAdapter,
{ collection, req, where },
{ collection: collectionSlug, req, where },
) {
const db = await getTransaction(this, req)
const collectionConfig = this.payload.collections[collection].config
const tableName = this.tableNameMap.get(toSnakeCase(collectionConfig.slug))
const { collectionConfig, tableName } = getCollection({ adapter: this, collectionSlug })
const result = await findMany({
adapter: this,
fields: collectionConfig.flattenedFields,
joins: false,
limit: 0,
locale: req?.locale,
locale: req?.locale ?? undefined,
page: 1,
pagination: false,
req,
@@ -30,9 +28,9 @@ export const deleteMany: DeleteMany = async function deleteMany(
where,
})
const ids = []
const ids: (number | string)[] = []
result.docs.forEach((data) => {
result.docs.forEach((data: any) => {
ids.push(data.id)
})

View File

@@ -1,7 +1,6 @@
import type { DeleteOne } from 'payload'
import { eq } from 'drizzle-orm'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
@@ -9,6 +8,7 @@ import { buildFindManyArgs } from './find/buildFindManyArgs.js'
import { buildQuery } from './queries/buildQuery.js'
import { selectDistinct } from './queries/selectDistinct.js'
import { transform } from './transform/read/index.js'
import { getCollection, getTableQuery } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export const deleteOne: DeleteOne = async function deleteOne(
@@ -16,16 +16,14 @@ export const deleteOne: DeleteOne = async function deleteOne(
{ collection: collectionSlug, req, returning, select, where: whereArg },
) {
const db = await getTransaction(this, req)
const collection = this.payload.collections[collectionSlug].config
const { collectionConfig, tableName } = getCollection({ adapter: this, collectionSlug })
const tableName = this.tableNameMap.get(toSnakeCase(collection.slug))
let docToDelete: Record<string, unknown>
let docToDelete: Record<string, unknown> | undefined = undefined
const { joins, selectFields, where } = buildQuery({
adapter: this,
fields: collection.flattenedFields,
locale: req?.locale,
fields: collectionConfig.flattenedFields,
locale: req?.locale ?? undefined,
tableName,
where: whereArg,
})
@@ -40,15 +38,17 @@ export const deleteOne: DeleteOne = async function deleteOne(
where,
})
const queryTable = getTableQuery({ adapter: this, tableName })
if (selectDistinctResult?.[0]?.id) {
docToDelete = await db.query[tableName].findFirst({
docToDelete = await queryTable.findFirst({
where: eq(this.tables[tableName].id, selectDistinctResult[0].id),
})
} else {
const findManyArgs = buildFindManyArgs({
adapter: this,
depth: 0,
fields: collection.flattenedFields,
fields: collectionConfig.flattenedFields,
joinQuery: false,
select,
tableName,
@@ -56,7 +56,7 @@ export const deleteOne: DeleteOne = async function deleteOne(
findManyArgs.where = where
docToDelete = await db.query[tableName].findFirst(findManyArgs)
docToDelete = await queryTable.findFirst(findManyArgs)
}
if (!docToDelete) {
@@ -70,7 +70,7 @@ export const deleteOne: DeleteOne = async function deleteOne(
adapter: this,
config: this.payload.config,
data: docToDelete,
fields: collection.flattenedFields,
fields: collectionConfig.flattenedFields,
joinQuery: false,
tableName,
})

View File

@@ -1,24 +1,24 @@
import type { DeleteVersions, SanitizedCollectionConfig } from 'payload'
import type { DeleteVersions } from 'payload'
import { inArray } from 'drizzle-orm'
import { buildVersionCollectionFields } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { findMany } from './find/findMany.js'
import { getCollection } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export const deleteVersions: DeleteVersions = async function deleteVersion(
this: DrizzleAdapter,
{ collection, locale, req, where: where },
{ collection: collectionSlug, locale, req, where: where },
) {
const db = await getTransaction(this, req)
const collectionConfig: SanitizedCollectionConfig = this.payload.collections[collection].config
const tableName = this.tableNameMap.get(
`_${toSnakeCase(collectionConfig.slug)}${this.versionsSuffix}`,
)
const { collectionConfig, tableName } = getCollection({
adapter: this,
collectionSlug,
versions: true,
})
const fields = buildVersionCollectionFields(this.payload.config, collectionConfig, true)
@@ -35,9 +35,9 @@ export const deleteVersions: DeleteVersions = async function deleteVersion(
where,
})
const ids = []
const ids: (number | string)[] = []
docs.forEach((doc) => {
docs.forEach((doc: any) => {
ids.push(doc.id)
})

View File

@@ -1,15 +1,14 @@
import type { Find, SanitizedCollectionConfig } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { Find } from 'payload'
import type { DrizzleAdapter } from './types.js'
import { findMany } from './find/findMany.js'
import { getCollection } from './utilities/getEntity.js'
export const find: Find = async function find(
this: DrizzleAdapter,
{
collection,
collection: collectionSlug,
draftsEnabled,
joins,
limit,
@@ -22,11 +21,9 @@ export const find: Find = async function find(
where,
},
) {
const collectionConfig: SanitizedCollectionConfig = this.payload.collections[collection].config
const { collectionConfig, tableName } = getCollection({ adapter: this, collectionSlug })
const sort = sortArg !== undefined && sortArg !== null ? sortArg : collectionConfig.defaultSort
const tableName = this.tableNameMap.get(toSnakeCase(collectionConfig.slug))
return findMany({
adapter: this,
collectionSlug: collectionConfig.slug,

View File

@@ -44,7 +44,7 @@ export const buildFindManyArgs = ({
select,
tableName,
versions,
}: BuildFindQueryArgs): Record<string, unknown> => {
}: BuildFindQueryArgs): any => {
const result: Result = {
extras: {},
with: {},

View File

@@ -1,18 +1,15 @@
import type { FindGlobal } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { findMany } from './find/findMany.js'
import { getGlobal } from './utilities/getEntity.js'
export const findGlobal: FindGlobal = async function findGlobal(
this: DrizzleAdapter,
{ slug, locale, req, select, where },
{ slug: globalSlug, locale, req, select, where },
) {
const globalConfig = this.payload.globals.config.find((config) => config.slug === slug)
const tableName = this.tableNameMap.get(toSnakeCase(globalConfig.slug))
const { globalConfig, tableName } = getGlobal({ adapter: this, globalSlug })
const {
docs: [doc],
@@ -29,7 +26,7 @@ export const findGlobal: FindGlobal = async function findGlobal(
})
if (doc) {
doc.globalType = slug
doc.globalType = globalSlug
return doc
}

View File

@@ -1,25 +1,19 @@
import type { FindGlobalVersions, SanitizedGlobalConfig } from 'payload'
import type { FindGlobalVersions } from 'payload'
import { buildVersionGlobalFields } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { findMany } from './find/findMany.js'
import { getGlobal } from './utilities/getEntity.js'
export const findGlobalVersions: FindGlobalVersions = async function findGlobalVersions(
this: DrizzleAdapter,
{ global, limit, locale, page, pagination, req, select, skip, sort: sortArg, where },
{ global: globalSlug, limit, locale, page, pagination, req, select, skip, sort: sortArg, where },
) {
const globalConfig: SanitizedGlobalConfig = this.payload.globals.config.find(
({ slug }) => slug === global,
)
const { globalConfig, tableName } = getGlobal({ adapter: this, globalSlug, versions: true })
const sort = sortArg !== undefined && sortArg !== null ? sortArg : '-createdAt'
const tableName = this.tableNameMap.get(
`_${toSnakeCase(globalConfig.slug)}${this.versionsSuffix}`,
)
const fields = buildVersionGlobalFields(this.payload.config, globalConfig, true)
return findMany({

View File

@@ -1,22 +1,19 @@
import type { FindOneArgs, SanitizedCollectionConfig, TypeWithID } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { FindOneArgs, TypeWithID } from 'payload'
import type { DrizzleAdapter } from './types.js'
import { findMany } from './find/findMany.js'
import { getCollection } from './utilities/getEntity.js'
export async function findOne<T extends TypeWithID>(
this: DrizzleAdapter,
{ collection, draftsEnabled, joins, locale, req, select, where }: FindOneArgs,
{ collection: collectionSlug, draftsEnabled, joins, locale, req, select, where }: FindOneArgs,
): Promise<T> {
const collectionConfig: SanitizedCollectionConfig = this.payload.collections[collection].config
const tableName = this.tableNameMap.get(toSnakeCase(collectionConfig.slug))
const { collectionConfig, tableName } = getCollection({ adapter: this, collectionSlug })
const { docs } = await findMany({
adapter: this,
collectionSlug: collection,
collectionSlug,
draftsEnabled,
fields: collectionConfig.flattenedFields,
joins,

View File

@@ -1,23 +1,34 @@
import type { FindVersions, SanitizedCollectionConfig } from 'payload'
import type { FindVersions } from 'payload'
import { buildVersionCollectionFields } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { findMany } from './find/findMany.js'
import { getCollection } from './utilities/getEntity.js'
export const findVersions: FindVersions = async function findVersions(
this: DrizzleAdapter,
{ collection, limit, locale, page, pagination, req, select, skip, sort: sortArg, where },
{
collection: collectionSlug,
limit,
locale,
page,
pagination,
req,
select,
skip,
sort: sortArg,
where,
},
) {
const collectionConfig: SanitizedCollectionConfig = this.payload.collections[collection].config
const { collectionConfig, tableName } = getCollection({
adapter: this,
collectionSlug,
versions: true,
})
const sort = sortArg !== undefined && sortArg !== null ? sortArg : collectionConfig.defaultSort
const tableName = this.tableNameMap.get(
`_${toSnakeCase(collectionConfig.slug)}${this.versionsSuffix}`,
)
const fields = buildVersionCollectionFields(this.payload.config, collectionConfig, true)
return findMany({

View File

@@ -1,4 +1,4 @@
import type { Payload } from 'payload'
import type { JsonObject, Payload, TypeWithID } from 'payload'
import {
commitTransaction,
@@ -32,7 +32,7 @@ export const migrate: DrizzleAdapter['migrate'] = async function migrate(
}
let latestBatch = 0
let migrationsInDB = []
let migrationsInDB: (JsonObject & TypeWithID)[] = []
const hasMigrationTable = await migrationTableExists(this)

View File

@@ -52,7 +52,7 @@ export async function migrateDown(this: DrizzleAdapter): Promise<void> {
const tableExists = await migrationTableExists(this, db)
if (tableExists) {
if (tableExists && migration.id) {
await payload.delete({
id: migration.id,
collection: 'payload-migrations',

View File

@@ -48,7 +48,7 @@ export async function migrateReset(this: DrizzleAdapter): Promise<void> {
})
const tableExists = await migrationTableExists(this, db)
if (tableExists) {
if (tableExists && migration.id) {
await payload.delete({
id: migration.id,
collection: 'payload-migrations',
@@ -58,7 +58,10 @@ export async function migrateReset(this: DrizzleAdapter): Promise<void> {
await commitTransaction(req)
} catch (err: unknown) {
let msg = `Error running migration ${migrationFile.name}.`
let msg = `Error running migration`
if (migrationFile) {
msg += ` ${migrationFile.name}.`
}
if (err instanceof Error) {
msg += ` ${err.message}`

View File

@@ -1,3 +1,5 @@
import type { MigrationData } from 'payload'
import { Table } from 'console-table-printer'
import { getMigrations, readMigrationFiles } from 'payload'
@@ -13,7 +15,7 @@ export async function migrateStatus(this: DrizzleAdapter): Promise<void> {
msg: `Found ${migrationFiles.length} migration files.`,
})
let existingMigrations = []
let existingMigrations: MigrationData[] = []
const hasMigrationTable = await migrationTableExists(this)
if (hasMigrationTable) {

View File

@@ -35,4 +35,11 @@ export const init: Init = async function init(this: BasePostgresAdapter) {
})
await executeSchemaHooks({ type: 'afterSchemaInit', adapter: this })
this.schema = {
pgSchema: this.pgSchema,
...this.tables,
...this.relations,
...this.enums,
}
}

View File

@@ -1,27 +1,39 @@
import type { QueryDrafts, SanitizedCollectionConfig } from 'payload'
import type { QueryDrafts } from 'payload'
import { buildVersionCollectionFields, combineQueries } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { findMany } from './find/findMany.js'
import { getCollection } from './utilities/getEntity.js'
export const queryDrafts: QueryDrafts = async function queryDrafts(
this: DrizzleAdapter,
{ collection, joins, limit, locale, page = 1, pagination, req, select, sort, where },
{
collection: collectionSlug,
joins,
limit,
locale,
page = 1,
pagination,
req,
select,
sort,
where,
},
) {
const collectionConfig: SanitizedCollectionConfig = this.payload.collections[collection].config
const tableName = this.tableNameMap.get(
`_${toSnakeCase(collectionConfig.slug)}${this.versionsSuffix}`,
)
const { collectionConfig, tableName } = getCollection({
adapter: this,
collectionSlug,
versions: true,
})
const fields = buildVersionCollectionFields(this.payload.config, collectionConfig, true)
const combinedWhere = combineQueries({ latest: { equals: true } }, where)
const combinedWhere = combineQueries({ latest: { equals: true } }, where ?? {})
const result = await findMany({
adapter: this,
collectionSlug: collection,
collectionSlug,
fields,
joins,
limit,
@@ -38,7 +50,7 @@ export const queryDrafts: QueryDrafts = async function queryDrafts(
return {
...result,
docs: result.docs.map((doc) => {
docs: result.docs.map((doc: any) => {
doc = {
id: doc.parent,
...doc.version,

View File

@@ -7,14 +7,6 @@ export const withDefault = (column: RawColumn, field: FieldAffectingData): RawCo
return column
}
if (typeof field.defaultValue === 'string' && field.defaultValue.includes("'")) {
const escapedString = field.defaultValue.replaceAll("'", "''")
return {
...column,
default: escapedString,
}
}
return {
...column,
default: field.defaultValue,

View File

@@ -1,21 +1,19 @@
import type { UpdateGlobalArgs } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { upsertRow } from './upsertRow/index.js'
import { getGlobal, getTableQuery } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export async function updateGlobal<T extends Record<string, unknown>>(
this: DrizzleAdapter,
{ slug, data, req, returning, select }: UpdateGlobalArgs,
{ slug: globalSlug, data, req, returning, select }: UpdateGlobalArgs,
): Promise<T> {
const db = await getTransaction(this, req)
const globalConfig = this.payload.globals.config.find((config) => config.slug === slug)
const tableName = this.tableNameMap.get(toSnakeCase(globalConfig.slug))
const existingGlobal = await db.query[tableName].findFirst({})
const { globalConfig, tableName } = getGlobal({ adapter: this, globalSlug })
const queryTable = getTableQuery({ adapter: this, tableName })
const existingGlobal = await queryTable.findFirst({})
const result = await upsertRow<{ globalType: string } & T>({
...(existingGlobal ? { id: existingGlobal.id, operation: 'update' } : { operation: 'create' }),
@@ -30,10 +28,11 @@ export async function updateGlobal<T extends Record<string, unknown>>(
})
if (returning === false) {
// @ts-expect-error dont want to change public api response type
return null
}
result.globalType = slug
result.globalType = globalSlug
return result
}

View File

@@ -6,19 +6,19 @@ import type {
} from 'payload'
import { buildVersionGlobalFields } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { buildQuery } from './queries/buildQuery.js'
import { upsertRow } from './upsertRow/index.js'
import { getGlobal } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export async function updateGlobalVersion<T extends TypeWithID>(
this: DrizzleAdapter,
{
id,
global,
global: globalSlug,
locale,
req,
returning,
@@ -28,15 +28,9 @@ export async function updateGlobalVersion<T extends TypeWithID>(
}: UpdateGlobalVersionArgs<T>,
) {
const db = await getTransaction(this, req)
const globalConfig: SanitizedGlobalConfig = this.payload.globals.config.find(
({ slug }) => slug === global,
)
const { globalConfig, tableName } = getGlobal({ adapter: this, globalSlug, versions: true })
const whereToUse = whereArg || { id: { equals: id } }
const tableName = this.tableNameMap.get(
`_${toSnakeCase(globalConfig.slug)}${this.versionsSuffix}`,
)
const fields = buildVersionGlobalFields(this.payload.config, globalConfig, true)
const { where } = buildQuery({

View File

@@ -1,11 +1,10 @@
import type { UpdateJobs, Where } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { findMany } from './find/findMany.js'
import { upsertRow } from './upsertRow/index.js'
import { getCollection } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export const updateJobs: UpdateJobs = async function updateMany(
@@ -15,18 +14,20 @@ export const updateJobs: UpdateJobs = async function updateMany(
if (!(data?.log as object[])?.length) {
delete data.log
}
const whereToUse: Where = id ? { id: { equals: id } } : whereArg
const whereToUse: Where = id ? { id: { equals: id } } : (whereArg ?? {})
const limit = id ? 1 : limitArg
const db = await getTransaction(this, req)
const collection = this.payload.collections['payload-jobs'].config
const tableName = this.tableNameMap.get(toSnakeCase(collection.slug))
const sort = sortArg !== undefined && sortArg !== null ? sortArg : collection.defaultSort
const { collectionConfig, tableName } = getCollection({
adapter: this,
collectionSlug: 'payload-jobs',
})
const sort = sortArg !== undefined && sortArg !== null ? sortArg : collectionConfig.defaultSort
const jobs = await findMany({
adapter: this,
collectionSlug: 'payload-jobs',
fields: collection.flattenedFields,
fields: collectionConfig.flattenedFields,
limit: id ? 1 : limit,
pagination: false,
req,
@@ -52,7 +53,7 @@ export const updateJobs: UpdateJobs = async function updateMany(
adapter: this,
data: updateData,
db,
fields: collection.flattenedFields,
fields: collectionConfig.flattenedFields,
ignoreResult: returning === false,
operation: 'update',
req,

View File

@@ -8,6 +8,7 @@ import type { DrizzleAdapter } from './types.js'
import { buildQuery } from './queries/buildQuery.js'
import { selectDistinct } from './queries/selectDistinct.js'
import { upsertRow } from './upsertRow/index.js'
import { getCollection } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export const updateMany: UpdateMany = async function updateMany(
@@ -26,14 +27,13 @@ export const updateMany: UpdateMany = async function updateMany(
},
) {
const db = await getTransaction(this, req)
const collection = this.payload.collections[collectionSlug].config
const tableName = this.tableNameMap.get(toSnakeCase(collection.slug))
const { collectionConfig, tableName } = getCollection({ adapter: this, collectionSlug })
const sort = sortArg !== undefined && sortArg !== null ? sortArg : collection.defaultSort
const sort = sortArg !== undefined && sortArg !== null ? sortArg : collectionConfig.defaultSort
const { joins, orderBy, selectFields, where } = buildQuery({
adapter: this,
fields: collection.flattenedFields,
fields: collectionConfig.flattenedFields,
locale,
sort,
tableName,
@@ -90,7 +90,7 @@ export const updateMany: UpdateMany = async function updateMany(
adapter: this,
data,
db,
fields: collection.flattenedFields,
fields: collectionConfig.flattenedFields,
ignoreResult: returning === false,
joinQuery,
operation: 'update',

View File

@@ -8,6 +8,7 @@ import type { DrizzleAdapter } from './types.js'
import { buildQuery } from './queries/buildQuery.js'
import { selectDistinct } from './queries/selectDistinct.js'
import { upsertRow } from './upsertRow/index.js'
import { getCollection } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export const updateOne: UpdateOne = async function updateOne(
@@ -25,17 +26,16 @@ export const updateOne: UpdateOne = async function updateOne(
},
) {
const db = await getTransaction(this, req)
const collection = this.payload.collections[collectionSlug].config
const tableName = this.tableNameMap.get(toSnakeCase(collection.slug))
const { collectionConfig, tableName } = getCollection({ adapter: this, collectionSlug })
let idToUpdate = id
if (!idToUpdate) {
const { joins, selectFields, where } = buildQuery({
adapter: this,
fields: collection.flattenedFields,
fields: collectionConfig.flattenedFields,
locale,
tableName,
where: whereArg,
where: whereArg ?? {},
})
// selectDistinct will only return if there are joins
@@ -71,7 +71,7 @@ export const updateOne: UpdateOne = async function updateOne(
adapter: this,
data,
db,
fields: collection.flattenedFields,
fields: collectionConfig.flattenedFields,
ignoreResult: returning === false,
joinQuery,
operation: 'update',

View File

@@ -1,24 +1,19 @@
import type {
SanitizedCollectionConfig,
TypeWithID,
TypeWithVersion,
UpdateVersionArgs,
} from 'payload'
import type { TypeWithID, TypeWithVersion, UpdateVersionArgs } from 'payload'
import { buildVersionCollectionFields } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from './types.js'
import { buildQuery } from './queries/buildQuery.js'
import { upsertRow } from './upsertRow/index.js'
import { getCollection } from './utilities/getEntity.js'
import { getTransaction } from './utilities/getTransaction.js'
export async function updateVersion<T extends TypeWithID>(
this: DrizzleAdapter,
{
id,
collection,
collection: collectionSlug,
locale,
req,
returning,
@@ -28,11 +23,12 @@ export async function updateVersion<T extends TypeWithID>(
}: UpdateVersionArgs<T>,
) {
const db = await getTransaction(this, req)
const collectionConfig: SanitizedCollectionConfig = this.payload.collections[collection].config
const { collectionConfig, tableName } = getCollection({
adapter: this,
collectionSlug,
versions: true,
})
const whereToUse = whereArg || { id: { equals: id } }
const tableName = this.tableNameMap.get(
`_${toSnakeCase(collectionConfig.slug)}${this.versionsSuffix}`,
)
const fields = buildVersionCollectionFields(this.payload.config, collectionConfig, true)

View File

@@ -22,6 +22,6 @@ export const deleteExistingArrayRows = async ({
await adapter.deleteWhere({
db,
tableName,
where: and(...whereConstraints),
where: and(...whereConstraints)!,
})
}

View File

@@ -49,7 +49,7 @@ export const deleteExistingRowsByPath = async ({
await adapter.deleteWhere({
db,
tableName,
where: and(...whereConstraints),
where: and(...whereConstraints)!,
})
}
@@ -63,7 +63,7 @@ export const deleteExistingRowsByPath = async ({
await adapter.deleteWhere({
db,
tableName,
where: and(...whereConstraints),
where: and(...whereConstraints)!,
})
}
}

View File

@@ -9,6 +9,7 @@ import type { Args } from './types.js'
import { buildFindManyArgs } from '../find/buildFindManyArgs.js'
import { transform } from '../transform/read/index.js'
import { transformForWrite } from '../transform/write/index.js'
import { getTableQuery } from '../utilities/getEntity.js'
import { deleteExistingArrayRows } from './deleteExistingArrayRows.js'
import { deleteExistingRowsByPath } from './deleteExistingRowsByPath.js'
import { insertArrays } from './insertArrays.js'
@@ -43,7 +44,7 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
})
// First, we insert the main row
let insertedRow: Record<string, unknown>
let insertedRow: Record<string, unknown> | undefined
try {
if (operation === 'update') {
@@ -86,7 +87,9 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
// If there are locale rows with data, add the parent and locale to each
if (Object.keys(rowToInsert.locales).length > 0) {
Object.entries(rowToInsert.locales).forEach(([locale, localeRow]) => {
localeRow._parentID = insertedRow.id
if (insertedRow) {
localeRow._parentID = insertedRow.id
}
localeRow._locale = locale
localesToInsert.push(localeRow)
})
@@ -95,7 +98,9 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
// If there are relationships, add parent to each
if (rowToInsert.relationships.length > 0) {
rowToInsert.relationships.forEach((relation) => {
relation.parent = insertedRow.id
if (insertedRow) {
relation.parent = insertedRow.id
}
relationsToInsert.push(relation)
})
}
@@ -103,7 +108,9 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
// If there are texts, add parent to each
if (rowToInsert.texts.length > 0) {
rowToInsert.texts.forEach((textRow) => {
textRow.parent = insertedRow.id
if (insertedRow) {
textRow.parent = insertedRow.id
}
textsToInsert.push(textRow)
})
}
@@ -111,7 +118,9 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
// If there are numbers, add parent to each
if (rowToInsert.numbers.length > 0) {
rowToInsert.numbers.forEach((numberRow) => {
numberRow.parent = insertedRow.id
if (insertedRow) {
numberRow.parent = insertedRow.id
}
numbersToInsert.push(numberRow)
})
}
@@ -124,10 +133,12 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
selectRows.forEach((row) => {
if (typeof row.parent === 'undefined') {
row.parent = insertedRow.id
if (insertedRow) {
row.parent = insertedRow.id
}
}
selectsToInsert[selectTableName].push(row)
selectsToInsert[selectTableName]?.push(row)
})
})
}
@@ -135,8 +146,10 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
// If there are blocks, add parent to each, and then
// store by table name and rows
Object.keys(rowToInsert.blocks).forEach((tableName) => {
rowToInsert.blocks[tableName].forEach((blockRow) => {
blockRow.row._parentID = insertedRow.id
rowToInsert.blocks[tableName]?.forEach((blockRow) => {
if (insertedRow) {
blockRow.row._parentID = insertedRow.id
}
if (!blocksToInsert[tableName]) {
blocksToInsert[tableName] = []
}
@@ -155,7 +168,7 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
const localeTableName = `${tableName}${adapter.localesSuffix}`
const localeTable = adapter.tables[`${tableName}${adapter.localesSuffix}`]
if (operation === 'update') {
if (operation === 'update' && insertedRow) {
await adapter.deleteWhere({
db,
tableName: localeTableName,
@@ -176,7 +189,7 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
const relationshipsTableName = `${tableName}${adapter.relationshipsSuffix}`
if (operation === 'update') {
if (operation === 'update' && insertedRow) {
await deleteExistingRowsByPath({
adapter,
db,
@@ -203,7 +216,7 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
const textsTableName = `${tableName}_texts`
if (operation === 'update') {
if (operation === 'update' && insertedRow) {
await deleteExistingRowsByPath({
adapter,
db,
@@ -230,7 +243,7 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
const numbersTableName = `${tableName}_numbers`
if (operation === 'update') {
if (operation === 'update' && insertedRow) {
await deleteExistingRowsByPath({
adapter,
db,
@@ -257,7 +270,7 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
const insertedBlockRows: Record<string, Record<string, unknown>[]> = {}
if (operation === 'update') {
if (operation === 'update' && insertedRow) {
for (const tableName of rowToInsert.blocksToDelete) {
const blockTable = adapter.tables[tableName]
await adapter.deleteWhere({
@@ -279,7 +292,9 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
})
insertedBlockRows[tableName].forEach((row, i) => {
blockRows[i].row = row
if (blockRows[i]) {
blockRows[i].row = row
}
if (
typeof row._uuid === 'string' &&
(typeof row.id === 'string' || typeof row.id === 'number')
@@ -290,20 +305,23 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
const blockLocaleIndexMap: number[] = []
const blockLocaleRowsToInsert = blockRows.reduce((acc, blockRow, i) => {
if (Object.entries(blockRow.locales).length > 0) {
Object.entries(blockRow.locales).forEach(([blockLocale, blockLocaleData]) => {
if (Object.keys(blockLocaleData).length > 0) {
blockLocaleData._parentID = blockRow.row.id
blockLocaleData._locale = blockLocale
acc.push(blockLocaleData)
blockLocaleIndexMap.push(i)
}
})
}
const blockLocaleRowsToInsert = blockRows.reduce<Record<string, unknown>[]>(
(acc, blockRow, i) => {
if (Object.entries(blockRow.locales).length > 0) {
Object.entries(blockRow.locales).forEach(([blockLocale, blockLocaleData]) => {
if (Object.keys(blockLocaleData).length > 0) {
blockLocaleData._parentID = blockRow.row.id
blockLocaleData._locale = blockLocale
acc.push(blockLocaleData)
blockLocaleIndexMap.push(i)
}
})
}
return acc
}, [])
return acc
},
[],
)
if (blockLocaleRowsToInsert.length > 0) {
await adapter.insert({
@@ -326,7 +344,7 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
// INSERT ARRAYS RECURSIVELY
// //////////////////////////////////
if (operation === 'update') {
if (operation === 'update' && insertedRow) {
for (const arrayTableName of Object.keys(rowToInsert.arrays)) {
await deleteExistingArrayRows({
adapter,
@@ -337,13 +355,15 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
}
}
await insertArrays({
adapter,
arrays: [rowToInsert.arrays],
db,
parentRows: [insertedRow],
uuidMap: arraysBlocksUUIDMap,
})
if (insertedRow) {
await insertArrays({
adapter,
arrays: [rowToInsert.arrays],
db,
parentRows: [insertedRow],
uuidMap: arraysBlocksUUIDMap,
})
}
// //////////////////////////////////
// INSERT hasMany SELECTS
@@ -351,7 +371,7 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
for (const [selectTableName, tableRows] of Object.entries(selectsToInsert)) {
const selectTable = adapter.tables[selectTableName]
if (operation === 'update') {
if (operation === 'update' && insertedRow) {
await adapter.deleteWhere({
db,
tableName: selectTableName,
@@ -379,18 +399,20 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
// //////////////////////////////////
// Error Handling
// //////////////////////////////////
} catch (error) {
if (error.code === '23505') {
} catch (error: unknown) {
if (error && typeof error === 'object' && 'code' in error && error.code === '23505') {
const constraint =
('constraint' in error && typeof error.constraint === 'string' && error.constraint) || ''
let fieldName: null | string = null
// We need to try and find the right constraint for the field but if we can't we fallback to a generic message
if (adapter.fieldConstraints?.[tableName]) {
if (adapter.fieldConstraints[tableName]?.[error.constraint]) {
fieldName = adapter.fieldConstraints[tableName]?.[error.constraint]
if (adapter.fieldConstraints[tableName]?.[constraint]) {
fieldName = adapter.fieldConstraints[tableName]?.[constraint]
} else {
const replacement = `${tableName}_`
if (error.constraint.includes(replacement)) {
const replacedConstraint = error.constraint.replace(replacement, '')
if (constraint.includes(replacement)) {
const replacedConstraint = constraint.replace(replacement, '')
if (replacedConstraint && adapter.fieldConstraints[tableName]?.[replacedConstraint]) {
fieldName = adapter.fieldConstraints[tableName][replacedConstraint]
@@ -401,14 +423,15 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
if (!fieldName) {
// Last case scenario we extract the key and value from the detail on the error
const detail = error.detail
const detail = ('detail' in error && typeof error.detail === 'string' && error.detail) || ''
// eslint-disable-next-line regexp/no-unused-capturing-group
const regex = /Key \(([^)]+)\)=\(([^)]+)\)/
const match = detail.match(regex)
if (match) {
const key = match[1]
fieldName = key
fieldName = key ?? null
}
}
@@ -418,7 +441,7 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
errors: [
{
message: req?.t ? req.t('error:valueMustBeUnique') : 'Value must be unique',
path: fieldName,
path: fieldName ?? '',
},
],
req,
@@ -430,7 +453,7 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
}
}
if (ignoreResult === 'idOnly') {
if (ignoreResult === 'idOnly' && insertedRow) {
return { id: insertedRow.id } as T
}
@@ -451,9 +474,12 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
tableName,
})
findManyArgs.where = eq(adapter.tables[tableName].id, insertedRow.id)
if (insertedRow) {
findManyArgs.where = eq(adapter.tables[tableName].id, insertedRow.id)
}
const doc = await db.query[tableName].findFirst(findManyArgs)
const queryTable = getTableQuery({ adapter, tableName })
const doc = await queryTable.findFirst(findManyArgs)
// //////////////////////////////////
// TRANSFORM DATA
@@ -462,7 +488,7 @@ export const upsertRow = async <T extends Record<string, unknown> | TypeWithID>(
const result = transform<T>({
adapter,
config: adapter.payload.config,
data: doc,
data: doc ?? {},
fields,
joinQuery: false,
tableName,

View File
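
The `catch (error: unknown)` change above is the usual strict-mode pattern: narrow the unknown value before touching driver-specific fields. A minimal standalone sketch, reusing the `code`/`constraint`/`detail` names and the `Key (...)=(...)` regex from the hunk above; the helper itself is illustrative and not part of the adapter.

```ts
// Sketch: narrowing an `unknown` caught error before reading
// Postgres-specific fields, mirroring the upsertRow error handling above.
type PgUniqueViolation = { code: '23505'; constraint?: string; detail?: string }

const isUniqueViolation = (error: unknown): error is PgUniqueViolation =>
  Boolean(
    error &&
      typeof error === 'object' &&
      'code' in error &&
      (error as { code?: unknown }).code === '23505',
  )

export const uniqueViolationField = (error: unknown): null | string => {
  if (!isUniqueViolation(error)) {
    return null
  }
  // Prefer the constraint name; otherwise parse the detail string,
  // e.g. `Key (slug)=(home) already exists.`
  const match = (error.detail ?? '').match(/Key \(([^)]+)\)=\(([^)]+)\)/)
  return error.constraint ?? match?.[1] ?? null
}
```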

@@ -42,26 +42,26 @@ export const insertArrays = async ({
}
}
const parentID = parentRows[parentRowIndex].id
const parentID = parentRows[parentRowIndex]?.id
// Add any sub arrays that need to be created
// We will call this recursively below
arrayRows.forEach((arrayRow) => {
if (Object.keys(arrayRow.arrays).length > 0) {
rowsByTable[tableName].arrays.push(arrayRow.arrays)
rowsByTable[tableName]?.arrays.push(arrayRow.arrays)
}
// Set up parent IDs for both row and locale row
arrayRow.row._parentID = parentID
rowsByTable[tableName].rows.push(arrayRow.row)
rowsByTable[tableName]?.rows.push(arrayRow.row)
Object.entries(arrayRow.locales).forEach(([arrayRowLocale, arrayRowLocaleData]) => {
arrayRowLocaleData._parentID = arrayRow.row.id
arrayRowLocaleData._locale = arrayRowLocale
rowsByTable[tableName].locales.push(arrayRowLocaleData)
rowsByTable[tableName]?.locales.push(arrayRowLocaleData)
if (!arrayRow.row.id) {
arrayRowLocaleData._getParentID = (rows: { _uuid: string; id: number }[]) => {
const { id } = rows.find((each) => each._uuid === arrayRow.row._uuid)
const { id } = rows.find((each) => each._uuid === arrayRow.row._uuid) ?? {}
return id
}
}
@@ -74,7 +74,7 @@ export const insertArrays = async ({
// (one insert per array table)
for (const [tableName, row] of Object.entries(rowsByTable)) {
// the nested arrays need the ID for the parentID foreign key
let insertedRows: Args['parentRows']
let insertedRows: Args['parentRows'] | null = null
if (row.rows.length > 0) {
insertedRows = await adapter.insert({
db,
@@ -94,7 +94,7 @@ export const insertArrays = async ({
// Insert locale rows
if (adapter.tables[`${tableName}${adapter.localesSuffix}`] && row.locales.length > 0) {
if (!row.locales[0]._parentID) {
if (!row.locales[0]?._parentID) {
row.locales = row.locales.map((localeRow) => {
if (typeof localeRow._getParentID === 'function') {
localeRow._parentID = localeRow._getParentID(insertedRows)
@@ -111,7 +111,7 @@ export const insertArrays = async ({
}
// If there are sub arrays, call this function recursively
if (row.arrays.length > 0) {
if (row.arrays.length > 0 && insertedRows) {
await insertArrays({
adapter,
arrays: row.arrays,

View File
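
Most of the optional chaining added to `insertArrays` above is what `noUncheckedIndexedAccess` demands: indexing into a `Record` now yields `T | undefined`. A minimal sketch under that flag; `rowsByTable`, the key, and the row shape are simplified stand-ins, not the real structures.

```ts
// Sketch of the effect of `noUncheckedIndexedAccess` on record indexing.
type RowGroup = { rows: Record<string, unknown>[] }
const rowsByTable: Record<string, RowGroup> = {}

const tableName = 'pages_blocks_hero' // hypothetical key

// With the flag enabled, rowsByTable[tableName] is `RowGroup | undefined`,
// so plain property access no longer compiles:
// rowsByTable[tableName].rows.push({})

// Either guard explicitly...
const group = rowsByTable[tableName]
if (group) {
  group.rows.push({ id: 1 })
}

// ...or use optional chaining, as the diff above does:
rowsByTable[tableName]?.rows.push({ id: 2 })
```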

@@ -1,5 +1,5 @@
export const appendPrefixToObjectKeys = <T>(obj: Record<string, unknown>, prefix: string): T =>
Object.entries(obj).reduce((res, [key, val]) => {
Object.entries(obj).reduce((res: any, [key, val]) => {
res[`${prefix}_${key}`] = val
return res
}, {} as T)

View File

@@ -34,11 +34,11 @@ export const buildCreateMigration = ({
const drizzleJsonAfter = await generateDrizzleJson(this.schema)
const [yyymmdd, hhmmss] = new Date().toISOString().split('T')
const formattedDate = yyymmdd.replace(/\D/g, '')
const formattedTime = hhmmss.split('.')[0].replace(/\D/g, '')
let imports: string = ''
let downSQL: string
let upSQL: string
const formattedDate = yyymmdd?.replace(/\D/g, '')
const formattedTime = hhmmss?.split('.')[0]?.replace(/\D/g, '')
let imports: string | undefined = ''
let downSQL: string | undefined
let upSQL: string | undefined
;({ downSQL, imports, upSQL } = await getPredefinedMigration({
dirname,
file,

View File

@@ -9,7 +9,11 @@ export const createBlocksMap = (data: Record<string, unknown>): BlocksMap => {
if (key.startsWith('_blocks_') && Array.isArray(rows)) {
let blockType = key.replace('_blocks_', '')
const parsed = blockType.split('_')
if (parsed.length === 2 && Number.isInteger(Number(parsed[1]))) {
if (
parsed.length === 2 &&
Number.isInteger(Number(parsed[1])) &&
typeof parsed[0] === 'string'
) {
blockType = parsed[0]
}
@@ -20,7 +24,7 @@ export const createBlocksMap = (data: Record<string, unknown>): BlocksMap => {
}
row.blockType = blockType
blocksMap[row._path].push(row)
blocksMap[row._path]?.push(row)
delete row._path
}
@@ -30,7 +34,7 @@ export const createBlocksMap = (data: Record<string, unknown>): BlocksMap => {
}
})
Object.entries(blocksMap).reduce((sortedBlocksMap, [path, blocks]) => {
Object.entries(blocksMap).reduce((sortedBlocksMap: any, [path, blocks]) => {
sortedBlocksMap[path] = blocks.sort((a, b) => {
if (typeof a._order === 'number' && typeof b._order === 'number') {
return a._order - b._order

View File

@@ -75,7 +75,7 @@ export const createSchemaGenerator = ({
let schemaDeclaration: null | string = null
if (this.schemaName) {
if (this.schemaName && schemaImport) {
addImport(corePackage, schemaImport)
schemaDeclaration = `export const db_schema = ${schemaImport}('${this.schemaName}')`
}
@@ -113,11 +113,18 @@ export const createSchemaGenerator = ({
for (const tableName in this.rawTables) {
const table = this.rawTables[tableName]
if (!table) {
continue
}
const extrasDeclarations: string[] = []
if (table.indexes) {
for (const key in table.indexes) {
const index = table.indexes[key]
if (!index) {
continue
}
let indexDeclaration = `${sanitizeObjectKey(key)}: ${index.unique ? 'uniqueIndex' : 'index'}('${index.name}')`
indexDeclaration += `.on(${typeof index.on === 'string' ? `${accessProperty('columns', index.on)}` : `${index.on.map((on) => `${accessProperty('columns', on)}`).join(', ')}`}),`
extrasDeclarations.push(indexDeclaration)
@@ -127,7 +134,9 @@ export const createSchemaGenerator = ({
if (table.foreignKeys) {
for (const key in table.foreignKeys) {
const foreignKey = table.foreignKeys[key]
if (!foreignKey) {
continue
}
let foreignKeyDeclaration = `${sanitizeObjectKey(key)}: foreignKey({
columns: [${foreignKey.columns.map((col) => `columns['${col}']`).join(', ')}],
foreignColumns: [${foreignKey.foreignColumns.map((col) => `${accessProperty(col.table, col.name)}`).join(', ')}],
@@ -179,10 +188,16 @@ ${Object.entries(table.columns)
for (const tableName in this.rawRelations) {
const relations = this.rawRelations[tableName]
if (!relations) {
continue
}
const properties: string[] = []
for (const key in relations) {
const relation = relations[key]
if (!relation) {
continue
}
let declaration: string
if (relation.type === 'one') {
@@ -221,7 +236,7 @@ ${Object.entries(table.columns)
relationsDeclarations.push(declaration)
}
if (enumDeclarations.length && !this.schemaName) {
if (enumDeclarations.length && !this.schemaName && enumImport) {
addImport(corePackage, enumImport)
}
@@ -229,6 +244,9 @@ ${Object.entries(table.columns)
for (const moduleName in importDeclarations) {
const moduleImports = importDeclarations[moduleName]
if (!moduleImports) {
continue
}
importDeclarationsSanitized.push(
`import { ${Array.from(moduleImports).join(', ')} } from '${moduleName}'`,

View File

@@ -26,7 +26,9 @@ type Args = {
/**
* Extends the passed table with additional columns / extra config
*/
export const extendDrizzleTable = ({ columns, extraConfig, table }: Args): void => {
export const extendDrizzleTable = ({ columns, extraConfig, table: tableFromArgs }: Args): void => {
// hard drizzle types
const table: any = tableFromArgs
const InlineForeignKeys = Object.getOwnPropertySymbols(table).find((symbol) => {
return symbol.description?.includes('InlineForeignKeys')
})
@@ -53,7 +55,7 @@ export const extendDrizzleTable = ({ columns, extraConfig, table }: Args): void
if (extraConfig) {
const originalExtraConfigBuilder = table[DrizzleSymbol.ExtraConfigBuilder]
table[DrizzleSymbol.ExtraConfigBuilder] = (t) => {
table[DrizzleSymbol.ExtraConfigBuilder] = (t: any) => {
return {
...originalExtraConfigBuilder(t),
...extraConfig(t),

View File

@@ -0,0 +1,90 @@
import type { LibSQLDatabase } from 'drizzle-orm/libsql'
import type { RelationalQueryBuilder } from 'drizzle-orm/sqlite-core/query-builders/query'
import type { CollectionSlug, GlobalSlug } from 'payload'
import { APIError } from 'payload'
import toSnakeCase from 'to-snake-case'
import type { DrizzleAdapter } from '../types.js'
export const getCollection = ({
adapter,
collectionSlug,
versions = false,
}: {
adapter: DrizzleAdapter
collectionSlug: CollectionSlug
versions?: boolean
}) => {
const collection = adapter.payload.collections[collectionSlug]
if (!collection) {
throw new APIError(`Collection with the slug ${collectionSlug} was not found in the config.`)
}
const tableNameKey = versions
? `_${toSnakeCase(collection.config.slug)}${adapter.versionsSuffix}`
: toSnakeCase(collection.config.slug)
const tableName = adapter.tableNameMap.get(tableNameKey)
if (!tableName) {
throw new APIError(
`Table for collection with the slug ${collectionSlug} ${tableName} was not found.`,
)
}
return {
collectionConfig: collection.config,
tableName,
}
}
export const getGlobal = ({
adapter,
globalSlug,
versions = false,
}: {
adapter: DrizzleAdapter
globalSlug: GlobalSlug
versions?: boolean
}) => {
const globalConfig = adapter.payload.config.globals.find((each) => each.slug === globalSlug)
if (!globalConfig) {
throw new APIError(`Global with the slug ${globalSlug} was not found in the config.`)
}
const tableNameKey = versions
? `_${toSnakeCase(globalConfig.slug)}${adapter.versionsSuffix}`
: toSnakeCase(globalConfig.slug)
const tableName = adapter.tableNameMap.get(tableNameKey)
if (!tableName) {
throw new APIError(`Table for global with the slug ${globalSlug} ${tableName} was not found.`)
}
return {
globalConfig,
tableName,
}
}
export const getTableQuery = ({
adapter,
tableName,
}: {
adapter: DrizzleAdapter
tableName: string
}) => {
const drizzle = adapter.drizzle
// @ts-expect-error we don't have drizzle schema types
const table = drizzle.query[tableName] as RelationalQueryBuilder<any, any, any, any> | undefined
if (!table) {
throw new APIError(`Table with the name ${tableName} was not found.`)
}
return table
}

View File
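
The new file above centralizes slug-to-table resolution and throws an `APIError` instead of returning `undefined`. A hedged usage sketch; the import paths, the `adapter` argument, and the `'posts'` slug are assumptions for illustration.

```ts
import type { DrizzleAdapter } from '../types.js' // path follows the imports in the new file above
import { getCollection, getTableQuery } from './getEntity.js' // file name is an assumption

export const findFirstPost = async (adapter: DrizzleAdapter) => {
  // Resolves the collection config and its snake_cased table name,
  // throwing an APIError if either is missing
  const { collectionConfig, tableName } = getCollection({ adapter, collectionSlug: 'posts' })

  // Resolves the relational query builder for that table, again throwing
  // rather than returning undefined
  const queryTable = getTableQuery({ adapter, tableName })
  const doc = await queryTable.findFirst()

  return { collectionConfig, doc }
}
```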

@@ -14,10 +14,10 @@ export const getMigrationTemplate = ({
}: MigrationTemplateArgs): string => `import { MigrateUpArgs, MigrateDownArgs, sql } from '${packageName}'
${imports ? `${imports}\n` : ''}
export async function up({ db, payload, req }: MigrateUpArgs): Promise<void> {
${indent(upSQL)}
${upSQL ? indent(upSQL) : ''}
}
export async function down({ db, payload, req }: MigrateDownArgs): Promise<void> {
${indent(downSQL)}
${downSQL ? indent(downSQL) : ''}
}
`

View File

@@ -1,9 +1,9 @@
import type { Table } from 'drizzle-orm'
export const getNameFromDrizzleTable = (table: Table): string => {
const symbol = Object.getOwnPropertySymbols(table).find((symb) =>
symb.description.includes('Name'),
const symbol = Object.getOwnPropertySymbols(table).find(
(symb) => symb && symb.description?.includes('Name'),
)
return table[symbol]
return (table as any)[symbol!]
}

View File

@@ -4,7 +4,7 @@ import { fieldAffectsData, fieldHasSubFields, fieldShouldBeLocalized } from 'pay
export const hasLocalesTable = ({
fields,
parentIsLocalized,
parentIsLocalized = false,
}: {
fields: Field[]
/**

View File

@@ -6,11 +6,11 @@ export const isPolymorphicRelationship = (
relationTo: CollectionSlug
value: number | string
} => {
return (
return Boolean(
value &&
typeof value === 'object' &&
'relationTo' in value &&
typeof value.relationTo === 'string' &&
'value' in value
typeof value === 'object' &&
'relationTo' in value &&
typeof value.relationTo === 'string' &&
'value' in value,
)
}

View File
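
The `Boolean(...)` wrapper above exists because a user-defined type guard must return `boolean`, while an `&&` chain can evaluate to a falsy non-boolean operand such as `null`. A standalone sketch of the same pattern; the type below only mirrors the shape in the diff.

```ts
type PolymorphicRel = { relationTo: string; value: number | string }

// Without Boolean(), the `&&` chain could evaluate to `null` (or another
// falsy operand), which is not assignable to the declared boolean
// type-predicate return type under strict checks.
export const isPolymorphicRel = (value: unknown): value is PolymorphicRel =>
  Boolean(
    value &&
      typeof value === 'object' &&
      'relationTo' in value &&
      typeof (value as { relationTo?: unknown }).relationTo === 'string' &&
      'value' in value,
  )

// isPolymorphicRel({ relationTo: 'posts', value: 1 }) // true
// isPolymorphicRel(null)                              // false
```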

@@ -31,5 +31,5 @@ export const migrationTableExists = async (
const [row] = result.rows
return row && typeof row === 'object' && 'exists' in row && !!row.exists
return Boolean(row && typeof row === 'object' && 'exists' in row && !!row.exists)
}

View File

@@ -35,14 +35,14 @@ export const pushDevSchema = async (adapter: DrizzleAdapter) => {
return
} else {
previousSchema.localeCodes = localeCodes
previousSchema.localeCodes = localeCodes || null
previousSchema.rawTables = adapter.rawTables
}
}
const { pushSchema } = adapter.requireDrizzleKit()
const { extensions = {}, tablesFilter } = adapter as BasePostgresAdapter
const { extensions = {}, tablesFilter } = adapter as unknown as BasePostgresAdapter
// This will prompt if clarifications are needed for Drizzle to push new schema
const { apply, hasDataLoss, warnings } = await pushSchema(

View File

@@ -9,5 +9,7 @@ export const rawConstraint = (value: unknown) => ({
})
export const isRawConstraint = (value: unknown): value is ReturnType<typeof rawConstraint> => {
return value && typeof value === 'object' && 'type' in value && value.type === RawConstraintSymbol
return Boolean(
value && typeof value === 'object' && 'type' in value && value.type === RawConstraintSymbol,
)
}

View File

@@ -27,7 +27,7 @@ const getFlattenedFieldNames = (args: {
prefix?: string
}): { localized?: boolean; name: string }[] => {
const { fields, parentIsLocalized, prefix = '' } = args
return fields.reduce((fieldsToUse, field) => {
return fields.reduce<{ localized?: boolean; name: string }[]>((fieldsToUse, field) => {
let fieldPrefix = prefix
if (
@@ -43,7 +43,8 @@ const getFlattenedFieldNames = (args: {
...fieldsToUse,
...getFlattenedFieldNames({
fields: field.fields,
parentIsLocalized: parentIsLocalized || ('localized' in field && field.localized),
parentIsLocalized:
(parentIsLocalized || ('localized' in field && field.localized)) ?? false,
prefix: fieldPrefix,
}),
]
@@ -52,7 +53,7 @@ const getFlattenedFieldNames = (args: {
if (field.type === 'tabs') {
return [
...fieldsToUse,
...field.tabs.reduce((tabFields, tab) => {
...field.tabs.reduce<{ localized?: boolean; name: string }[]>((tabFields, tab) => {
fieldPrefix = 'name' in tab ? `${prefix}_${tab.name}` : prefix
return [
...tabFields,
@@ -60,7 +61,7 @@ const getFlattenedFieldNames = (args: {
? [{ ...tab, type: 'tab' }]
: getFlattenedFieldNames({
fields: tab.fields,
parentIsLocalized: parentIsLocalized || tab.localized,
parentIsLocalized: (parentIsLocalized || tab.localized) ?? false,
prefix: fieldPrefix,
})),
]
@@ -119,13 +120,13 @@ export const validateExistingBlockIsIdentical = ({
export const InternalBlockTableNameIndex = Symbol('InternalBlockTableNameIndex')
export const setInternalBlockIndex = (block: FlattenedBlock, index: number) => {
block[InternalBlockTableNameIndex] = index
;(block as any)[InternalBlockTableNameIndex] = index
}
export const resolveBlockTableName = (block: FlattenedBlock, originalTableName: string) => {
if (!block[InternalBlockTableNameIndex]) {
if (!(block as any)[InternalBlockTableNameIndex]) {
return originalTableName
}
return `${originalTableName}_${block[InternalBlockTableNameIndex]}`
return `${originalTableName}_${(block as any)[InternalBlockTableNameIndex]}`
}

View File

@@ -1,9 +1,4 @@
{
"extends": "../../tsconfig.base.json",
"compilerOptions": {
/* TODO: remove the following lines */
"strict": false,
"noUncheckedIndexedAccess": false,
},
"references": [{ "path": "../payload" }, { "path": "../translations" }]
}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/email-nodemailer",
"version": "3.39.1",
"version": "3.40.0",
"description": "Payload Nodemailer Email Adapter",
"homepage": "https://payloadcms.com",
"repository": {
@@ -17,6 +17,7 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": false,
"type": "module",
"exports": {
".": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/email-resend",
"version": "3.39.1",
"version": "3.40.0",
"description": "Payload Resend Email Adapter",
"homepage": "https://payloadcms.com",
"repository": {
@@ -17,6 +17,7 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": false,
"type": "module",
"exports": {
".": {

View File

@@ -36,7 +36,7 @@
"eslint-plugin-jest-dom": "5.5.0",
"eslint-plugin-jsx-a11y": "6.10.2",
"eslint-plugin-perfectionist": "3.9.1",
"eslint-plugin-react-compiler": "19.0.0-beta-e993439-20250405",
"eslint-plugin-react-compiler": "19.1.0-rc.2",
"eslint-plugin-react-hooks": "0.0.0-experimental-d331ba04-20250307",
"eslint-plugin-regexp": "2.7.0",
"globals": "16.0.0",

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/graphql",
"version": "3.39.1",
"version": "3.40.0",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",
@@ -16,6 +16,7 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": false,
"type": "module",
"exports": {
".": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview-react",
"version": "3.39.1",
"version": "3.40.0",
"description": "The official React SDK for Payload Live Preview",
"homepage": "https://payloadcms.com",
"repository": {
@@ -17,6 +17,7 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": false,
"type": "module",
"exports": {
".": {

View File

@@ -2,18 +2,27 @@
import { ready, subscribe, unsubscribe } from '@payloadcms/live-preview'
import { useCallback, useEffect, useRef, useState } from 'react'
// To prevent the flicker of missing data on initial load,
// you can pass in the initial page data from the server
// To prevent the flicker of stale data while the post message is being sent,
// you can conditionally render loading UI based on the `isLoading` state
export const useLivePreview = <T extends Record<string, unknown>>(props: {
/**
* This is a React hook to implement {@link https://payloadcms.com/docs/live-preview/overview Payload Live Preview}.
*
* @link https://payloadcms.com/docs/live-preview/frontend
*/
// NOTE: cannot use Record<string, unknown> here bc generated interfaces will not satisfy the type constraint
export const useLivePreview = <T extends Record<string, any>>(props: {
apiRoute?: string
depth?: number
/**
* To prevent the flicker of missing data on initial load,
* you can pass in the initial page data from the server.
*/
initialData: T
serverURL: string
}): {
data: T
/**
* To prevent the flicker of stale data while the post message is being sent,
* you can conditionally render loading UI based on the `isLoading` state.
*/
isLoading: boolean
} => {
const { apiRoute, depth, initialData, serverURL } = props

View File
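
A hedged usage sketch of the `useLivePreview` hook documented above, matching the `initialData`/`serverURL` props and the `data`/`isLoading` return shape shown in the diff; the `Page` type, the component, and the server URL are assumptions.

```tsx
'use client'
import { useLivePreview } from '@payloadcms/live-preview-react'
import React from 'react'

type Page = { content?: string; title: string } // assumed document shape

export const PagePreview: React.FC<{ initialPage: Page }> = ({ initialPage }) => {
  const { data, isLoading } = useLivePreview<Page>({
    depth: 2,
    initialData: initialPage, // server-fetched doc prevents a flash of missing data
    serverURL: 'http://localhost:3000', // assumed Payload server URL
  })

  // Conditionally render loading UI while the latest post message is applied
  if (isLoading) {
    return <p>Loading preview…</p>
  }

  return <h1>{data.title}</h1>
}
```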

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview-vue",
"version": "3.39.1",
"version": "3.40.0",
"description": "The official Vue SDK for Payload Live Preview",
"homepage": "https://payloadcms.com",
"repository": {
@@ -17,6 +17,7 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": false,
"type": "module",
"exports": {
".": {

View File

@@ -4,17 +4,25 @@ import { ready, subscribe, unsubscribe } from '@payloadcms/live-preview'
import { onMounted, onUnmounted, ref } from 'vue'
/**
* Vue composable to implement Payload CMS Live Preview.
* This is a Vue composable to implement {@link https://payloadcms.com/docs/live-preview/overview Payload Live Preview}.
*
* {@link https://payloadcms.com/docs/live-preview/frontend View the documentation}
* @link https://payloadcms.com/docs/live-preview/frontend
*/
export const useLivePreview = <T extends Record<string, unknown>>(props: {
export const useLivePreview = <T extends Record<string, any>>(props: {
apiRoute?: string
depth?: number
/**
* To prevent the flicker of missing data on initial load,
* you can pass in the initial page data from the server.
*/
initialData: T
serverURL: string
}): {
data: Ref<T>
/**
* To prevent the flicker of stale data while the post message is being sent,
* you can conditionally render loading UI based on the `isLoading` state.
*/
isLoading: Ref<boolean>
} => {
const { apiRoute, depth, initialData, serverURL } = props

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview",
"version": "3.39.1",
"version": "3.40.0",
"description": "The official live preview JavaScript SDK for Payload",
"homepage": "https://payloadcms.com",
"repository": {
@@ -17,6 +17,7 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": false,
"type": "module",
"exports": {
".": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/next",
"version": "3.39.1",
"version": "3.40.0",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",
@@ -16,6 +16,10 @@
"url": "https://payloadcms.com"
}
],
"sideEffects": [
"*.scss",
"*.css"
],
"type": "module",
"exports": {
".": {
@@ -104,22 +108,22 @@
"uuid": "10.0.0"
},
"devDependencies": {
"@babel/cli": "7.26.4",
"@babel/core": "7.26.7",
"@babel/preset-env": "7.26.7",
"@babel/preset-react": "7.26.3",
"@babel/preset-typescript": "7.26.0",
"@next/eslint-plugin-next": "15.3.0",
"@babel/cli": "7.27.2",
"@babel/core": "7.27.3",
"@babel/preset-env": "7.27.2",
"@babel/preset-react": "7.27.1",
"@babel/preset-typescript": "7.27.1",
"@next/eslint-plugin-next": "15.3.2",
"@payloadcms/eslint-config": "workspace:*",
"@types/busboy": "1.5.4",
"@types/react": "19.1.0",
"@types/react-dom": "19.1.2",
"@types/uuid": "10.0.0",
"babel-plugin-react-compiler": "19.0.0-beta-e993439-20250405",
"esbuild": "0.24.2",
"babel-plugin-react-compiler": "19.1.0-rc.2",
"esbuild": "0.25.5",
"esbuild-sass-plugin": "3.3.1",
"payload": "workspace:*",
"swc-plugin-transform-remove-imports": "3.1.0"
"swc-plugin-transform-remove-imports": "4.0.4"
},
"peerDependencies": {
"graphql": "^16.8.1",

View File

@@ -1,3 +1,5 @@
@import '~@payloadcms/ui/scss';
@layer payload-default {
.doc-tab {
display: flex;

View File

@@ -1,4 +1,4 @@
@import '../../../scss/styles.scss';
@import '~@payloadcms/ui/scss';
@layer payload-default {
.doc-tabs {

View File

@@ -62,11 +62,31 @@ export const DocumentTabs: React.FC<{
(condition &&
Boolean(condition({ collectionConfig, config, globalConfig, permissions })))
const path = viewConfig && 'path' in viewConfig ? viewConfig.path : ''
if (meetsCondition) {
if (tabFromConfig?.Component) {
return RenderServerComponent({
clientProps: {
path,
} satisfies DocumentTabClientProps,
Component: tabFromConfig.Component,
importMap: payload.importMap,
key: `tab-${index}`,
serverProps: {
collectionConfig,
globalConfig,
i18n,
payload,
permissions,
} satisfies DocumentTabServerPropsOnly,
})
}
return (
<DocumentTab
key={`tab-${index}`}
path={viewConfig && 'path' in viewConfig ? viewConfig.path : ''}
path={path}
{...{
...props,
...(tab || {}),

View File
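
The `DocumentTabs` change above renders `tabFromConfig.Component` through `RenderServerComponent` whenever one is configured. A hedged sketch of what such a config might look like; the exact `admin.components.views.edit.default.tab` path and the component file location are assumptions, not confirmed by this diff.

```ts
import type { CollectionConfig } from 'payload'

export const Pages: CollectionConfig = {
  slug: 'pages',
  admin: {
    components: {
      views: {
        edit: {
          default: {
            tab: {
              // Resolved via the import map and rendered on the server
              Component: '/components/CustomDefaultTab#CustomDefaultTab',
            },
          },
        },
      },
    },
  },
  fields: [{ name: 'title', type: 'text' }],
}
```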

@@ -1,4 +1,4 @@
@import '../../scss/styles.scss';
@import '~@payloadcms/ui/scss';
@layer payload-default {
.doc-header {

Some files were not shown because too many files have changed in this diff.