Compare commits


93 Commits

Author SHA1 Message Date
Elliot DeNolf
c68189788c chore(release): v3.0.0-beta.39 [skip ci] 2024-05-30 14:26:03 -04:00
Jacob Fletcher
e603c83f55 fix(next): ssr live preview was not dispatching document save events (#6572) 2024-05-30 14:20:22 -04:00
Dan Ribbens
edfa85bcd5 feat(db-postgres)!: relationship column (#6339)
BREAKING CHANGE:

Moves `upload` fields, and `relationship` fields with `hasMany: false` &
`relationTo: string`, from the many-to-many `_rels` join table to simple
columns. This only affects Postgres database users.

## TL;DR

We have dramatically simplified how simple relationships are stored in
relational databases to boost performance and align with more conventional
relational paradigms. If you are using the beta Postgres adapter and need
to keep existing simple-relationship data, you'll need to run a migration
script that we provide.

### Background

For example, prior to this update, a collection of "posts" with a simple
`hasMany: false` and `relationTo: 'categories'` field would have a
`posts_rels` table where the category relations would be stored.

This was somewhat unnecessary, as simple relations like this can be
expressed with a `category_id` column configured as a foreign key. The
join table also added complexity when dealing directly with the database
if all you have are simple relations.
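
For instance, a has-one field like the one sketched below (illustrative collection and field names, not taken from this PR) previously produced rows in `posts_rels` and is now stored as a plain `category_id` column with a foreign key:

```ts
// Hypothetical collection config for illustration only.
// Note: the exact type import path may differ between beta versions
// ('payload' vs 'payload/types').
import type { CollectionConfig } from 'payload'

export const Posts: CollectionConfig = {
  slug: 'posts',
  fields: [
    {
      name: 'category',
      type: 'relationship',
      relationTo: 'categories', // single collection
      hasMany: false, // the default
    },
  ],
}

// With this change, Postgres stores the relation as a `category_id`
// column on the `posts` table instead of a row in `posts_rels`.
```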

### Who needs to migrate

You need to migrate if you are using the beta Postgres database adapter
and any of the following applies to you:

- You have versions enabled on any collection / global
- You use the `upload` field
- You have relationship fields with `hasMany: false` (the default) and
`relationTo` set to a single collection ([has
one](https://payloadcms.com/docs/fields/relationship#has-one) relationships)

### We have a migration for you

Even though the Postgres adapter is in beta, we've prepared a predefined
migration that works out of the box, so you can easily migrate from an
earlier version of the adapter to the most recent one.

It makes the schema changes in step with actually moving the data from
the old locations to the new ones, before adding any null constraints and
dropping the old columns and tables.

### How to migrate

The steps to preserve your data while making this update are as follows.
These steps are the same whether you are moving from Payload v2 to v3 or
a previous version of v3 beta to the most recent v3 beta.

**Important: during these steps, don't start the dev server unless you
have `push: false` set on your Postgres adapter.**
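
If you're unsure where that option lives, it is set on the Postgres adapter itself. A minimal sketch (the connection string env var is a placeholder):

```ts
import { postgresAdapter } from '@payloadcms/db-postgres'

export const db = postgresAdapter({
  pool: {
    connectionString: process.env.DATABASE_URI || '', // placeholder
  },
  // Prevents dev mode from pushing schema changes straight to the
  // database while you work through the migration steps below.
  push: false,
})
```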

#### Step 1 - backup

Always back up your database before performing big changes, especially
in production cases.

#### Step 2 - create a pre-update migration

Before updating to the new Payload and Postgres adapter versions, run
`payload migrate:create` without any other config changes so that you have
a snapshot of the schema from the previous adapter version.

#### Step 3 - if you're migrating a dev DB, delete the dev `push` row from your `payload_migrations` table

If you're migrating a dev database where you use the default setting of
pushing schema changes directly to your DB, and you need to preserve the
data in that development database, you must delete the `dev` migration
record from your database.

Connect directly to your database in any tool you'd like and delete the
dev push record from the `payload_migrations` table using the following
SQL statement:

```sql
DELETE FROM payload_migrations WHERE batch = -1;
```

#### Step 4 - update Payload and Postgres versions to most recent

Update packages, making sure you have matching versions across all
`@payloadcms/*` and `payload` packages (including
`@payloadcms/db-postgres`).

#### Step 5 - create the predefined migration

Run the following command to create the predefined migration we've
provided:

```
payload migrate:create --file @payloadcms/db-postgres/relationships-v2-v3
```

#### Step 6 - migrate!

Run migrations with the following command: 

```
payload migrate
```

Assuming the migration worked, you can proceed to commit this change and
distribute it to be run on all other environments.

Note that if two servers connect to the same database, only one should
be running migrations to avoid transaction conflicts.

Related discussion:
https://github.com/payloadcms/payload/discussions/4163

---------

Co-authored-by: James <james@trbl.design>
Co-authored-by: PatrikKozak <patrik@payloadcms.com>
2024-05-30 14:09:11 -04:00
Elliot DeNolf
b86d4c647f chore(release): v3.0.0-beta.38 [skip ci] 2024-05-30 11:24:12 -04:00
Elliot DeNolf
4884f0d297 fix(cpa): safer command exists check (#6569)
- Use execa to check if command exists
- Remove third-party dep
2024-05-30 11:11:40 -04:00
Patrik
f1db24e303 fix(ui): adjusts sizing of remove/add buttons to be same size (#6529)
## Description

V2 PR [here](https://github.com/payloadcms/payload/pull/6527)

- [x] I have read and understand the
[CONTRIBUTING.md](https://github.com/payloadcms/payload/blob/main/CONTRIBUTING.md)
document in this repository.

## Type of change

- [x] Bug fix (non-breaking change which fixes an issue)

## Checklist:

- [x] Existing test suite passes locally with my changes
2024-05-30 09:42:25 -04:00
Patrik
7f15147286 fix: ui field validation error with admin.disableListColumn property (#6531)
## Description

V2 PR [here](https://github.com/payloadcms/payload/pull/6530)

- [x] I have read and understand the
[CONTRIBUTING.md](https://github.com/payloadcms/payload/blob/main/CONTRIBUTING.md)
document in this repository.

## Type of change

- [x] Bug fix (non-breaking change which fixes an issue)
- [x] This change requires a documentation update

## Checklist:

- [x] Existing test suite passes locally with my changes
- [x] I have made corresponding changes to the documentation
2024-05-30 09:41:58 -04:00
Patrik
e0a6db7f97 fix(translations): adds new userEmailAlreadyRegistered translations (#6550)
## Description

V2 PR [here](https://github.com/payloadcms/payload/pull/6549)

- [x] I have read and understand the
[CONTRIBUTING.md](https://github.com/payloadcms/payload/blob/main/CONTRIBUTING.md)
document in this repository.

## Type of change

- [x] Bug fix (non-breaking change which fixes an issue)

## Checklist:

- [x] Existing test suite passes locally with my changes
2024-05-30 09:37:02 -04:00
Elliot DeNolf
0d7d3e5c92 fix(cpa): more package manager detection improvements (#6566)
- Adjust package manager detection logic
- Remove pnpm-lock.yaml from blank template
2024-05-30 09:35:53 -04:00
Elliot DeNolf
8b49402e4c chore(templates): consolidate initial vercel-postgres migration (#6568) 2024-05-30 09:33:35 -04:00
Jarrod Flesch
347464250e fix: duplicate options appearing in relationship where builder (#6557)
- enables reactStrictMode by default
- enables reactCompiler by default
- fixes cases where IDs set to 0 broke the UI
2024-05-30 00:35:59 -04:00
Paul
aa02801c3d fix(plugin-search): Render error on custom UI component (#6562)
## Type of change

- [x] Bug fix (non-breaking change which fixes an issue)
2024-05-30 01:56:06 +00:00
Elliot DeNolf
a3ee07f693 chore: import vercel-postgres one-click template (#6564)
Import vercel one-click template
2024-05-29 18:06:52 -04:00
Jessica Chowdhury
511908a964 docs: adds sentry to plugin docs (#6475) 2024-05-29 15:19:16 -04:00
Jarrod Flesch
425576be25 fix: ensure relationship field pills respect isSortable property (#6561) 2024-05-29 15:12:42 -04:00
Jacob Fletcher
92f458dad2 feat(next,ui): improves loading states (#6434) 2024-05-29 14:01:13 -04:00
Jarrod Flesch
043a91d719 fix: ability to query relationships not equal to ID (#6555) 2024-05-29 13:47:44 -04:00
Jacob Fletcher
54e2d7fb38 docs: renames vercel visual editing to vercel content link (#6559) 2024-05-29 13:39:40 -04:00
Jacob Fletcher
321e97f9fe feat: extracts buildFormState logic from endpoint for reuse (#6501) 2024-05-29 12:51:16 -04:00
Elliot DeNolf
4e0dfd410d chore(release): v3.0.0-beta.37 [skip ci] 2024-05-29 10:54:45 -04:00
Elliot DeNolf
8506385ef9 fix(cpa): improve package manager detection (#6546)
Improves package manager detection.

Closes #6231
2024-05-29 09:30:15 -04:00
Jarrod Flesch
e74952902e fix: multi value draggable/sortable pills (#6500) 2024-05-29 08:22:37 -04:00
Alessio Gravili
4a51f4d2c1 fix(richtext-lexical): various html converter fixes (#6544) 2024-05-29 00:29:25 -04:00
Alessio Gravili
2c283bcc08 fix(richtext-lexical): user-defined html converters not taking precedence, and shared default html converters doubling in size after every field initialization 2024-05-29 00:14:58 -04:00
Alessio Gravili
a2e9bcd333 fix(richtext-lexical): list converters and nodes being added duplicatively 2024-05-28 23:53:35 -04:00
Alessio Gravili
33d53121a2 feat(richtext-lexical): link markdown transformers (#6543)
Closes https://github.com/payloadcms/payload/issues/6507

---------

Co-authored-by: ShawnVogt <41651465+shawnvogt@users.noreply.github.com>
2024-05-29 03:28:26 +00:00
Leo Hilsheimer
e0b201c810 fix(richtext-lexical): link html converter: serialize newTab to target="_blank" (#6350)
Co-authored-by: Leo <leo.hilsheimer@gmail.com>
2024-05-28 23:20:44 -04:00
Alessio Gravili
a8000f644f feat(richtext-lexical): i18n (#6542)
Continuation of https://github.com/payloadcms/payload/pull/6524
2024-05-29 02:40:48 +00:00
Paul
7d0e909a30 feat(plugin-form-builder)!: update form builder plugin field overrides to use a function instead (#6497)
## Description

Changes the `fields` override for the form builder plugin to use a
function so that existing fields can actually be overridden, which
currently does not work.

```ts
//before
fields: [
  {
    name: 'custom',
    type: 'text',
  }
]

// current
fields: ({ defaultFields }) => {
  return [
    ...defaultFields,
    {
      name: 'custom',
      type: 'text',
    },
  ]
}
```

## Type of change

- [x] Breaking change (fix or feature that would cause existing
functionality to not work as expected)
2024-05-28 17:45:51 -03:00
Elliot DeNolf
b2662eeb1f ci: update app-build-with-packed job (#6541)
Add `--ignore-workspace` and `--no-frozen-lockfile` where necessary
2024-05-28 14:35:18 -04:00
Elliot DeNolf
0b274dd67e chore: adjust email-nodemailer workspace dep pattern (#6539)
Adjust dep pattern for email-nodemailer reference from plugin-cloud
2024-05-28 14:19:21 -04:00
Elliot DeNolf
2ddd50edc4 fix(deps): proper location for scheduler peer dep (#6537)
Properly put `scheduler` dep under `ui` instead of `payload`.
2024-05-28 14:15:56 -04:00
Elliot DeNolf
0287acb8f0 chore(templates): update dockerfile and docker-compose for blank template (#6536)
Update Dockerfile and docker-compose.yml for blank template.
2024-05-28 12:35:05 -04:00
Elliot DeNolf
10c94b3665 feat(cpa): update existing payload installation (#6193)
Updates create-payload-app to update an existing Payload installation

- Detects existing Payload installation. Fixes #6517
- If not latest, will install latest and grab the `(payload)` directory
structure (ripped from `templates/blank-3.0`)
2024-05-28 11:38:33 -04:00
Alessio Gravili
ea48ca377e chore: move lexical package from workspace-root to test package (#6533) 2024-05-28 15:01:30 +00:00
zvizvi
6f5d86ed84 fix: Add missing He lang export in payload/i18n (#6484)
## Description
Fixed a missing Hebrew language export in the payload/i18n module.
The import statement `import { he } from 'payload/i18n/he'` was not
working because `he` was not being exported correctly.



- [x] I have read and understand the
[CONTRIBUTING.md](https://github.com/payloadcms/payload/blob/main/CONTRIBUTING.md)
document in this repository.

## Type of change


- [x] Chore (non-breaking change which does not add functionality)
2024-05-28 10:52:20 -03:00
Alessio Gravili
c383f391e3 feat(richtext-lexical): i18n support (#6524) 2024-05-28 09:10:39 -04:00
Paul
8a91a7adbb feat(richtext-lexical): update validation of custom URLs to include relative and anchor links (#6525)
Updates the regex to allow relative and anchor links as well. Manually
tested a combination of all common variations of absolute, relative, and
anchor links.

## Type of change

- [x] New feature (non-breaking change which adds functionality)
2024-05-28 00:55:00 -03:00
Alessio Gravili
96181d91a6 chore(ui): add ability to compile using react compiler (#6483)
This does not enable the react compiler by default
2024-05-27 22:54:36 -04:00
Alessio Gravili
eff5129a5f fix(next): unable to pass custom view client components (#6513) 2024-05-27 22:48:41 -04:00
Dan Ribbens
38e5adc462 chore: fix seed file path windows (#6512) 2024-05-26 15:39:16 +00:00
Dan Ribbens
ff4ea1eecc chore: getPayloadHMR conditionally call db.connect (#6510) 2024-05-25 22:17:27 +00:00
Dan Ribbens
dbfd1beed5 chore: gitignore static files uploads tests (#6509) 2024-05-25 22:06:43 +00:00
Dan Ribbens
4b6774463e chore: loader tests error on windows (#6508) 2024-05-25 21:57:32 +00:00
Dan Ribbens
cb14b97a6e chore: swcrc syntax fix (#6505) 2024-05-25 15:45:05 +00:00
Jarrod Flesch
18bc4b708c fix: separate collection docs with same ids were excluded in selectable (#6499) 2024-05-24 15:20:07 -04:00
Paul
6d951e6987 chore: add lexical as a direct dependency to the website template (#6496) 2024-05-24 16:33:36 +00:00
Elliot DeNolf
365660764d chore(templates): enable next lint on blank (#6494)
Enables next linting on blank template

Closes #6481
2024-05-24 11:36:50 -04:00
Elliot DeNolf
8b91af8a5b chore(cpa): adjust unit test template (#6490)
Adjust template used in unit tests.
2024-05-24 10:12:19 -04:00
Paul
b4092f59ae chore: fix seed data validation in website template (#6491)
Fixes an issue with data validation in lexical for the seed script
2024-05-24 14:10:29 +00:00
Alessio Gravili
7a768144ea fix(richtext-lexical): localized sub-fields were omitted from the API output (#6489)
Closes #6455. Proper localization support will be worked on later; this
just resolves the issue where having it enabled not only didn't localize
those fields, but also omitted them from the API response. Now they are
not omitted, and localization is simply skipped.
2024-05-24 10:01:04 -04:00
Elliot DeNolf
3839eb5ab0 chore(templates): remove blank v2 template (#6488)
New v3 is `blank-3.0`. Will rename that one in future PR.
2024-05-24 09:36:57 -04:00
Paul
fd02bee0fe chore: website template updates (#6480)
Just style updates
2024-05-23 20:38:25 +00:00
Patrik
42222cd2f6 fix(ui): where builder issues (#6478)
Co-authored-by: Jarrod Flesch <jarrodmflesch@gmail.com>
2024-05-23 16:01:13 -04:00
Elliot DeNolf
e3222f2ac3 chore(release): v3.0.0-beta.36 [skip ci] 2024-05-23 13:35:19 -04:00
Alessio Gravili
35f961fecb feat!: next.js 15, react 19, react compiler support (#6429)
**BREAKING:**
- bumps minimum required next.js version from `14.3.0-canary.68` to
`15.0.0-rc.0`
- bumps minimum required react and react-dom versions to `19.0.0`
(`19.0.0-rc-f994737d14-20240522` should be used)
- `@types/react` and `@types/react-dom` have to be bumped to
`npm:types-react@19.0.0-beta.2` using overrides and pnpm overrides, if
you want correct types. You can find an example of this here:
https://github.com/payloadcms/payload/pull/6429/files#diff-10cb9e57a77733f174ee2888587281e94c31f79e434aa3f932a8ec72fa7a5121L32

## Issues

- A bunch of TODOs remain for our react-select package, which has type
issues. It works fine; these are type-only issues. Its type defs import
JSX in an unusual way, so we likely just have to wait until they fix them
in a future update.
2024-05-23 13:30:12 -04:00
Paul
85bfca79ef feat: add website template (#6470)
Adds the new website template for 3.0
2024-05-23 16:48:41 +00:00
Alessio Gravili
661a4a099d feat(ui): split up Select component into Select and SelectInput (#6471) 2024-05-23 11:36:57 -04:00
Jarrod Flesch
72c0534008 fix: adds host to initPage req creation (#6476) 2024-05-23 11:04:35 -04:00
Alessio Gravili
78579ed2bd feat(richtext-lexical): various UX and performance improvements (#6467) 2024-05-22 14:42:17 -04:00
Alessio Gravili
7bcb4ba1cc chore(email-*): remove excess backtick in readme install commands 2024-05-22 13:59:33 -04:00
Alessio Gravili
6b45cf3197 feat(richtext-lexical): improve block dragging UX 2024-05-22 13:55:44 -04:00
Elliot DeNolf
73d0b209d7 fix: isHotkey webpack error (#6466)
Fixes webpack issue with isHotkey: `TypeError:
is_hotkey__WEBPACK_IMPORTED_MODULE_9__ is not a function`

Changing this from a default import to a named import appears to
resolve the issue.

Fixes #6421
2024-05-22 17:41:15 +00:00
Alessio Gravili
c93752bdbb fix(richtext-lexical): order of add/drag handles was inconsistent between gutter and no-gutter mode 2024-05-22 10:49:11 -04:00
Alessio Gravili
7a4dd5890e fix(ui): field errors aren't red in light mode 2024-05-22 10:29:41 -04:00
Alessio Gravili
60ee55fcaa chore(richtext-lexical): do not show red border & background for erroring field without gutter 2024-05-22 10:22:06 -04:00
Jacob Fletcher
1fe9790d0d feat(next): server-side theme detection (#6452) 2024-05-22 10:19:38 -04:00
zvizvi
3c0853a675 feat(translations): add Hebrew translation (#6428)
Hebrew translation added.
2024-05-22 14:15:10 +00:00
Jacob Fletcher
2b941b7e2c fix(next,ui): fixes global doc permissions and optimizes publish access data loading (#6451) 2024-05-22 10:03:12 -04:00
Elliot DeNolf
db772a058c chore: add label-author.yml 2024-05-22 09:09:52 -04:00
Alessio Gravili
0bfbf9c750 fix(richtext-lexical): link drawer sending too many form state requests for actions unrelated to links 2024-05-21 22:34:41 -04:00
Alessio Gravili
5c7647f45b ci: split up test suites (#6415) 2024-05-21 17:11:55 -04:00
Alessio Gravili
6c952875e8 feat(richtext-lexical): various gutter, error states & add/drag handle improvements (#6448)
## Gutter

Adds gutter by default:

![CleanShot 2024-05-21 at 16 24
13](https://github.com/payloadcms/payload/assets/70709113/09c59b6f-bd4a-4e81-bfdd-731d1cbbe075)


![CleanShot 2024-05-21 at 16 20
23](https://github.com/payloadcms/payload/assets/70709113/94df3e8c-663e-4b08-90cb-a24b2a788ff6)

Can be disabled with `admin.hideGutter`.
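
A minimal sketch of opting out, assuming the option is accepted via `lexicalEditor`'s `admin` props (as the `admin.hideGutter` reference above suggests):

```ts
import { lexicalEditor } from '@payloadcms/richtext-lexical'

// Editor config sketch; pass this to a richText field's `editor` property.
export const editorWithoutGutter = lexicalEditor({
  admin: {
    // Assumption: hideGutter turns off the new gutter for this editor.
    hideGutter: true,
  },
})
```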

## Error states
![CleanShot 2024-05-21 at 16 21
18](https://github.com/payloadcms/payload/assets/70709113/06754d8f-c674-4aaa-a4e5-47e284970776)

Finally, proper error states are displayed. It's cleaner, and previously
fields were shown as erroring even though they weren't. No more!

## Drag & Block handles
Improved performance and cleaned up the code. Drag handle positions are
now only calculated for one editor rather than for all editors on the
page. The add-block handle calculation now uses a better algorithm to
minimize the number of nodes that are iterated.

Additionally, have you noticed how sometimes the add button jumps to the
next node while the drag button is still at the previous node?


https://github.com/payloadcms/payload/assets/70709113/8dff3081-1de0-4902-8229-62f178f23549

No more! Now they behave the same. Feels a lot cleaner now.
2024-05-21 20:55:06 +00:00
Jacob Fletcher
af7e12aa2f chore(ui)!: uses consistent button naming conventions (#6444)
## Description

Renames `Save` to `SaveButton`, etc., to match the already established
convention of `PreviewButton`, etc. This matches the imports with their
respective component and type names, and also gives these components more
context for the developer wherever they're rendered, i.e. it's clearly
just a button and not an entire block or complex component.

**BREAKING**:

Import paths for these components have changed. If you were previously
importing these components into your own projects to customize them,
update the import paths accordingly:

Old:
```ts
import { PublishButton } from '@payloadcms/ui/elements/Publish'
import { SaveButton } from '@payloadcms/ui/elements/Save'
import { SaveDraftButton } from '@payloadcms/ui/elements/SaveDraft'
```

New:
```ts
import { PublishButton } from '@payloadcms/ui/elements/PublishButton'
import { SaveButton } from '@payloadcms/ui/elements/SaveButton'
import { SaveDraftButton } from '@payloadcms/ui/elements/SaveDraftButton'
```

- [x] I have read and understand the
[CONTRIBUTING.md](https://github.com/payloadcms/payload/blob/main/CONTRIBUTING.md)
document in this repository.
2024-05-21 14:52:53 -04:00
Patrik
bcc506b423 fix(ui): disableListColumn fields not hidden in table columns (#6445)
## Description

Setting `disableListColumn` to `true` on a field would hide the field
from the column selector but not from the table columns.
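
For illustration, a minimal (hypothetical) field entry that this fix affects:

```ts
// Hypothetical field inside a collection's `fields` array.
const internalNotesField = {
  name: 'internalNotes',
  type: 'text',
  admin: {
    // With this fix, the field is hidden from the column selector
    // and from the List view table columns.
    disableListColumn: true,
  },
}
```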

- [x] I have read and understand the
[CONTRIBUTING.md](https://github.com/payloadcms/payload/blob/main/CONTRIBUTING.md)
document in this repository.

## Type of change

- [x] Bug fix (non-breaking change which fixes an issue)

## Checklist:

- [x] Existing test suite passes locally with my changes
2024-05-21 13:33:28 -04:00
Elliot DeNolf
3d7c8277d7 chore(release): v3.0.0-beta.35 [skip ci] 2024-05-21 10:51:19 -04:00
Paul
a8a2dc2347 chore!: export DefaultListView as ListView (#6432)
Renames the exports of `DefaultListView` and `DefaultEditView` to drop
the "Default" prefix, i.e. `ListView` and `EditView`

```ts
// before
import { DefaultEditView } from '@payloadcms/next/views'
import { DefaultListView } from '@payloadcms/next/views'

// after 
import { EditView } from '@payloadcms/next/views'
import { ListView } from '@payloadcms/next/views'
```
2024-05-21 10:22:44 -04:00
Alessio Gravili
6c74b0326b chore(richtext-lexical): improve node validation messages (#6443) 2024-05-21 10:19:24 -04:00
Paul
f51af92491 chore(translations): ai translation script should use formal language (#6433)
Added an additional prompt to make sure the translations we receive use
formal language where it makes sense.

In the context of Latin languages, for example:
- Spanish: "tu" should become "vos"
- French: "tu" should become "votre"

These differences can affect verb conjugations, and in these languages
informal language can come across as less professional.
2024-05-21 10:15:01 -04:00
Alessio Gravili
77528a1e7d chore(richtext-slate): fix richtext container elements direction 2024-05-21 09:40:17 -04:00
Alessio Gravili
ba8b8e8330 chore(richtext-lexical): improve node validation messages 2024-05-21 09:36:58 -04:00
Jessica Chowdhury
23f9a32a99 fix: user verification email broken (#6442)
## Description

Closes
[#225](https://github.com/payloadcms/payload-3.0-demo/issues/225).

The user verification emails are not being sent and this error is shown:
```ts
APIError: Error sending email: 422 validation_error - Invalid `from` field. The email address needs to follow the `email@example.com` or `Name <email@example.com>` format.
```

The issue is resolved by updating the `from` property on the outgoing
verification email:
```ts
from: `"${email.defaultFromName}" <${email.defaultFromName}>`,
// to
from: `"${email.defaultFromName}" <${email. defaultFromAddress}>`,
```

**NOTE:** This was not broken in 2.0; see the correct outgoing email
[here](https://github.com/payloadcms/payload/blob/main/packages/payload/src/auth/sendVerificationEmail.ts#L69).

- [X] I have read and understand the
[CONTRIBUTING.md](https://github.com/payloadcms/payload/blob/main/CONTRIBUTING.md)
document in this repository.

## Type of change

- [X] Bug fix (non-breaking change which fixes an issue)

## Checklist:

- [X] Existing test suite passes locally with my changes
2024-05-21 13:25:59 +00:00
Jessica Chowdhury
0190eb8b28 fix(ui): blocks browser save dialog from opening when hotkey used with no changes (#6366) 2024-05-21 09:00:34 -04:00
Anders Semb Hermansen
f482fdcfd5 fix: separate sort and search fields when looking up relationship. (#6440)
## Description

The default sort field was also used as the search field, which caused
the unexpected behaviour described in
https://github.com/payloadcms/payload/issues/4815 and
https://github.com/payloadcms/payload/issues/5222. This bugfix separates
the field used for sorting from the field used for searching.

Fixes: https://github.com/payloadcms/payload/issues/4815
https://github.com/payloadcms/payload/issues/5222

@denolfe This fix is a port of the fix in
[#5964](https://github.com/payloadcms/payload/pull/5964) to beta branch.

- [X] I have read and understand the
[CONTRIBUTING.md](https://github.com/payloadcms/payload/blob/main/CONTRIBUTING.md)
document in this repository.

## Type of change

- [X] Bug fix (non-breaking change which fixes an issue)

## Checklist:

- [ ] I have added tests that prove my fix is effective or that my
feature works
- [ ] Existing test suite passes locally with my changes
- [ ] I have made corresponding changes to the documentation
2024-05-20 16:57:49 -04:00
Alessio Gravili
ed4766188d fix(ui): tooltip positioning issues (#6439) 2024-05-20 20:37:53 +00:00
Ritsu
e682cb1b04 fix(ui): update relationship cell formatted value when search changes (#6208)
## Description

Fixes https://github.com/payloadcms/payload-3.0-demo/issues/181
Although the issue is about page changes, it also happens when you
change the sort / limit / where filter (and probably the locale).

- [x] I have read and understand the
[CONTRIBUTING.md](https://github.com/payloadcms/payload/blob/main/CONTRIBUTING.md)
document in this repository.

## Type of change



- [x] Bug fix (non-breaking change which fixes an issue)

## Checklist:

- [x] Existing test suite passes locally with my changes

---------

Co-authored-by: Jessica Chowdhury <jessica@trbl.design>
2024-05-20 16:03:04 -04:00
Elliot DeNolf
36fda30c61 feat: store focal point on uploads (#6436)
Store focal point data on uploads as `focalX` and `focalY`

Addresses https://github.com/payloadcms/payload/discussions/4082

Mirrors #6364 for beta branch.
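
A small sketch of reading the stored values via the Local API (the collection slug, id, and the `@payload-config` alias are placeholders/assumptions, not from this PR):

```ts
import config from '@payload-config'
import { getPayloadHMR } from '@payloadcms/next/utilities'

const payload = await getPayloadHMR({ config })

// 'media' and the id are placeholders for your own upload collection/doc.
const doc = await payload.findByID({ collection: 'media', id: '123' })

// New fields written by this feature:
console.log(doc.focalX, doc.focalY)
```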
2024-05-20 15:57:52 -04:00
Alessio Gravili
fa7cc376d1 fix(richtext-lexical): field required validation not working if content was removed manually (#6435) 2024-05-20 17:17:54 +00:00
Paul
3fc2ff1ef9 chore: export DefaultListView for reuse (#6422)
Exports `DefaultListView` so other plugins or custom implementations can
re-use it
2024-05-20 11:53:36 -03:00
Jarrod Flesch
1d81eef805 fix: attributes graphql packages, adds esm import path (#6431) 2024-05-20 10:48:41 -04:00
Paul
8fcfac61b5 fix(plugin-seo): white screen of death on choosing an existing media for meta image (#6424)
Closes https://github.com/payloadcms/payload/issues/6423

## Type of change

- [x] Bug fix (non-breaking change which fixes an issue)
2024-05-19 04:51:19 +00:00
Elliot DeNolf
0d544dacdb chore(release): v3.0.0-beta.34 [skip ci] 2024-05-17 16:12:46 -04:00
Alessio Gravili
147b50e719 fix: page metadata generation not working in turbopack (#6417)
In Turbopack, `payloadFaviconDark` is a string, not an object with `src`
2024-05-17 15:44:12 -04:00
1053 changed files with 42910 additions and 55579 deletions

View File

@@ -31,6 +31,12 @@ module.exports = {
'perfectionist/sort-objects': 'off',
},
},
{
files: ['templates/vercel-postgres/**'],
rules: {
'no-restricted-exports': 'off',
},
},
{
files: ['package.json', 'tsconfig.json'],
rules: {

50
.github/workflows/label-author.yml vendored Normal file
View File

@@ -0,0 +1,50 @@
name: label-author

on:
  pull_request:
    types: [opened]
  issues:
    types: [opened]

permissions:
  contents: read
  pull-requests: write
  issues: write

jobs:
  debug-context:
    runs-on: ubuntu-latest
    steps:
      - name: View context attributes
        uses: actions/github-script@v7
        with:
          script: console.log(context)

  label-created-by:
    name: Label pr/issue on opening
    runs-on: ubuntu-latest
    steps:
      - name: Tag with 'created-by'
        uses: actions/github-script@v7
        if: github.event.action == 'opened'
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
          script: |
            const type = context.payload.pull_request ? 'pull_request' : 'issue';
            const association = context.payload[type].author_association;

            let label = ''
            if (association === 'MEMBER' || association === 'OWNER') {
              label = 'created-by: Payload team';
            } else if (association === 'CONTRIBUTOR') {
              label = 'created-by: Contributor';
            }

            if (!label) return;

            github.rest.issues.addLabels({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              labels: [label],
            });

            console.log('Added created-by: Payload team label');

View File

@@ -289,7 +289,8 @@ jobs:
suite:
- _community
- access-control
- admin
- admin__e2e__1
- admin__e2e__2
- auth
- field-error-states
- fields-relationship
@@ -298,7 +299,14 @@ jobs:
- fields__collections__Array
- fields__collections__Relationship
- fields__collections__RichText
- fields__collections__Lexical
- fields__collections__Lexical__e2e__main
- fields__collections__Lexical__e2e__blocks
- fields__collections__Date
- fields__collections__Number
- fields__collections__Point
- fields__collections__Tabs
- fields__collections__Text
- fields__collections__Upload
- live-preview
- localization
- i18n
@@ -419,8 +427,8 @@ jobs:
cd templates/blank-3.0
cp .env.example .env
ls -la
pnpm add ./*.tgz
pnpm install --ignore-workspace
pnpm add ./*.tgz --ignore-workspace
pnpm install --ignore-workspace --no-frozen-lockfile
cat package.json
pnpm run build

9
.swcrc
View File

@@ -7,6 +7,15 @@
"syntax": "typescript",
"tsx": true,
"dts": true
},
"transform": {
"react": {
"runtime": "automatic",
"pragmaFrag": "React.Fragment",
"throwIfNamespace": true,
"development": false,
"useBuiltins": true
}
}
},
"module": {

View File

@@ -31,6 +31,7 @@ With this field, you can also inject custom `Cell` components that appear as add
| **`label`** | Human-readable label for this UI field. |
| **`admin.components.Field`** \* | React component to be rendered for this field within the Edit view. [More](/docs/admin/components/#field-component) |
| **`admin.components.Cell`** | React component to be rendered as a Cell within collection List views. [More](/docs/admin/components/#field-component) |
| **`admin.disableListColumn`** | Set `disableListColumn` to `true` to prevent the UI field from appearing in the list view column selector. |
| **`custom`** | Extension point for adding custom data (e.g. for plugins) |
_\* An asterisk denotes that a property is required._

View File

@@ -1,30 +1,30 @@
---
title: Vercel Visual Editing
label: Vercel Visual Editing
title: Vercel Content Link
label: Vercel Content Link
order: 10
desc: Payload + Vercel Visual Editing allows your editors to navigate directly from the content rendered on your front-end to the fields in Payload that control it.
keywords: vercel, vercel visual editing, visual editing, content source maps, Content Management System, cms, headless, javascript, node, react, express
desc: Payload + Vercel Content Link allows your editors to navigate directly from the content rendered on your front-end to the fields in Payload that control it.
keywords: vercel, vercel content link, visual editing, content source maps, Content Management System, cms, headless, javascript, node, react, express
---
[Vercel Visual Editing](https://vercel.com/docs/workflow-collaboration/visual-editing) will allow your editors to navigate directly from the content rendered on your front-end to the fields in Payload that control it. This requires no changes to your front-end code and very few changes to your Payload config.
[Vercel Content Link](https://vercel.com/docs/workflow-collaboration/edit-mode#content-link) will allow your editors to navigate directly from the content rendered on your front-end to the fields in Payload that control it. This requires no changes to your front-end code and very few changes to your Payload config.
![Versions](/images/docs/vercel-visual-editing.jpg)
<Banner type="warning">
Vercel Visual Editing is an enterprise-only feature and only available for deployments hosted on
Vercel Content Link is an enterprise-only feature and only available for deployments hosted on
Vercel. If you are an existing enterprise customer, [contact our sales
team](https://payloadcms.com/for-enterprise) for help with your integration.
</Banner>
### How it works
To power Vercel Visual Editing, Payload embeds Content Source Maps into its API responses. Content Source Maps are invisible, encoded JSON values that include a link back to the field in the CMS that generated the content. When rendered on the page, Vercel detects and decodes these values to display the Visual Editing interface.
To power Vercel Content Link, Payload embeds Content Source Maps into its API responses. Content Source Maps are invisible, encoded JSON values that include a link back to the field in the CMS that generated the content. When rendered on the page, Vercel detects and decodes these values to display the Content Link interface.
For full details on how the encoding and decoding algorithm works, check out [`@vercel/stega`](https://www.npmjs.com/package/@vercel/stega).
### Getting Started
Setting up Payload with Vercel Visual Editing is easy. First, install the `@payloadcms/plugin-csm` plugin into your project. This plugin requires an API key to install, [contact our sales team](https://payloadcms.com/for-enterprise) if you don't already have one.
Setting up Payload with Vercel Content Link is easy. First, install the `@payloadcms/plugin-csm` plugin into your project. This plugin requires an API key to install, [contact our sales team](https://payloadcms.com/for-enterprise) if you don't already have one.
```bash
npm i @payloadcms/plugin-csm
@@ -76,7 +76,7 @@ And that's it! You are now ready to enter Edit Mode and begin visually editing y
##### Edit Mode
To see Visual Editing on your site, you first need to visit any preview deployment on Vercel and login using the Vercel Toolbar. When Content Source Maps are detected on the page, a pencil icon will appear in the toolbar. Clicking this icon will enable Edit Mode, highlighting all editable fields on the page in blue.
To see Content Link on your site, you first need to visit any preview deployment on Vercel and login using the Vercel Toolbar. When Content Source Maps are detected on the page, a pencil icon will appear in the toolbar. Clicking this icon will enable Edit Mode, highlighting all editable fields on the page in blue.
![Versions](/images/docs/vercel-toolbar.jpg)
@@ -93,7 +93,7 @@ const { cleaned, encoded } = vercelStegaSplit(text)
##### Blocks
All `blocks` fields by definition do not have plain text strings to encode. For this reason, blocks are given an additional `encodedSourceMap` key, which you can use to enable Visual Editing on entire sections of your site. You can then specify the editing container by adding the `data-vercel-edit-target` HTML attribute to any top-level element of your block.
All `blocks` fields by definition do not have plain text strings to encode. For this reason, blocks are given an additional `encodedSourceMap` key, which you can use to enable Content Link on entire sections of your site. You can then specify the editing container by adding the `data-vercel-edit-target` HTML attribute to any top-level element of your block.
```ts
<div data-vercel-edit-target>

133
docs/plugins/sentry.mdx Normal file
View File

@@ -0,0 +1,133 @@
---
title: Sentry Plugin
label: Sentry
order: 20
desc: Integrate Sentry error tracking into your Payload application
keywords: plugins, sentry, error, tracking, monitoring, logging, bug, reporting, performance
---
[![NPM](https://img.shields.io/npm/v/@payloadcms/plugin-sentry)](https://www.npmjs.com/package/@payloadcms/plugin-sentry)
This plugin allows you to integrate [Sentry](https://sentry.io/) seamlessly with your [Payload](https://github.com/payloadcms/payload) application.
### What is Sentry?
Sentry is a powerful error tracking and performance monitoring tool that helps developers identify, diagnose, and resolve issues in their applications.
<Banner type="success">
Sentry does smart stuff with error data to make bugs easier to find and fix. - [sentry.io](https://sentry.io/)
</Banner>
This multi-faceted software offers a range of features that will help you manage errors with greater ease and ultimately ensure your application is running smoothly:
#### Core Features
- **Error Tracking**: Instantly captures and logs errors as they occur in your application
- **Performance Monitoring**: Tracks application performance to identify slowdowns and bottlenecks
- **Detailed Reports**: Provides comprehensive insights into errors, including stack traces and context
- **Alerts and Notifications**: Send and customize event-triggered notifications
- **Issue Grouping, Filtering and Search**: Automatically groups similar errors, and allows filtering and searching issues by custom criteria
- **Breadcrumbs**: Records user actions and events leading up to an error
- **Integrations**: Connects with various tools and services for enhanced workflow and issue management
<Banner type="info">
This plugin is completely open-source and the [source code can be found here](https://github.com/payloadcms/payload/tree/main/packages/plugin-sentry). If you need help, check out our [Community Help](https://payloadcms.com/community-help). If you think you've found a bug, please [open a new issue](https://github.com/payloadcms/payload/issues/new?assignees=&labels=plugin%3A%20seo&template=bug_report.md&title=plugin-seo%3A) with as much detail as possible.
</Banner>
## Installation
Install the plugin using any JavaScript package manager like [Yarn](https://yarnpkg.com), [NPM](https://npmjs.com), or [PNPM](https://pnpm.io):
```bash
yarn add @payloadcms/plugin-sentry
```
## Basic Usage
In the `plugins` array of your [Payload config](https://payloadcms.com/docs/configuration/overview), call the plugin and pass in your Sentry DSN as an option.
```ts
import { buildConfig } from 'payload/config'
import { sentry } from '@payloadcms/plugin-sentry'
import { Pages, Media } from './collections'
const config = buildConfig({
collections: [Pages, Media],
plugins: [
sentry({
dsn: 'https://61edebas776889984d323d777@o4505289711681536.ingest.sentry.io/4505357433352176',
}),
],
})
export default config
```
## Options
- `dsn` : string | **required**
Sentry automatically assigns a DSN when you create a project; this unique DSN tells Sentry where to send events so they are associated with the correct project.
<Banner type="warning">
You can find your project DSN (Data Source Name) by visiting [sentry.io](sentry.io) and navigating to your [Project] > Settings > Client Keys (DSN).
</Banner>
- `enabled`: boolean | optional
Set to false to disable the plugin. Defaults to true.
- `init` : ClientOptions | optional
Sentry allows a variety of options to be passed into the Sentry.init() function, see the full list of options [here](https://docs.sentry.io/platforms/node/guides/express/configuration/options).
- `requestHandler` : RequestHandlerOptions | optional
Accepts options that let you decide what data should be included in the event sent to Sentry; check out the options [here](https://docs.sentry.io/platforms/node/guides/express/configuration/options).
- `captureErrors`: number[] | optional
By default, `Sentry.errorHandler` will capture only errors with a status code of 500 or higher. To capture additional error codes, pass the values as numbers in an array.
To see all options available, visit the [Sentry Docs](https://docs.sentry.io/platforms/node/guides/express/configuration/options).
### Example
Configure any of these options by passing them to the plugin:
```ts
import { buildConfig } from 'payload/config'
import { sentry } from '@payloadcms/plugin-sentry'
import { Pages, Media } from './collections'
const config = buildConfig({
collections: [Pages, Media],
plugins: [
sentry({
dsn: 'https://61edebas777689984d323d777@o4505289711681536.ingest.sentry.io/4505357433352176',
options: {
init: {
debug: true,
environment: 'development',
tracesSampleRate: 1.0,
},
requestHandler: {
serverName: false,
user: ['email'],
},
captureErrors: [400, 403, 404],
},
}),
],
})
export default config
```
## TypeScript
All types can be directly imported:
```ts
import { PluginOptions } from '@payloadcms/plugin-sentry/types'
```

View File

@@ -1,5 +1,5 @@
import type { CountryField } from 'payload-plugin-form-builder/dist/types'
import type { Control, FieldErrorsImpl, FieldValues } from 'react-hook-form';
import type { Control, FieldErrorsImpl, FieldValues } from 'react-hook-form'
import React from 'react'
import { Controller } from 'react-hook-form'

View File

@@ -1,5 +1,5 @@
import type { SelectField } from 'payload-plugin-form-builder/dist/types'
import type { Control, FieldErrorsImpl, FieldValues } from 'react-hook-form';
import type { Control, FieldErrorsImpl, FieldValues } from 'react-hook-form'
import React from 'react'
import { Controller } from 'react-hook-form'

View File

@@ -1,5 +1,5 @@
import type { StateField } from 'payload-plugin-form-builder/dist/types'
import type { Control, FieldErrorsImpl, FieldValues } from 'react-hook-form';
import type { Control, FieldErrorsImpl, FieldValues } from 'react-hook-form'
import React from 'react'
import { Controller } from 'react-hook-form'

View File

@@ -1,5 +1,5 @@
'use client'
import type React from 'react';
import type React from 'react'
import { useModal } from '@faceless-ui/modal'
import { usePathname } from 'next/navigation'

View File

@@ -1,4 +1,4 @@
import type { Ref } from 'react';
import type { Ref } from 'react'
import React, { forwardRef } from 'react'

View File

@@ -3,10 +3,10 @@
"version": "0.1.0",
"private": true,
"scripts": {
"dev": "next dev -p 3001",
"build": "next build",
"start": "next start -p 3001",
"lint": "next lint"
"dev": "next dev -p 3001",
"lint": "next lint",
"start": "next start -p 3001"
},
"dependencies": {
"@payloadcms/live-preview-react": "3.0.0-beta.28",

View File

@@ -3,10 +3,10 @@
"version": "0.1.0",
"private": true,
"scripts": {
"dev": "next dev -p 3001",
"build": "next build",
"start": "next start -p 3001",
"lint": "next lint"
"dev": "next dev -p 3001",
"lint": "next lint",
"start": "next start -p 3001"
},
"dependencies": {
"@payloadcms/live-preview-react": "3.0.0-beta.28",

View File

@@ -1,4 +1,4 @@
import type { ElementType } from 'react';
import type { ElementType } from 'react'
import Link from 'next/link'
import React from 'react'

View File

@@ -1,4 +1,4 @@
import type { Ref } from 'react';
import type { Ref } from 'react'
import React, { forwardRef } from 'react'

View File

@@ -125,7 +125,7 @@ const link: LinkType = ({ appearances, disableLabel = false, overrides = {} } =
]
if (appearances) {
appearanceOptionsToUse = appearances.map(appearance => appearanceOptions[appearance])
appearanceOptionsToUse = appearances.map((appearance) => appearanceOptions[appearance])
}
linkResult.fields.push({

View File

@@ -9,7 +9,6 @@ const withBundleAnalyzer = bundleAnalyzer({
// eslint-disable-next-line no-restricted-exports
export default withBundleAnalyzer(
withPayload({
reactStrictMode: false,
eslint: {
ignoreDuringBuilds: true,
},

View File

@@ -1,6 +1,6 @@
{
"name": "payload-monorepo",
"version": "3.0.0-beta.33",
"version": "3.0.0-beta.39",
"private": true,
"type": "module",
"scripts": {
@@ -102,7 +102,8 @@
"@types/node": "20.12.5",
"@types/prompts": "^2.4.5",
"@types/qs": "6.9.14",
"@types/react": "18.3.2",
"@types/react": "npm:types-react@19.0.0-beta.2",
"@types/react-dom": "npm:types-react-dom@19.0.0-beta.2",
"@types/semver": "^7.5.3",
"@types/shelljs": "0.8.15",
"add-stream": "^1.0.0",
@@ -128,11 +129,10 @@
"jest": "29.7.0",
"jest-environment-jsdom": "29.7.0",
"jwt-decode": "4.0.0",
"lexical": "0.15.0",
"lint-staged": "^14.0.1",
"minimist": "1.2.8",
"mongodb-memory-server": "^9.0",
"next": "14.3.0-canary.68",
"next": "15.0.0-rc.0",
"node-mocks-http": "^1.14.1",
"nodemon": "3.0.3",
"open": "^10.1.0",
@@ -144,8 +144,8 @@
"prettier": "^3.0.3",
"prompts": "2.4.2",
"qs": "6.11.2",
"react": "^18.3.1",
"react-dom": "^18.3.1",
"react": "^19.0.0-rc-f994737d14-20240522",
"react-dom": "^19.0.0-rc-f994737d14-20240522",
"read-stream": "^2.1.1",
"rimraf": "3.0.2",
"semver": "^7.5.4",
@@ -166,8 +166,8 @@
"yocto-queue": "^1.0.0"
},
"peerDependencies": {
"react": "^18.2.0 || ^19.0.0",
"react-dom": "^18.2.0 || ^19.0.0"
"react": "^19.0.0 || ^19.0.0-rc-f994737d14-20240522",
"react-dom": "^19.0.0 || ^19.0.0-rc-f994737d14-20240522"
},
"engines": {
"node": ">=18.20.2",
@@ -180,6 +180,8 @@
"uuid": "3.4.0"
},
"overrides": {
"@types/react": "npm:types-react@19.0.0-beta.2",
"@types/react-dom": "npm:types-react-dom@19.0.0-beta.2",
"copyfiles": "$copyfiles",
"cross-env": "$cross-env",
"dotenv": "$dotenv",
@@ -194,6 +196,10 @@
"playwright@1.43.0": "patches/playwright@1.43.0.patch"
}
},
"overrides": {
"@types/react": "npm:types-react@19.0.0-beta.2",
"@types/react-dom": "npm:types-react-dom@19.0.0-beta.2"
},
"workspaces:": [
"packages/*",
"test/*"

View File

@@ -7,6 +7,15 @@
"syntax": "typescript",
"tsx": true,
"dts": true
},
"transform": {
"react": {
"runtime": "automatic",
"pragmaFrag": "React.Fragment",
"throwIfNamespace": true,
"development": false,
"useBuiltins": true
}
}
},
"module": {

View File

@@ -1,6 +1,6 @@
{
"name": "create-payload-app",
"version": "3.0.0-beta.33",
"version": "3.0.0-beta.39",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",
@@ -38,10 +38,8 @@
"@sindresorhus/slugify": "^1.1.0",
"arg": "^5.0.0",
"chalk": "^4.1.0",
"command-exists": "^1.2.9",
"comment-json": "^4.2.3",
"degit": "^2.8.4",
"detect-package-manager": "^3.0.1",
"esprima-next": "^6.0.3",
"execa": "^5.0.0",
"figures": "^6.1.0",
@@ -50,7 +48,6 @@
"terminal-link": "^2.1.1"
},
"devDependencies": {
"@types/command-exists": "^1.2.0",
"@types/degit": "^2.8.3",
"@types/esprima": "^4.0.6",
"@types/fs-extra": "^9.0.12",

View File

@@ -29,7 +29,7 @@ describe('createProject', () => {
const args = {
_: ['project-name'],
'--db': 'mongodb',
'--local-template': 'blank',
'--local-template': 'blank-3.0',
'--no-deps': true,
} as CliArgs
const packageManager = 'yarn'

View File

@@ -9,7 +9,7 @@ import path from 'path'
import type { CliArgs, DbDetails, PackageManager, ProjectTemplate } from '../types.js'
import { tryInitRepoAndCommit } from '../utils/git.js'
import { debug, error, warning } from '../utils/log.js'
import { debug, error, info, warning } from '../utils/log.js'
import { configurePayloadConfig } from './configure-payload-config.js'
const filename = fileURLToPath(import.meta.url)
@@ -99,6 +99,7 @@ export async function createProject(args: {
}
if (!cliArgs['--no-deps']) {
info(`Using ${packageManager}.\n`)
spinner.message('Installing dependencies...')
const result = await installDeps({ cliArgs, packageManager, projectDir })
if (result) {

View File

@@ -0,0 +1,44 @@
import execa from 'execa'
import fse from 'fs-extra'
import type { CliArgs, PackageManager } from '../types.js'
export async function getPackageManager(args: {
cliArgs?: CliArgs
projectDir: string
}): Promise<PackageManager> {
const { cliArgs, projectDir } = args
try {
// Check for yarn.lock, package-lock.json, or pnpm-lock.yaml
let detected: PackageManager = 'npm'
if (
cliArgs?.['--use-pnpm'] ||
fse.existsSync(`${projectDir}/pnpm-lock.yaml`) ||
(await commandExists('pnpm'))
) {
detected = 'pnpm'
} else if (
cliArgs?.['--use-yarn'] ||
fse.existsSync(`${projectDir}/yarn.lock`) ||
(await commandExists('yarn'))
) {
detected = 'yarn'
} else if (cliArgs?.['--use-npm'] || fse.existsSync(`${projectDir}/package-lock.json`)) {
detected = 'npm'
}
return detected
} catch (error) {
return 'npm'
}
}
async function commandExists(command: string): Promise<boolean> {
try {
await execa.command(`command -v ${command}`)
return true
} catch {
return false
}
}

View File

@@ -6,24 +6,24 @@ import execa from 'execa'
import fs from 'fs'
import fse from 'fs-extra'
import globby from 'globby'
import { fileURLToPath } from 'node:url'
import path from 'path'
import { promisify } from 'util'
import type { CliArgs, DbType, NextAppDetails, NextConfigType, PackageManager } from '../types.js'
import { copyRecursiveSync } from '../utils/copy-recursive-sync.js'
import { debug as origDebug, warning } from '../utils/log.js'
import { moveMessage } from '../utils/messages.js'
import { installPackages } from './install-packages.js'
import { wrapNextConfig } from './wrap-next-config.js'
const readFile = promisify(fs.readFile)
const writeFile = promisify(fs.writeFile)
const filename = fileURLToPath(import.meta.url)
const dirname = path.dirname(filename)
import { fileURLToPath } from 'node:url'
import type { CliArgs, DbType, PackageManager } from '../types.js'
import { copyRecursiveSync } from '../utils/copy-recursive-sync.js'
import { debug as origDebug, warning } from '../utils/log.js'
import { moveMessage } from '../utils/messages.js'
import { wrapNextConfig } from './wrap-next-config.js'
type InitNextArgs = Pick<CliArgs, '--debug'> & {
dbType: DbType
nextAppDetails?: NextAppDetails
@@ -32,8 +32,6 @@ type InitNextArgs = Pick<CliArgs, '--debug'> & {
useDistFiles?: boolean
}
type NextConfigType = 'cjs' | 'esm'
type InitNextResult =
| {
isSrcDir: boolean
@@ -55,7 +53,8 @@ export async function initNext(args: InitNextArgs): Promise<InitNextResult> {
nextAppDetails.nextAppDir = createdAppDir
}
const { hasTopLevelLayout, isSrcDir, nextAppDir, nextConfigType } = nextAppDetails
const { hasTopLevelLayout, isPayloadInstalled, isSrcDir, nextAppDir, nextConfigType } =
nextAppDetails
if (!nextConfigType) {
return {
@@ -222,43 +221,19 @@ function installAndConfigurePayload(
}
async function installDeps(projectDir: string, packageManager: PackageManager, dbType: DbType) {
const packagesToInstall = ['payload', '@payloadcms/next', '@payloadcms/richtext-lexical'].map(
(pkg) => `${pkg}@beta`,
)
const packagesToInstall = [
'payload',
'@payloadcms/next',
'@payloadcms/richtext-lexical',
'@payloadcms/plugin-cloud',
].map((pkg) => `${pkg}@beta`)
packagesToInstall.push(`@payloadcms/db-${dbType}@beta`)
let exitCode = 0
switch (packageManager) {
case 'npm': {
;({ exitCode } = await execa('npm', ['install', '--save', ...packagesToInstall], {
cwd: projectDir,
}))
break
}
case 'yarn':
case 'pnpm': {
;({ exitCode } = await execa(packageManager, ['add', ...packagesToInstall], {
cwd: projectDir,
}))
break
}
case 'bun': {
warning('Bun support is untested.')
;({ exitCode } = await execa('bun', ['add', ...packagesToInstall], { cwd: projectDir }))
break
}
}
// Match graphql version of @payloadcms/next
packagesToInstall.push('graphql@^16.8.1')
return { success: exitCode === 0 }
}
type NextAppDetails = {
hasTopLevelLayout: boolean
isSrcDir: boolean
nextAppDir?: string
nextConfigPath?: string
nextConfigType?: NextConfigType
return await installPackages({ packageManager, packagesToInstall, projectDir })
}
export async function getNextAppDetails(projectDir: string): Promise<NextAppDetails> {
@@ -267,6 +242,7 @@ export async function getNextAppDetails(projectDir: string): Promise<NextAppDeta
const nextConfigPath: string | undefined = (
await globby('next.config.*js', { absolute: true, cwd: projectDir })
)?.[0]
if (!nextConfigPath || nextConfigPath.length === 0) {
return {
hasTopLevelLayout: false,
@@ -275,6 +251,16 @@ export async function getNextAppDetails(projectDir: string): Promise<NextAppDeta
}
}
const packageObj = await fse.readJson(path.resolve(projectDir, 'package.json'))
if (packageObj.dependencies?.payload) {
return {
hasTopLevelLayout: false,
isPayloadInstalled: true,
isSrcDir,
nextConfigPath,
}
}
let nextAppDir: string | undefined = (
await globby(['**/app'], {
absolute: true,
@@ -288,7 +274,7 @@ export async function getNextAppDetails(projectDir: string): Promise<NextAppDeta
nextAppDir = undefined
}
const configType = await getProjectType(projectDir, nextConfigPath)
const configType = getProjectType({ nextConfigPath, packageObj })
const hasTopLevelLayout = nextAppDir
? fs.existsSync(path.resolve(nextAppDir, 'layout.tsx'))
@@ -297,7 +283,11 @@ export async function getNextAppDetails(projectDir: string): Promise<NextAppDeta
return { hasTopLevelLayout, isSrcDir, nextAppDir, nextConfigPath, nextConfigType: configType }
}
async function getProjectType(projectDir: string, nextConfigPath: string): Promise<'cjs' | 'esm'> {
function getProjectType(args: {
nextConfigPath: string
packageObj: Record<string, unknown>
}): 'cjs' | 'esm' {
const { nextConfigPath, packageObj } = args
if (nextConfigPath.endsWith('.mjs')) {
return 'esm'
}
@@ -305,7 +295,6 @@ async function getProjectType(projectDir: string, nextConfigPath: string): Promi
return 'cjs'
}
const packageObj = await fse.readJson(path.resolve(projectDir, 'package.json'))
const packageJsonType = packageObj.type
if (packageJsonType === 'module') {
return 'esm'

View File

@@ -0,0 +1,42 @@
import execa from 'execa'
import type { PackageManager } from '../types.js'
import { error, warning } from '../utils/log.js'
export async function installPackages(args: {
packageManager: PackageManager
packagesToInstall: string[]
projectDir: string
}) {
const { packageManager, packagesToInstall, projectDir } = args
let exitCode = 0
let stderr = ''
switch (packageManager) {
case 'npm': {
;({ exitCode, stderr } = await execa('npm', ['install', '--save', ...packagesToInstall], {
cwd: projectDir,
}))
break
}
case 'yarn':
case 'pnpm':
case 'bun': {
if (packageManager === 'bun') {
warning('Bun support is untested.')
}
;({ exitCode, stderr } = await execa(packageManager, ['add', ...packagesToInstall], {
cwd: projectDir,
}))
break
}
}
if (exitCode !== 0) {
error(`Unable to install packages. Error: ${stderr}`)
}
return { success: exitCode === 0 }
}

View File

@@ -15,7 +15,7 @@ export function validateTemplate(templateName: string): boolean {
export function getValidTemplates(): ProjectTemplate[] {
return [
{
name: 'blank-3.0',
name: 'blank',
type: 'starter',
description: 'Blank 3.0 Template',
url: 'https://github.com/payloadcms/payload/templates/blank-3.0#beta',
@@ -41,12 +41,12 @@ export function getValidTemplates(): ProjectTemplate[] {
// description: 'E-commerce Template',
// url: 'https://github.com/payloadcms/payload/templates/ecommerce',
// },
{
name: 'plugin',
type: 'plugin',
description: 'Template for creating a Payload plugin',
url: 'https://github.com/payloadcms/payload-plugin-template#beta',
},
// {
// name: 'plugin',
// type: 'plugin',
// description: 'Template for creating a Payload plugin',
// url: 'https://github.com/payloadcms/payload-plugin-template#beta',
// },
// {
// name: 'payload-demo',
// type: 'starter',

View File

@@ -0,0 +1,89 @@
import execa from 'execa'
import fse from 'fs-extra'
import { fileURLToPath } from 'node:url'
import path from 'path'
const filename = fileURLToPath(import.meta.url)
const dirname = path.dirname(filename)
import type { NextAppDetails } from '../types.js'
import { copyRecursiveSync } from '../utils/copy-recursive-sync.js'
import { info } from '../utils/log.js'
import { getPackageManager } from './get-package-manager.js'
import { installPackages } from './install-packages.js'
export async function updatePayloadInProject(
appDetails: NextAppDetails,
): Promise<{ message: string; success: boolean }> {
if (!appDetails.nextConfigPath) return { message: 'No Next.js config found', success: false }
const projectDir = path.dirname(appDetails.nextConfigPath)
const packageObj = (await fse.readJson(path.resolve(projectDir, 'package.json'))) as {
dependencies?: Record<string, string>
}
if (!packageObj?.dependencies) {
throw new Error('No package.json found in this project')
}
const payloadVersion = packageObj.dependencies?.payload
if (!payloadVersion) {
throw new Error('Payload is not installed in this project')
}
const packageManager = await getPackageManager({ projectDir })
// Fetch latest Payload version from npm
const { exitCode: getLatestVersionExitCode, stdout: latestPayloadVersion } = await execa('npm', [
'show',
'payload@beta',
'version',
])
if (getLatestVersionExitCode !== 0) {
throw new Error('Failed to fetch latest Payload version')
}
if (payloadVersion === latestPayloadVersion) {
return { message: `Payload v${payloadVersion} is already up to date.`, success: true }
}
// Update all existing Payload packages
const payloadPackages = Object.keys(packageObj.dependencies).filter((dep) =>
dep.startsWith('@payloadcms/'),
)
const packageNames = ['payload', ...payloadPackages]
const packagesToUpdate = packageNames.map((pkg) => `${pkg}@${latestPayloadVersion}`)
info(`Using ${packageManager}.\n`)
info(
`Updating ${packagesToUpdate.length} Payload packages to v${latestPayloadVersion}...\n\n${packageNames.map((p) => ` - ${p}`).join('\n')}`,
)
const { success: updateSuccess } = await installPackages({
packageManager,
packagesToInstall: packagesToUpdate,
projectDir,
})
if (!updateSuccess) {
throw new Error('Failed to update Payload packages')
}
info('Payload packages updated successfully.')
info(`Updating Payload Next.js files...`)
const templateFilesPath = dirname.endsWith('dist')
? path.resolve(dirname, '../..', 'dist/template')
: path.resolve(dirname, '../../../../templates/blank-3.0')
const templateSrcDir = path.resolve(templateFilesPath, 'src/app/(payload)')
copyRecursiveSync(
templateSrcDir,
path.resolve(projectDir, appDetails.isSrcDir ? 'src/app' : 'app', '(payload)'),
)
return { message: 'Payload updated successfully.', success: true }
}

View File

@@ -2,21 +2,21 @@ import * as p from '@clack/prompts'
import slugify from '@sindresorhus/slugify'
import arg from 'arg'
import chalk from 'chalk'
// @ts-expect-error no types
import { detect } from 'detect-package-manager'
import figures from 'figures'
import path from 'path'
import type { CliArgs, PackageManager } from './types.js'
import type { CliArgs } from './types.js'
import { configurePayloadConfig } from './lib/configure-payload-config.js'
import { createProject } from './lib/create-project.js'
import { generateSecret } from './lib/generate-secret.js'
import { getPackageManager } from './lib/get-package-manager.js'
import { getNextAppDetails, initNext } from './lib/init-next.js'
import { parseProjectName } from './lib/parse-project-name.js'
import { parseTemplate } from './lib/parse-template.js'
import { selectDb } from './lib/select-db.js'
import { getValidTemplates, validateTemplate } from './lib/templates.js'
import { updatePayloadInProject } from './lib/update-payload-in-project.js'
import { writeEnvFile } from './lib/write-env-file.js'
import { error, info } from './utils/log.js'
import {
@@ -85,7 +85,28 @@ export class Main {
// Detect if inside Next.js project
const nextAppDetails = await getNextAppDetails(process.cwd())
const { hasTopLevelLayout, nextAppDir, nextConfigPath } = nextAppDetails
const { hasTopLevelLayout, isPayloadInstalled, nextAppDir, nextConfigPath } = nextAppDetails
// Upgrade Payload in existing project
if (isPayloadInstalled && nextConfigPath) {
p.log.warn(`Payload installation detected in current project.`)
const shouldUpdate = await p.confirm({
initialValue: false,
message: chalk.bold(`Upgrade Payload in this project?`),
})
if (!p.isCancel(shouldUpdate) && shouldUpdate) {
const { message, success: updateSuccess } = await updatePayloadInProject(nextAppDetails)
if (updateSuccess) {
info(message)
} else {
error(message)
}
}
p.outro(feedbackOutro())
process.exit(0)
}
if (nextConfigPath) {
this.args['--name'] = slugify(path.basename(path.dirname(nextConfigPath)))
@@ -96,7 +117,7 @@ export class Main {
? path.dirname(nextConfigPath)
: path.resolve(process.cwd(), slugify(projectName))
const packageManager = await getPackageManager(this.args, projectDir)
const packageManager = await getPackageManager({ cliArgs: this.args, projectDir })
if (nextConfigPath) {
p.log.step(
@@ -212,19 +233,3 @@ export class Main {
}
}
}
async function getPackageManager(args: CliArgs, projectDir: string): Promise<PackageManager> {
let packageManager: PackageManager = 'npm'
if (args['--use-npm']) {
packageManager = 'npm'
} else if (args['--use-yarn']) {
packageManager = 'yarn'
} else if (args['--use-pnpm']) {
packageManager = 'pnpm'
} else {
const detected = await detect({ cwd: projectDir })
packageManager = detected || 'npm'
}
return packageManager
}

View File

@@ -65,3 +65,14 @@ export type DbDetails = {
}
export type EditorType = 'lexical' | 'slate'
export type NextAppDetails = {
hasTopLevelLayout: boolean
isPayloadInstalled?: boolean
isSrcDir: boolean
nextAppDir?: string
nextConfigPath?: string
nextConfigType?: NextConfigType
}
export type NextConfigType = 'cjs' | 'esm'

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-mongodb",
"version": "3.0.0-beta.33",
"version": "3.0.0-beta.39",
"description": "The officially supported MongoDB database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,22 +1,24 @@
/* eslint-disable no-restricted-syntax, no-await-in-loop */
import type { CreateMigration } from 'payload/database'
import type { CreateMigration, MigrationTemplateArgs } from 'payload/database'
import fs from 'fs'
import path from 'path'
import { getPredefinedMigration } from 'payload/database'
import { fileURLToPath } from 'url'
const migrationTemplate = (upSQL?: string, downSQL?: string) => `import {
const migrationTemplate = ({ downSQL, imports, upSQL }: MigrationTemplateArgs): string => `import {
MigrateUpArgs,
MigrateDownArgs,
} from "@payloadcms/db-mongodb";
} from '@payloadcms/db-mongodb'
${imports}
export async function up({ payload }: MigrateUpArgs): Promise<void> {
${upSQL ?? ` // Migration code`}
};
}
export async function down({ payload }: MigrateDownArgs): Promise<void> {
${downSQL ?? ` // Migration code`}
};
}
`
export const createMigration: CreateMigration = async function createMigration({
@@ -31,36 +33,14 @@ export const createMigration: CreateMigration = async function createMigration({
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir)
}
const predefinedMigration = await getPredefinedMigration({
dirname,
file,
migrationName,
payload,
})
let migrationFileContent: string | undefined
// Check for predefined migration.
// Either passed in via --file or prefixed with @payloadcms/db-mongodb/
if (file || migrationName?.startsWith('@payloadcms/db-mongodb/')) {
if (!file) file = migrationName
const predefinedMigrationName = file.replace('@payloadcms/db-mongodb/', '')
migrationName = predefinedMigrationName
const cleanPath = path.join(dirname, `../predefinedMigrations/${predefinedMigrationName}.js`)
// Check if predefined migration exists
if (fs.existsSync(cleanPath)) {
let migration = await eval(
`${typeof require === 'function' ? 'require' : 'import'}(${cleanPath})`,
)
if ('default' in migration) migration = migration.default
const { down, up } = migration
migrationFileContent = migrationTemplate(up, down)
} else {
payload.logger.error({
msg: `Canned migration ${predefinedMigrationName} not found.`,
})
process.exit(1)
}
} else {
migrationFileContent = migrationTemplate()
}
const migrationFileContent = migrationTemplate(predefinedMigration)
const [yyymmdd, hhmmss] = new Date().toISOString().split('T')
const formattedDate = yyymmdd.replace(/\D/g, '')

View File

@@ -193,17 +193,19 @@ export async function buildSearchParam({
if (field.type === 'relationship' || field.type === 'upload') {
let hasNumberIDRelation
let multiIDCondition = '$or'
if (operatorKey === '$ne') multiIDCondition = '$and'
const result = {
value: {
$or: [{ [path]: { [operatorKey]: formattedValue } }],
[multiIDCondition]: [{ [path]: { [operatorKey]: formattedValue } }],
},
}
if (typeof formattedValue === 'string') {
if (mongoose.Types.ObjectId.isValid(formattedValue)) {
result.value.$or.push({
[path]: { [operatorKey]: new ObjectId(formattedValue) },
result.value[multiIDCondition].push({
[path]: { [operatorKey]: ObjectId(formattedValue) },
})
} else {
;(Array.isArray(field.relationTo) ? field.relationTo : [field.relationTo]).forEach(
@@ -218,11 +220,13 @@ export async function buildSearchParam({
)
if (hasNumberIDRelation)
result.value.$or.push({ [path]: { [operatorKey]: parseFloat(formattedValue) } })
result.value[multiIDCondition].push({
[path]: { [operatorKey]: parseFloat(formattedValue) },
})
}
}
if (result.value.$or.length > 1) {
if (result.value[multiIDCondition].length > 1) {
return result
}
}
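
The `$or` → `multiIDCondition` switch above exists for `not_equals` queries: with `$ne`, a document should only match when every candidate representation of the ID (raw string, ObjectId, parsed number) differs, so the group of conditions flips from `$or` to `$and`. A minimal sketch of the resulting Mongo condition, assuming a hypothetical single `category` relationship and a value that happens to be a valid ObjectId:

```ts
import { Types } from 'mongoose'

// Hypothetical query value for a `relationship` field at path `category`
const formattedValue = '6655f1f0c1a4b2d3e4f5a6b7'

// operator $ne (not_equals): every ID form must mismatch → $and
const notEqualsCondition = {
  $and: [
    { category: { $ne: formattedValue } },
    { category: { $ne: new Types.ObjectId(formattedValue) } },
  ],
}

// any other operator (e.g. $eq) keeps the original $or grouping
const equalsCondition = {
  $or: [
    { category: { $eq: formattedValue } },
    { category: { $eq: new Types.ObjectId(formattedValue) } },
  ],
}
```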

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-postgres",
"version": "3.0.0-beta.33",
"version": "3.0.0-beta.39",
"description": "The officially supported Postgres database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {
@@ -21,6 +21,11 @@
"import": "./src/types.ts",
"require": "./src/types.ts",
"types": "./src/types.ts"
},
"./migration-utils": {
"import": "./src/exports/migration-utils.ts",
"require": "./src/exports/migration-utils.ts",
"types": "./src/exports/migration-utils.ts"
}
},
"main": "./src/index.ts",
@@ -30,11 +35,12 @@
"mock.js"
],
"scripts": {
"build": "pnpm build:swc && pnpm build:types",
"build": "pnpm build:swc && pnpm build:types && pnpm renamePredefinedMigrations",
"build:swc": "swc ./src -d ./dist --config-file .swcrc",
"build:types": "tsc --emitDeclarationOnly --outDir dist",
"clean": "rimraf {dist,*.tsbuildinfo}",
"prepublishOnly": "pnpm clean && pnpm turbo build"
"prepublishOnly": "pnpm clean && pnpm turbo build",
"renamePredefinedMigrations": "tsx ./scripts/renamePredefinedMigrations.ts"
},
"dependencies": {
"@libsql/client": "^0.5.2",
@@ -66,6 +72,11 @@
"import": "./dist/types.js",
"require": "./dist/types.js",
"types": "./dist/types.d.ts"
},
"./migration-utils": {
"import": "./dist/exports/migration-utils.js",
"require": "./dist/exports/migration-utils.js",
"types": "./dist/exports/migration-utils.d.ts"
}
},
"main": "./dist/index.js",

View File

@@ -0,0 +1,13 @@
const imports = `import { migratePostgresV2toV3 } from '@payloadcms/migratePostgresV2toV3'`;
const up = ` await migratePostgresV2toV3({
// enables logging of changes that will be made to the database
debug: false,
// skips calls that modify schema or data
dryRun: false,
payload,
req,
})
`;
export { imports, up };
//# sourceMappingURL=relationships-v2-v3.js.map

View File

@@ -0,0 +1,18 @@
import fs from 'fs'
import path from 'path'
/**
* Changes built .js files to .mjs for ESM imports
*/
const rename = () => {
fs.readdirSync(path.resolve('./dist/predefinedMigrations'))
.filter((f) => {
return f.endsWith('.js')
})
.forEach((file) => {
const newPath = path.join('./dist/predefinedMigrations', file)
fs.renameSync(newPath, newPath.replace('.js', '.mjs'))
})
}
rename()

View File

@@ -21,7 +21,7 @@ export const count: Count = async function count(
const db = this.sessions[req.transactionID]?.db || this.drizzle
const table = this.tables[tableName]
const { joinAliases, joins, where } = await buildQuery({
const { joins, where } = await buildQuery({
adapter: this,
fields: collectionConfig.fields,
locale,
@@ -31,13 +31,6 @@ export const count: Count = async function count(
const selectCountMethods: ChainedMethods = []
joinAliases.forEach(({ condition, table }) => {
selectCountMethods.push({
args: [table, condition],
method: 'leftJoin',
})
})
Object.entries(joins).forEach(([joinTable, condition]) => {
if (joinTable) {
selectCountMethods.push({

View File

@@ -1,40 +1,30 @@
/* eslint-disable no-restricted-syntax, no-await-in-loop */
import type { DrizzleSnapshotJSON } from 'drizzle-kit/payload'
import type { CreateMigration } from 'payload/database'
import type { CreateMigration, MigrationTemplateArgs } from 'payload/database'
import fs from 'fs'
import { createRequire } from 'module'
import path from 'path'
import { getPredefinedMigration } from 'payload/database'
import prompts from 'prompts'
import { fileURLToPath } from 'url'
import type { PostgresAdapter } from './types.js'
const require = createRequire(import.meta.url)
const migrationTemplate = (
upSQL?: string,
downSQL?: string,
) => `import { MigrateUpArgs, MigrateDownArgs, sql } from '@payloadcms/db-postgres'
export async function up({ payload }: MigrateUpArgs): Promise<void> {
${
upSQL
? `await payload.db.drizzle.execute(sql\`
${upSQL}\`);
`
: '// Migration code'
}
const migrationTemplate = ({
downSQL,
imports,
upSQL,
}: MigrationTemplateArgs): string => `import { MigrateUpArgs, MigrateDownArgs, sql } from '@payloadcms/db-postgres'
${imports ? `${imports}\n` : ''}
export async function up({ payload, req }: MigrateUpArgs): Promise<void> {
${upSQL}
};
export async function down({ payload }: MigrateDownArgs): Promise<void> {
${
downSQL
? `await payload.db.drizzle.execute(sql\`
${downSQL}\`);
`
: '// Migration code'
}
export async function down({ payload, req }: MigrateDownArgs): Promise<void> {
${downSQL}
};
`
@@ -55,78 +45,95 @@ const getDefaultDrizzleSnapshot = (): DrizzleSnapshotJSON => ({
export const createMigration: CreateMigration = async function createMigration(
this: PostgresAdapter,
{ forceAcceptWarning, migrationName, payload },
{ file, forceAcceptWarning, migrationName, payload },
) {
const filename = fileURLToPath(import.meta.url)
const dirname = path.dirname(filename)
const dir = payload.db.migrationDir
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir)
}
const { generateDrizzleJson, generateMigration } = require('drizzle-kit/payload')
const drizzleJsonAfter = generateDrizzleJson(this.schema)
const [yyymmdd, hhmmss] = new Date().toISOString().split('T')
const formattedDate = yyymmdd.replace(/\D/g, '')
const formattedTime = hhmmss.split('.')[0].replace(/\D/g, '')
let imports: string = ''
let downSQL: string
let upSQL: string
;({ downSQL, imports, upSQL } = await getPredefinedMigration({
dirname,
file,
migrationName,
payload,
}))
const timestamp = `${formattedDate}_${formattedTime}`
const fileName = migrationName
? `${timestamp}_${migrationName.replace(/\W/g, '_')}`
: `${timestamp}`
const name = migrationName || file?.split('/').slice(2).join('/')
const fileName = `${timestamp}${name ? `_${name.replace(/\W/g, '_')}` : ''}`
const filePath = `${dir}/${fileName}`
let drizzleJsonBefore = getDefaultDrizzleSnapshot()
// Get latest migration snapshot
const latestSnapshot = fs
.readdirSync(dir)
.filter((file) => file.endsWith('.json'))
.sort()
.reverse()?.[0]
if (!upSQL) {
// Get latest migration snapshot
const latestSnapshot = fs
.readdirSync(dir)
.filter((file) => file.endsWith('.json'))
.sort()
.reverse()?.[0]
if (latestSnapshot) {
const latestSnapshotJSON = JSON.parse(
fs.readFileSync(`${dir}/${latestSnapshot}`, 'utf8'),
) as DrizzleSnapshotJSON
drizzleJsonBefore = latestSnapshotJSON
}
const drizzleJsonAfter = generateDrizzleJson(this.schema)
const sqlStatementsUp = await generateMigration(drizzleJsonBefore, drizzleJsonAfter)
const sqlStatementsDown = await generateMigration(drizzleJsonAfter, drizzleJsonBefore)
if (!sqlStatementsUp.length && !sqlStatementsDown.length && !forceAcceptWarning) {
const { confirm: shouldCreateBlankMigration } = await prompts(
{
name: 'confirm',
type: 'confirm',
initial: false,
message: 'No schema changes detected. Would you like to create a blank migration file?',
},
{
onCancel: () => {
process.exit(0)
},
},
)
if (!shouldCreateBlankMigration) {
process.exit(0)
if (latestSnapshot) {
drizzleJsonBefore = JSON.parse(
fs.readFileSync(`${dir}/${latestSnapshot}`, 'utf8'),
) as DrizzleSnapshotJSON
}
}
// write schema
fs.writeFileSync(`${filePath}.json`, JSON.stringify(drizzleJsonAfter, null, 2))
const sqlStatementsUp = await generateMigration(drizzleJsonBefore, drizzleJsonAfter)
const sqlStatementsDown = await generateMigration(drizzleJsonAfter, drizzleJsonBefore)
const sqlExecute = 'await payload.db.drizzle.execute(sql`'
if (sqlStatementsUp?.length) {
upSQL = `${sqlExecute}\n ${sqlStatementsUp?.join('\n')}\`)`
}
if (sqlStatementsDown?.length) {
downSQL = `${sqlExecute}\n ${sqlStatementsDown?.join('\n')}\`)`
}
if (!upSQL?.length && !downSQL?.length && !forceAcceptWarning) {
const { confirm: shouldCreateBlankMigration } = await prompts(
{
name: 'confirm',
type: 'confirm',
initial: false,
message: 'No schema changes detected. Would you like to create a blank migration file?',
},
{
onCancel: () => {
process.exit(0)
},
},
)
if (!shouldCreateBlankMigration) {
process.exit(0)
}
}
// write schema
fs.writeFileSync(`${filePath}.json`, JSON.stringify(drizzleJsonAfter, null, 2))
}
// write migration
fs.writeFileSync(
`${filePath}.ts`,
migrationTemplate(
sqlStatementsUp.length ? sqlStatementsUp?.join('\n') : undefined,
sqlStatementsDown.length ? sqlStatementsDown?.join('\n') : undefined,
),
migrationTemplate({
downSQL: downSQL || ` // Migration code`,
imports,
upSQL: upSQL || ` // Migration code`,
}),
)
payload.logger.info({ msg: `Migration created at ${filePath}.ts` })
}

View File

@@ -45,17 +45,12 @@ export async function createVersion<T extends TypeWithID>(
const table = this.tables[tableName]
const relationshipsTable = this.tables[`${tableName}${this.relationshipsSuffix}`]
if (collection.versions.drafts) {
await db.execute(sql`
UPDATE ${table}
SET latest = false
FROM ${relationshipsTable}
WHERE ${table.id} = ${relationshipsTable.parent}
AND ${relationshipsTable.path} = ${'parent'}
AND ${relationshipsTable[`${collectionSlug}ID`]} = ${parent}
AND ${table.id} != ${result.id};
WHERE ${table.id} != ${result.id}
AND ${table.parent} = ${parent}
`)
}

View File

@@ -22,7 +22,7 @@ export const deleteOne: DeleteOne = async function deleteOne(
let docToDelete: Record<string, unknown>
const { joinAliases, joins, selectFields, where } = await buildQuery({
const { joins, selectFields, where } = await buildQuery({
adapter: this,
fields: collection.fields,
locale: req.locale,
@@ -34,7 +34,6 @@ export const deleteOne: DeleteOne = async function deleteOne(
adapter: this,
chainedMethods: [{ args: [1], method: 'limit' }],
db,
joinAliases,
joins,
selectFields,
tableName,
@@ -59,6 +58,7 @@ export const deleteOne: DeleteOne = async function deleteOne(
}
const result = transform({
adapter: this,
config: this.payload.config,
data: docToDelete,
fields: collection.fields,

View File

@@ -0,0 +1 @@
export { migratePostgresV2toV3 } from '../predefinedMigrations/v2-v3/index.js'

View File

@@ -12,7 +12,11 @@ type BuildFindQueryArgs = {
tableName: string
}
export type Result = DBQueryConfig<'many', true, any, any>
export type Result = DBQueryConfig<'many', true, any, any> & {
with?: DBQueryConfig<'many', true, any, any> & {
_locales?: DBQueryConfig<'many', true, any, any>
}
}
// Generate the Drizzle query for findMany based on
// a collection field structure
@@ -31,6 +35,7 @@ export const buildFindManyArgs = ({
id: false,
_parentID: false,
},
with: {},
}
if (adapter.tables[`${tableName}_texts`]) {

View File

@@ -41,7 +41,7 @@ export const findMany = async function find({
let hasNextPage: boolean
let pagingCounter: number
const { joinAliases, joins, orderBy, selectFields, where } = await buildQuery({
const { joins, orderBy, selectFields, where } = await buildQuery({
adapter,
fields,
locale,
@@ -76,7 +76,6 @@ export const findMany = async function find({
adapter,
chainedMethods: selectDistinctMethods,
db,
joinAliases,
joins,
selectFields,
tableName,
@@ -122,29 +121,19 @@ export const findMany = async function find({
if (pagination !== false && (orderedIDs ? orderedIDs?.length <= limit : true)) {
const selectCountMethods: ChainedMethods = []
joinAliases.forEach(({ condition, table }) => {
joins.forEach(({ condition, table }) => {
selectCountMethods.push({
args: [table, condition],
method: 'leftJoin',
})
})
Object.entries(joins).forEach(([joinTable, condition]) => {
if (joinTable) {
selectCountMethods.push({
args: [adapter.tables[joinTable], condition],
method: 'leftJoin',
})
}
})
const countResult = await chainMethods({
methods: selectCountMethods,
query: db
.select({
count: sql<number>`count
(DISTINCT ${adapter.tables[tableName].id})`,
(DISTINCT ${adapter.tables[tableName].id})`,
})
.from(table)
.where(where),
@@ -172,6 +161,7 @@ export const findMany = async function find({
const docs = rawDocs.map((data: TypeWithID) => {
return transform({
adapter,
config: adapter.payload.config,
data,
fields,

View File

@@ -8,9 +8,9 @@ import type { PostgresAdapter } from '../types.js'
import type { Result } from './buildFindManyArgs.js'
type TraverseFieldArgs = {
_locales: Record<string, unknown>
_locales: Result
adapter: PostgresAdapter
currentArgs: Record<string, unknown>
currentArgs: Result
currentTableName: string
depth?: number
fields: Field[]
@@ -31,6 +31,19 @@ export const traverseFields = ({
topLevelTableName,
}: TraverseFieldArgs) => {
fields.forEach((field) => {
// handle simple relationship
if (
depth > 0 &&
(field.type === 'upload' ||
(field.type === 'relationship' && !field.hasMany && typeof field.relationTo === 'string'))
) {
if (field.localized) {
_locales.with[`${path}${field.name}`] = true
} else {
currentArgs.with[`${path}${field.name}`] = true
}
}
if (field.type === 'collapsible' || field.type === 'row') {
traverseFields({
_locales,
@@ -84,11 +97,19 @@ export const traverseFields = ({
const arrayTableNameWithLocales = `${arrayTableName}${adapter.localesSuffix}`
if (adapter.tables[arrayTableNameWithLocales]) withArray.with._locales = _locales
if (adapter.tables[arrayTableNameWithLocales]) {
withArray.with._locales = {
columns: {
id: false,
_parentID: false,
},
with: {},
}
}
currentArgs.with[`${path}${field.name}`] = withArray
traverseFields({
_locales,
_locales: withArray.with._locales,
adapter,
currentArgs: withArray,
currentTableName: arrayTableName,
@@ -137,12 +158,14 @@ export const traverseFields = ({
)
if (adapter.tables[`${tableName}${adapter.localesSuffix}`]) {
withBlock.with._locales = _locales
withBlock.with._locales = {
with: {},
}
}
topLevelArgs.with[blockKey] = withBlock
traverseFields({
_locales,
_locales: withBlock.with._locales,
adapter,
currentArgs: withBlock,
currentTableName: tableName,
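
In practice this means simple relationships are now fetched through Drizzle's relational `with` includes instead of the `_rels` join table. A rough sketch (hypothetical collection with a non-localized `category` relationship and a localized `image` upload) of the shape `buildFindManyArgs`/`traverseFields` end up producing:

```ts
// Sketch only — field names and nesting are illustrative, not taken from a real config.
const findManyArgs = {
  with: {
    // hasMany: false relationship stored as a plain column → direct relational include
    category: true,
    // localized fields still hang off the _locales relation
    _locales: {
      columns: {
        id: false,
        _parentID: false,
      },
      with: {
        image: true, // e.g. a localized upload field
      },
    },
  },
}
```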

View File

@@ -4,6 +4,7 @@ import type { SanitizedCollectionConfig } from 'payload/types'
import { pgEnum, pgSchema, pgTable } from 'drizzle-orm/pg-core'
import { buildVersionCollectionFields, buildVersionGlobalFields } from 'payload/versions'
import toSnakeCase from 'to-snake-case'
import type { PostgresAdapter } from './types.js'
@@ -25,16 +26,25 @@ export const init: Init = function init(this: PostgresAdapter) {
}
this.payload.config.collections.forEach((collection: SanitizedCollectionConfig) => {
const tableName = createTableName({
createTableName({
adapter: this,
config: collection,
})
if (collection.versions) {
createTableName({
adapter: this,
config: collection,
versions: true,
versionsCustomName: true,
})
}
})
this.payload.config.collections.forEach((collection: SanitizedCollectionConfig) => {
const tableName = this.tableNameMap.get(toSnakeCase(collection.slug))
buildTable({
adapter: this,
buildNumbers: true,
buildRelationships: true,
buildTexts: true,
disableNotNull: !!collection?.versions?.drafts,
disableUnique: false,
fields: collection.fields,
@@ -44,19 +54,13 @@ export const init: Init = function init(this: PostgresAdapter) {
})
if (collection.versions) {
const versionsTableName = createTableName({
adapter: this,
config: collection,
versions: true,
versionsCustomName: true,
})
const versionsTableName = this.tableNameMap.get(
`_${toSnakeCase(collection.slug)}${this.versionsSuffix}`,
)
const versionFields = buildVersionCollectionFields(collection)
buildTable({
adapter: this,
buildNumbers: true,
buildRelationships: true,
buildTexts: true,
disableNotNull: !!collection.versions?.drafts,
disableUnique: true,
fields: versionFields,
@@ -72,9 +76,6 @@ export const init: Init = function init(this: PostgresAdapter) {
buildTable({
adapter: this,
buildNumbers: true,
buildRelationships: true,
buildTexts: true,
disableNotNull: !!global?.versions?.drafts,
disableUnique: false,
fields: global.fields,
@@ -94,9 +95,6 @@ export const init: Init = function init(this: PostgresAdapter) {
buildTable({
adapter: this,
buildNumbers: true,
buildRelationships: true,
buildTexts: true,
disableNotNull: !!global.versions?.drafts,
disableUnique: true,
fields: versionFields,

View File

@@ -0,0 +1,10 @@
const imports = `import { migratePostgresV2toV3 } from '@payloadcms/db-postgres/migration-utils'`
const upSQL = ` await migratePostgresV2toV3({
// enables logging of changes that will be made to the database
debug: false,
payload,
req,
})
`
export { imports, upSQL }
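
Feeding this predefined migration through the reworked `migrationTemplate` above should yield a migration file roughly like the sketch below. This is assembled by hand from the template and the `imports`/`upSQL` exports, not copied from CLI output, and the timestamped filename is hypothetical.

```ts
// e.g. src/migrations/20240530_143000_relationships_v2_v3.ts (hypothetical path/name)
import { MigrateUpArgs, MigrateDownArgs, sql } from '@payloadcms/db-postgres'
import { migratePostgresV2toV3 } from '@payloadcms/db-postgres/migration-utils'

export async function up({ payload, req }: MigrateUpArgs): Promise<void> {
  await migratePostgresV2toV3({
    // enables logging of changes that will be made to the database
    debug: false,
    payload,
    req,
  })
}

export async function down({ payload, req }: MigrateDownArgs): Promise<void> {
  // Migration code
}
```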

View File

@@ -0,0 +1,237 @@
import type { Payload } from 'payload'
import type { Field, PayloadRequestWithData } from 'payload/types'
import type { DrizzleTransaction, PostgresAdapter } from '../../../types.js'
import type { DocsToResave } from '../types.js'
import { upsertRow } from '../../../upsertRow/index.js'
import { traverseFields } from './traverseFields.js'
type Args = {
adapter: PostgresAdapter
collectionSlug?: string
db: DrizzleTransaction
debug: boolean
docsToResave: DocsToResave
fields: Field[]
globalSlug?: string
isVersions: boolean
payload: Payload
req: PayloadRequestWithData
tableName: string
}
export const fetchAndResave = async ({
adapter,
collectionSlug,
db,
debug,
docsToResave,
fields,
globalSlug,
isVersions,
payload,
req,
tableName,
}: Args) => {
for (const [id, rows] of Object.entries(docsToResave)) {
if (collectionSlug) {
const collectionConfig = payload.collections[collectionSlug].config
if (collectionConfig) {
if (isVersions) {
const doc = await payload.findVersionByID({
id,
collection: collectionSlug,
depth: 0,
fallbackLocale: null,
locale: 'all',
req,
showHiddenFields: true,
})
if (debug) {
payload.logger.info(
`The collection "${collectionConfig.slug}" version with ID ${id} will be migrated`,
)
}
traverseFields({
doc,
fields,
path: '',
rows,
})
try {
await upsertRow({
id: doc.id,
adapter,
data: doc,
db,
fields,
ignoreResult: true,
operation: 'update',
req,
tableName,
})
} catch (err) {
payload.logger.error(
`"${collectionConfig.slug}" version with ID ${doc.id} FAILED TO MIGRATE`,
)
throw err
}
if (debug) {
payload.logger.info(
`"${collectionConfig.slug}" version with ID ${doc.id} migrated successfully!`,
)
}
} else {
const doc = await payload.findByID({
id,
collection: collectionSlug,
depth: 0,
fallbackLocale: null,
locale: 'all',
req,
showHiddenFields: true,
})
if (debug) {
payload.logger.info(
`The collection "${collectionConfig.slug}" with ID ${doc.id} will be migrated`,
)
}
traverseFields({
doc,
fields,
path: '',
rows,
})
try {
await upsertRow({
id: doc.id,
adapter,
data: doc,
db,
fields,
ignoreResult: true,
operation: 'update',
req,
tableName,
})
} catch (err) {
payload.logger.error(
`The collection "${collectionConfig.slug}" with ID ${doc.id} has FAILED TO MIGRATE`,
)
throw err
}
if (debug) {
payload.logger.info(
`The collection "${collectionConfig.slug}" with ID ${doc.id} has migrated successfully!`,
)
}
}
}
}
if (globalSlug) {
const globalConfig = payload.config.globals?.find((global) => global.slug === globalSlug)
if (globalConfig) {
if (isVersions) {
const { docs } = await payload.findGlobalVersions({
slug: globalSlug,
depth: 0,
fallbackLocale: null,
limit: 0,
locale: 'all',
req,
showHiddenFields: true,
})
if (debug) {
payload.logger.info(`${docs.length} global "${globalSlug}" versions will be migrated`)
}
for (const doc of docs) {
traverseFields({
doc,
fields,
path: '',
rows,
})
try {
await upsertRow({
id: doc.id,
adapter,
data: doc,
db,
fields,
ignoreResult: true,
operation: 'update',
req,
tableName,
})
} catch (err) {
payload.logger.error(`"${globalSlug}" version with ID ${doc.id} FAILED TO MIGRATE`)
throw err
}
if (debug) {
payload.logger.info(
`"${globalSlug}" version with ID ${doc.id} migrated successfully!`,
)
}
}
} else {
const doc = await payload.findGlobal({
slug: globalSlug,
depth: 0,
fallbackLocale: null,
locale: 'all',
req,
showHiddenFields: true,
})
traverseFields({
doc,
fields,
path: '',
rows,
})
try {
await upsertRow({
id: doc.id,
adapter,
data: doc,
db,
fields,
ignoreResult: true,
operation: 'update',
req,
tableName,
})
} catch (err) {
payload.logger.error(`The global "${globalSlug}" has FAILED TO MIGRATE`)
throw err
}
if (debug) {
payload.logger.info(`The global "${globalSlug}" has migrated successfully!`)
}
}
}
}
}
}

View File

@@ -0,0 +1,215 @@
import type { Field } from 'payload/types'
import { tabHasName } from 'payload/types'
type Args = {
doc: Record<string, unknown>
fields: Field[]
locale?: string
path: string
rows: Record<string, unknown>[]
}
export const traverseFields = ({ doc, fields, locale, path, rows }: Args) => {
fields.forEach((field) => {
switch (field.type) {
case 'group': {
const newPath = `${path ? `${path}.` : ''}${field.name}`
const newDoc = doc?.[field.name]
if (typeof newDoc === 'object' && newDoc !== null) {
if (field.localized) {
Object.entries(newDoc).forEach(([locale, localeDoc]) => {
return traverseFields({
doc: localeDoc,
fields: field.fields,
locale,
path: newPath,
rows,
})
})
} else {
return traverseFields({
doc: newDoc as Record<string, unknown>,
fields: field.fields,
path: newPath,
rows,
})
}
}
break
}
case 'row':
case 'collapsible': {
return traverseFields({
doc,
fields: field.fields,
path,
rows,
})
}
case 'array': {
const rowData = doc?.[field.name]
if (field.localized && typeof rowData === 'object' && rowData !== null) {
Object.entries(rowData).forEach(([locale, localeRows]) => {
if (Array.isArray(localeRows)) {
localeRows.forEach((row, i) => {
return traverseFields({
doc: row as Record<string, unknown>,
fields: field.fields,
locale,
path: `${path ? `${path}.` : ''}${field.name}.${i}`,
rows,
})
})
}
})
}
if (Array.isArray(rowData)) {
rowData.forEach((row, i) => {
return traverseFields({
doc: row as Record<string, unknown>,
fields: field.fields,
path: `${path ? `${path}.` : ''}${field.name}.${i}`,
rows,
})
})
}
break
}
case 'blocks': {
const rowData = doc?.[field.name]
if (field.localized && typeof rowData === 'object' && rowData !== null) {
Object.entries(rowData).forEach(([locale, localeRows]) => {
if (Array.isArray(localeRows)) {
localeRows.forEach((row, i) => {
const matchedBlock = field.blocks.find((block) => block.slug === row.blockType)
if (matchedBlock) {
return traverseFields({
doc: row as Record<string, unknown>,
fields: matchedBlock.fields,
locale,
path: `${path ? `${path}.` : ''}${field.name}.${i}`,
rows,
})
}
})
}
})
}
if (Array.isArray(rowData)) {
rowData.forEach((row, i) => {
const matchedBlock = field.blocks.find((block) => block.slug === row.blockType)
if (matchedBlock) {
return traverseFields({
doc: row as Record<string, unknown>,
fields: matchedBlock.fields,
path: `${path ? `${path}.` : ''}${field.name}.${i}`,
rows,
})
}
})
}
break
}
case 'tabs': {
return field.tabs.forEach((tab) => {
if (tabHasName(tab)) {
const newDoc = doc?.[tab.name]
const newPath = `${path ? `${path}.` : ''}${tab.name}`
if (typeof newDoc === 'object' && newDoc !== null) {
if (tab.localized) {
Object.entries(newDoc).forEach(([locale, localeDoc]) => {
return traverseFields({
doc: localeDoc,
fields: tab.fields,
locale,
path: newPath,
rows,
})
})
} else {
return traverseFields({
doc: newDoc as Record<string, unknown>,
fields: tab.fields,
path: newPath,
rows,
})
}
}
} else {
traverseFields({
doc,
fields: tab.fields,
path,
rows,
})
}
})
}
case 'relationship':
case 'upload': {
if (typeof field.relationTo === 'string') {
if (field.type === 'upload' || !field.hasMany) {
const relationshipPath = `${path ? `${path}.` : ''}${field.name}`
if (field.localized) {
const matchedRelationshipsWithLocales = rows.filter(
(row) => row.path === relationshipPath,
)
if (matchedRelationshipsWithLocales.length && !doc[field.name]) {
doc[field.name] = {}
}
const newDoc = doc[field.name] as Record<string, unknown>
matchedRelationshipsWithLocales.forEach((localeRow) => {
if (typeof localeRow.locale === 'string') {
const [, id] = Object.entries(localeRow).find(
([key, val]) =>
val !== null && !['id', 'locale', 'order', 'parent_id', 'path'].includes(key),
)
newDoc[localeRow.locale] = id
}
})
} else {
const matchedRelationship = rows.find((row) => {
const matchesPath = row.path === relationshipPath
if (locale) return matchesPath && locale === row.locale
return row.path === relationshipPath
})
if (matchedRelationship) {
const [, id] = Object.entries(matchedRelationship).find(
([key, val]) =>
val !== null && !['id', 'locale', 'order', 'parent_id', 'path'].includes(key),
)
doc[field.name] = id
}
}
}
}
}
}
})
}

View File

@@ -0,0 +1,74 @@
export type Groups =
| 'addColumn'
| 'addConstraint'
| 'dropColumn'
| 'dropConstraint'
| 'dropTable'
| 'notNull'
/**
* Convert an "ADD COLUMN" statement to an "ALTER COLUMN" statement
* example: ALTER TABLE "pages_blocks_my_block" ADD COLUMN "person_id" integer NOT NULL;
* to: ALTER TABLE "pages_blocks_my_block" ALTER COLUMN "person_id" SET NOT NULL;
* @param sql
*/
function convertAddColumnToAlterColumn(sql) {
// Regular expression to match the ADD COLUMN statement with its constraints
const regex = /ALTER TABLE ("[^"]+") ADD COLUMN ("[^"]+") [\w\s]+ NOT NULL;/
// Replace the matched part with "ALTER COLUMN ... SET NOT NULL;"
return sql.replace(regex, 'ALTER TABLE $1 ALTER COLUMN $2 SET NOT NULL;')
}
export const groupUpSQLStatements = (list: string[]): Record<Groups, string[]> => {
const groups = {
addColumn: 'ADD COLUMN',
// example: ALTER TABLE "posts" ADD COLUMN "category_id" integer
addConstraint: 'ADD CONSTRAINT',
//example:
// DO $$ BEGIN
// ALTER TABLE "pages_blocks_my_block" ADD CONSTRAINT "pages_blocks_my_block_person_id_users_id_fk" FOREIGN KEY ("person_id") REFERENCES "users"("id") ON DELETE cascade ON UPDATE no action;
// EXCEPTION
// WHEN duplicate_object THEN null;
// END $$;
dropColumn: 'DROP COLUMN',
// example: ALTER TABLE "_posts_v_rels" DROP COLUMN IF EXISTS "posts_id";
dropConstraint: 'DROP CONSTRAINT',
// example: ALTER TABLE "_posts_v_rels" DROP CONSTRAINT "_posts_v_rels_posts_fk";
dropTable: 'DROP TABLE',
// example: DROP TABLE "pages_rels";
notNull: 'NOT NULL',
// example: ALTER TABLE "pages_blocks_my_block" ALTER COLUMN "person_id" SET NOT NULL;
}
const result = Object.keys(groups).reduce((result, group: Groups) => {
result[group] = []
return result
}, {}) as Record<Groups, string[]>
for (const line of list) {
Object.entries(groups).some(([key, value]) => {
if (line.endsWith('NOT NULL;')) {
// split up the ADD COLUMN and ALTER COLUMN NOT NULL statements
// example: ALTER TABLE "pages_blocks_my_block" ADD COLUMN "person_id" integer NOT NULL;
// becomes two separate statements:
// 1. ALTER TABLE "pages_blocks_my_block" ADD COLUMN "person_id" integer;
// 2. ALTER TABLE "pages_blocks_my_block" ALTER COLUMN "person_id" SET NOT NULL;
result.addColumn.push(line.replace(' NOT NULL;', ';'))
result.notNull.push(convertAddColumnToAlterColumn(line))
return true
}
if (line.includes(value)) {
result[key].push(line)
return true
}
})
}
return result
}
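
For illustration, here is how `groupUpSQLStatements` would bucket a couple of hypothetical statements; note that the NOT NULL addition is split in two so data can be moved into the new column before the constraint is enforced (the import path is assumed):

```ts
import { groupUpSQLStatements } from './groupUpSQLStatements.js'

const grouped = groupUpSQLStatements([
  'ALTER TABLE "posts" ADD COLUMN "category_id" integer NOT NULL;',
  'DROP TABLE "posts_rels";',
])

// grouped.addColumn → ['ALTER TABLE "posts" ADD COLUMN "category_id" integer;']
// grouped.notNull   → ['ALTER TABLE "posts" ALTER COLUMN "category_id" SET NOT NULL;']
// grouped.dropTable → ['DROP TABLE "posts_rels";']
```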

View File

@@ -0,0 +1,278 @@
import type { DrizzleSnapshotJSON } from 'drizzle-kit/payload'
import type { Payload } from 'payload'
import type { PayloadRequestWithData } from 'payload/types'
import { sql } from 'drizzle-orm'
import fs from 'fs'
import { createRequire } from 'module'
import { buildVersionCollectionFields, buildVersionGlobalFields } from 'payload/versions'
import toSnakeCase from 'to-snake-case'
import type { PostgresAdapter } from '../../types.js'
import type { PathsToQuery } from './types.js'
import { groupUpSQLStatements } from './groupUpSQLStatements.js'
import { migrateRelationships } from './migrateRelationships.js'
import { traverseFields } from './traverseFields.js'
const require = createRequire(import.meta.url)
type Args = {
debug?: boolean
payload: Payload
req: PayloadRequestWithData
}
/**
* Moves upload and relationship columns from the join table and into the tables while moving data
* This is done in the following order:
* ADD COLUMNs
* -- manipulate data to move relationships to new columns
* ADD CONSTRAINTs
* NOT NULLs
* DROP TABLEs
* DROP CONSTRAINTs
* DROP COLUMNs
* @param debug
* @param payload
* @param req
*/
export const migratePostgresV2toV3 = async ({ debug, payload, req }: Args) => {
const adapter = payload.db as PostgresAdapter
const db = adapter.sessions[req.transactionID]?.db
const dir = payload.db.migrationDir
// get the drizzle migrateUpSQL from drizzle using the last schema
const { generateDrizzleJson, generateMigration } = require('drizzle-kit/payload')
const drizzleJsonAfter = generateDrizzleJson(adapter.schema)
// Get latest migration snapshot
const latestSnapshot = fs
.readdirSync(dir)
.filter((file) => file.endsWith('.json'))
.sort()
.reverse()?.[0]
if (!latestSnapshot) {
throw new Error(
`No previous migration schema file found! A prior migration from v2 is required to migrate to v3.`,
)
}
const drizzleJsonBefore = JSON.parse(
fs.readFileSync(`${dir}/${latestSnapshot}`, 'utf8'),
) as DrizzleSnapshotJSON
const generatedSQL = await generateMigration(drizzleJsonBefore, drizzleJsonAfter)
if (!generatedSQL.length) {
payload.logger.info(`No schema changes needed.`)
process.exit(0)
}
const sqlUpStatements = groupUpSQLStatements(generatedSQL)
const addColumnsStatement = sqlUpStatements.addColumn.join('\n')
if (debug) {
payload.logger.info('CREATING NEW RELATIONSHIP COLUMNS')
payload.logger.info(addColumnsStatement)
}
await db.execute(sql.raw(addColumnsStatement))
for (const collection of payload.config.collections) {
const tableName = adapter.tableNameMap.get(toSnakeCase(collection.slug))
const pathsToQuery: PathsToQuery = new Set()
traverseFields({
adapter,
collectionSlug: collection.slug,
columnPrefix: '',
db,
disableNotNull: false,
fields: collection.fields,
isVersions: false,
newTableName: tableName,
parentTableName: tableName,
path: '',
pathsToQuery,
payload,
rootTableName: tableName,
})
await migrateRelationships({
adapter,
collectionSlug: collection.slug,
db,
debug,
fields: collection.fields,
isVersions: false,
pathsToQuery,
payload,
req,
tableName,
})
if (collection.versions) {
const versionsTableName = adapter.tableNameMap.get(
`_${toSnakeCase(collection.slug)}${adapter.versionsSuffix}`,
)
const versionFields = buildVersionCollectionFields(collection)
const versionPathsToQuery: PathsToQuery = new Set()
traverseFields({
adapter,
collectionSlug: collection.slug,
columnPrefix: '',
db,
disableNotNull: true,
fields: versionFields,
isVersions: true,
newTableName: versionsTableName,
parentTableName: versionsTableName,
path: '',
pathsToQuery: versionPathsToQuery,
payload,
rootTableName: versionsTableName,
})
await migrateRelationships({
adapter,
collectionSlug: collection.slug,
db,
debug,
fields: versionFields,
isVersions: true,
pathsToQuery: versionPathsToQuery,
payload,
req,
tableName: versionsTableName,
})
}
}
for (const global of payload.config.globals) {
const tableName = adapter.tableNameMap.get(toSnakeCase(global.slug))
const pathsToQuery: PathsToQuery = new Set()
traverseFields({
adapter,
columnPrefix: '',
db,
disableNotNull: false,
fields: global.fields,
globalSlug: global.slug,
isVersions: false,
newTableName: tableName,
parentTableName: tableName,
path: '',
pathsToQuery,
payload,
rootTableName: tableName,
})
await migrateRelationships({
adapter,
db,
debug,
fields: global.fields,
globalSlug: global.slug,
isVersions: false,
pathsToQuery,
payload,
req,
tableName,
})
if (global.versions) {
const versionsTableName = adapter.tableNameMap.get(
`_${toSnakeCase(global.slug)}${adapter.versionsSuffix}`,
)
const versionFields = buildVersionGlobalFields(global)
const versionPathsToQuery: PathsToQuery = new Set()
traverseFields({
adapter,
columnPrefix: '',
db,
disableNotNull: true,
fields: versionFields,
globalSlug: global.slug,
isVersions: true,
newTableName: versionsTableName,
parentTableName: versionsTableName,
path: '',
pathsToQuery: versionPathsToQuery,
payload,
rootTableName: versionsTableName,
})
await migrateRelationships({
adapter,
db,
debug,
fields: versionFields,
globalSlug: global.slug,
isVersions: true,
pathsToQuery: versionPathsToQuery,
payload,
req,
tableName: versionsTableName,
})
}
}
// ADD CONSTRAINT
const addConstraintsStatement = sqlUpStatements.addConstraint.join('\n')
if (debug) {
payload.logger.info('ADDING CONSTRAINTS')
payload.logger.info(addConstraintsStatement)
}
await db.execute(sql.raw(addConstraintsStatement))
// NOT NULL
const notNullStatements = sqlUpStatements.notNull.join('\n')
if (debug) {
payload.logger.info('NOT NULL CONSTRAINTS')
payload.logger.info(notNullStatements)
}
await db.execute(sql.raw(notNullStatements))
// DROP TABLE
const dropTablesStatement = sqlUpStatements.dropTable.join('\n')
if (debug) {
payload.logger.info('DROPPING TABLES')
payload.logger.info(dropTablesStatement)
}
await db.execute(sql.raw(dropTablesStatement))
// DROP CONSTRAINT
const dropConstraintsStatement = sqlUpStatements.dropConstraint.join('\n')
if (debug) {
payload.logger.info('DROPPING CONSTRAINTS')
payload.logger.info(dropConstraintsStatement)
}
await db.execute(sql.raw(dropConstraintsStatement))
// DROP COLUMN
const dropColumnsStatement = sqlUpStatements.dropColumn.join('\n')
if (debug) {
payload.logger.info('DROPPING COLUMNS')
payload.logger.info(dropColumnsStatement)
}
await db.execute(sql.raw(dropColumnsStatement))
}

View File

@@ -0,0 +1,102 @@
import type { Field, Payload, PayloadRequestWithData } from 'payload/types'
import { sql } from 'drizzle-orm'
import type { DrizzleTransaction, PostgresAdapter } from '../../types.js'
import type { DocsToResave, PathsToQuery } from './types.js'
import { fetchAndResave } from './fetchAndResave/index.js'
type Args = {
adapter: PostgresAdapter
collectionSlug?: string
db: DrizzleTransaction
debug: boolean
fields: Field[]
globalSlug?: string
isVersions: boolean
pathsToQuery: PathsToQuery
payload: Payload
req: PayloadRequestWithData
tableName: string
}
export const migrateRelationships = async ({
adapter,
collectionSlug,
db,
debug,
fields,
globalSlug,
isVersions,
pathsToQuery,
payload,
req,
tableName,
}: Args) => {
if (pathsToQuery.size === 0) return
let offset = 0
let paginationResult
const where = Array.from(pathsToQuery).reduce((statement, path, i) => {
return (statement += `
"${tableName}${adapter.relationshipsSuffix}"."path" LIKE '${path}'${pathsToQuery.size !== i + 1 ? ' OR' : ''}
`)
}, '')
while (typeof paginationResult === 'undefined' || paginationResult.rows.length > 0) {
const paginationStatement = `SELECT DISTINCT parent_id FROM ${tableName}${adapter.relationshipsSuffix} WHERE
${where} ORDER BY parent_id LIMIT 500 OFFSET ${offset * 500};
`
paginationResult = await adapter.drizzle.execute(sql.raw(`${paginationStatement}`))
if (paginationResult.rows.length === 0) return
offset += 1
const statement = `SELECT * FROM ${tableName}${adapter.relationshipsSuffix} WHERE
(${where}) AND parent_id IN (${paginationResult.rows.map((row) => row.parent_id).join(', ')});
`
if (debug) {
payload.logger.info('FINDING ROWS TO MIGRATE')
payload.logger.info(statement)
}
const result = await adapter.drizzle.execute(sql.raw(`${statement}`))
const docsToResave: DocsToResave = {}
result.rows.forEach((row) => {
const parentID = row.parent_id
if (typeof parentID === 'string' || typeof parentID === 'number') {
if (!docsToResave[parentID]) docsToResave[parentID] = []
docsToResave[parentID].push(row)
}
})
await fetchAndResave({
adapter,
collectionSlug,
db,
debug,
docsToResave,
fields,
globalSlug,
isVersions,
payload,
req,
tableName,
})
}
const deleteStatement = `DELETE FROM ${tableName}${adapter.relationshipsSuffix} WHERE ${where}`
if (debug) {
payload.logger.info('DELETING ROWS')
payload.logger.info(deleteStatement)
}
await db.execute(sql.raw(`${deleteStatement}`))
}

View File

@@ -0,0 +1,116 @@
import type { Payload } from 'payload'
import { type Field, tabHasName } from 'payload/types'
import toSnakeCase from 'to-snake-case'
import type { DrizzleTransaction, PostgresAdapter } from '../../types.js'
import type { PathsToQuery } from './types.js'
type Args = {
adapter: PostgresAdapter
collectionSlug?: string
columnPrefix: string
db: DrizzleTransaction
disableNotNull: boolean
fields: Field[]
globalSlug?: string
isVersions: boolean
newTableName: string
parentTableName: string
path: string
pathsToQuery: PathsToQuery
payload: Payload
rootTableName: string
}
export const traverseFields = (args: Args) => {
args.fields.forEach((field) => {
switch (field.type) {
case 'group': {
let newTableName = `${args.newTableName}_${toSnakeCase(field.name)}`
if (field.localized && args.payload.config.localization) {
newTableName += args.adapter.localesSuffix
}
return traverseFields({
...args,
columnPrefix: `${args.columnPrefix}${toSnakeCase(field.name)}_`,
fields: field.fields,
newTableName,
path: `${args.path ? `${args.path}.` : ''}${field.name}`,
})
}
case 'row':
case 'collapsible': {
return traverseFields({
...args,
fields: field.fields,
})
}
case 'array': {
const newTableName = args.adapter.tableNameMap.get(
`${args.newTableName}_${toSnakeCase(field.name)}`,
)
return traverseFields({
...args,
columnPrefix: '',
fields: field.fields,
newTableName,
parentTableName: newTableName,
path: `${args.path ? `${args.path}.` : ''}${field.name}.%`,
})
}
case 'blocks': {
return field.blocks.forEach((block) => {
const newTableName = args.adapter.tableNameMap.get(
`${args.rootTableName}_blocks_${toSnakeCase(block.slug)}`,
)
traverseFields({
...args,
columnPrefix: '',
fields: block.fields,
newTableName,
parentTableName: newTableName,
path: `${args.path ? `${args.path}.` : ''}${field.name}.%`,
})
})
}
case 'tabs': {
return field.tabs.forEach((tab) => {
if (tabHasName(tab)) {
args.columnPrefix = `${args.columnPrefix}_${toSnakeCase(tab.name)}_`
args.path = `${args.path ? `${args.path}.` : ''}${tab.name}`
args.newTableName = `${args.newTableName}_${toSnakeCase(tab.name)}`
if (tab.localized && args.payload.config.localization) {
args.newTableName += args.adapter.localesSuffix
}
}
traverseFields({
...args,
fields: tab.fields,
})
})
}
case 'relationship':
case 'upload': {
if (typeof field.relationTo === 'string') {
if (field.type === 'upload' || !field.hasMany) {
args.pathsToQuery.add(`${args.path ? `${args.path}.` : ''}${field.name}`)
}
}
return null
}
}
})
}

View File

@@ -0,0 +1,9 @@
/**
* Set of all paths which should be moved
* This will be built up into one WHERE query
*/
export type PathsToQuery = Set<string>
export type DocsToResave = {
[id: number | string]: Record<string, unknown>[]
}
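
To make the comment above concrete, here is a sketch of how a small `PathsToQuery` set becomes the single `WHERE` clause that `migrateRelationships` runs against the relationships table (the field paths and the `posts_rels` table name are hypothetical):

```ts
import type { PathsToQuery } from './types.js' // assumed relative import

const pathsToQuery: PathsToQuery = new Set(['category', 'hero.image', 'layout.%.author'])

// Mirrors the reduce() in migrateRelationships: one LIKE per path, OR-joined.
const where = Array.from(pathsToQuery).reduce((statement, path, i) => {
  return (statement += `
  "posts_rels"."path" LIKE '${path}'${pathsToQuery.size !== i + 1 ? ' OR' : ''}
`)
}, '')

// Roughly:
//   "posts_rels"."path" LIKE 'category' OR
//   "posts_rels"."path" LIKE 'hero.image' OR
//   "posts_rels"."path" LIKE 'layout.%.author'
```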

View File

@@ -2,14 +2,13 @@ import type { SQL } from 'drizzle-orm'
import type { Field, Where } from 'payload/types'
import type { GenericColumn, PostgresAdapter } from '../types.js'
import type { BuildQueryJoinAliases, BuildQueryJoins } from './buildQuery.js'
import type { BuildQueryJoinAliases } from './buildQuery.js'
import { parseParams } from './parseParams.js'
export async function buildAndOrConditions({
adapter,
fields,
joinAliases,
joins,
locale,
selectFields,
@@ -20,8 +19,7 @@ export async function buildAndOrConditions({
collectionSlug?: string
fields: Field[]
globalSlug?: string
joinAliases: BuildQueryJoinAliases
joins: BuildQueryJoins
joins: BuildQueryJoinAliases
locale?: string
selectFields: Record<string, GenericColumn>
tableName: string
@@ -38,7 +36,6 @@ export async function buildAndOrConditions({
const result = await parseParams({
adapter,
fields,
joinAliases,
joins,
locale,
selectFields,

View File

@@ -26,8 +26,7 @@ type BuildQueryArgs = {
}
type Result = {
joinAliases: BuildQueryJoinAliases
joins: BuildQueryJoins
joins: BuildQueryJoinAliases
orderBy: {
column: GenericColumn
order: typeof asc | typeof desc
@@ -46,8 +45,7 @@ const buildQuery = async function buildQuery({
const selectFields: Record<string, GenericColumn> = {
id: adapter.tables[tableName].id,
}
const joins: BuildQueryJoins = {}
const joinAliases: BuildQueryJoinAliases = []
const joins: BuildQueryJoinAliases = []
const orderBy: Result['orderBy'] = {
column: null,
@@ -70,7 +68,6 @@ const buildQuery = async function buildQuery({
adapter,
collectionPath: sortPath,
fields,
joinAliases,
joins,
locale,
pathSegments: sortPath.replace(/__/g, '.').split('.'),
@@ -105,7 +102,6 @@ const buildQuery = async function buildQuery({
where = await parseParams({
adapter,
fields,
joinAliases,
joins,
locale,
selectFields,
@@ -115,7 +111,6 @@ const buildQuery = async function buildQuery({
}
return {
joinAliases,
joins,
orderBy,
selectFields,

View File

@@ -12,7 +12,7 @@ import toSnakeCase from 'to-snake-case'
import { v4 as uuid } from 'uuid'
import type { GenericColumn, GenericTable, PostgresAdapter } from '../types.js'
import type { BuildQueryJoinAliases, BuildQueryJoins } from './buildQuery.js'
import type { BuildQueryJoinAliases } from './buildQuery.js'
type Constraint = {
columnName: string
@@ -38,8 +38,7 @@ type Args = {
constraintPath?: string
constraints?: Constraint[]
fields: (Field | TabAsField)[]
joinAliases: BuildQueryJoinAliases
joins: BuildQueryJoins
joins: BuildQueryJoinAliases
locale?: string
pathSegments: string[]
rootTableName?: string
@@ -67,7 +66,6 @@ export const getTableColumnFromPath = ({
constraintPath: incomingConstraintPath,
constraints = [],
fields,
joinAliases,
joins,
locale: incomingLocale,
pathSegments: incomingSegments,
@@ -129,7 +127,6 @@ export const getTableColumnFromPath = ({
...tab,
type: 'tab',
})),
joinAliases,
joins,
locale,
pathSegments: pathSegments.slice(1),
@@ -150,7 +147,6 @@ export const getTableColumnFromPath = ({
constraintPath: `${constraintPath}${field.name}.`,
constraints,
fields: field.fields,
joinAliases,
joins,
locale,
pathSegments: pathSegments.slice(1),
@@ -169,7 +165,6 @@ export const getTableColumnFromPath = ({
constraintPath,
constraints,
fields: field.fields,
joinAliases,
joins,
locale,
pathSegments: pathSegments.slice(1),
@@ -185,10 +180,10 @@ export const getTableColumnFromPath = ({
if (locale && field.localized && adapter.payload.config.localization) {
newTableName = `${tableName}${adapter.localesSuffix}`
joins[tableName] = eq(
adapter.tables[tableName].id,
adapter.tables[newTableName]._parentID,
)
joins.push({
condition: eq(adapter.tables[tableName].id, adapter.tables[newTableName]._parentID),
table: adapter.tables[newTableName],
})
if (locale !== 'all') {
constraints.push({
columnName: '_locale',
@@ -205,7 +200,6 @@ export const getTableColumnFromPath = ({
constraintPath: `${constraintPath}${field.name}.`,
constraints,
fields: field.fields,
joinAliases,
joins,
locale,
pathSegments: pathSegments.slice(1),
@@ -224,10 +218,13 @@ export const getTableColumnFromPath = ({
)
if (locale && field.localized && adapter.payload.config.localization) {
joins[newTableName] = and(
eq(adapter.tables[tableName].id, adapter.tables[newTableName].parent),
eq(adapter.tables[newTableName]._locale, locale),
)
joins.push({
condition: and(
eq(adapter.tables[tableName].id, adapter.tables[newTableName].parent),
eq(adapter.tables[newTableName]._locale, locale),
),
table: adapter.tables[newTableName],
})
if (locale !== 'all') {
constraints.push({
columnName: '_locale',
@@ -236,10 +233,10 @@ export const getTableColumnFromPath = ({
})
}
} else {
joins[newTableName] = eq(
adapter.tables[tableName].id,
adapter.tables[newTableName].parent,
)
joins.push({
condition: eq(adapter.tables[tableName].id, adapter.tables[newTableName].parent),
table: adapter.tables[newTableName],
})
}
return {
@@ -268,10 +265,10 @@ export const getTableColumnFromPath = ({
]
if (locale && field.localized && adapter.payload.config.localization) {
joins[newTableName] = and(
...joinConstraints,
eq(adapter.tables[newTableName]._locale, locale),
)
joins.push({
condition: and(...joinConstraints, eq(adapter.tables[newTableName]._locale, locale)),
table: adapter.tables[newTableName],
})
if (locale !== 'all') {
constraints.push({
columnName: 'locale',
@@ -280,7 +277,10 @@ export const getTableColumnFromPath = ({
})
}
} else {
joins[newTableName] = and(...joinConstraints)
joins.push({
condition: and(...joinConstraints),
table: adapter.tables[newTableName],
})
}
return {
@@ -300,10 +300,13 @@ export const getTableColumnFromPath = ({
constraintPath = `${constraintPath}${field.name}.%.`
if (locale && field.localized && adapter.payload.config.localization) {
joins[newTableName] = and(
eq(adapter.tables[tableName].id, adapter.tables[newTableName]._parentID),
eq(adapter.tables[newTableName]._locale, locale),
)
joins.push({
condition: and(
eq(adapter.tables[tableName].id, adapter.tables[newTableName]._parentID),
eq(adapter.tables[newTableName]._locale, locale),
),
table: adapter.tables[newTableName],
})
if (locale !== 'all') {
constraints.push({
columnName: '_locale',
@@ -312,10 +315,10 @@ export const getTableColumnFromPath = ({
})
}
} else {
joins[newTableName] = eq(
adapter.tables[tableName].id,
adapter.tables[newTableName]._parentID,
)
joins.push({
condition: eq(adapter.tables[tableName].id, adapter.tables[newTableName]._parentID),
table: adapter.tables[newTableName],
})
}
return getTableColumnFromPath({
adapter,
@@ -323,7 +326,6 @@ export const getTableColumnFromPath = ({
constraintPath,
constraints,
fields: field.fields,
joinAliases,
joins,
locale,
pathSegments: pathSegments.slice(1),
@@ -344,18 +346,19 @@ export const getTableColumnFromPath = ({
const blockTypes = Array.isArray(value) ? value : [value]
blockTypes.forEach((blockType) => {
const block = field.blocks.find((block) => block.slug === blockType)
newTableName = adapter.tableNameMap.get(
`${tableName}_blocks_${toSnakeCase(block.slug)}`,
)
const newAliasTableName = toSnakeCase(uuid())
const newAliasTable = alias(adapter.tables[newTableName], newAliasTableName)
joins[newTableName] = eq(
adapter.tables[tableName].id,
adapter.tables[newTableName]._parentID,
)
joins.push({
condition: eq(adapter.tables[tableName].id, newAliasTable._parentID),
table: newAliasTable,
})
constraints.push({
columnName: '_path',
table: adapter.tables[newTableName],
table: newAliasTable,
value: pathSegments[0],
})
})
@@ -381,7 +384,6 @@ export const getTableColumnFromPath = ({
constraintPath,
constraints: blockConstraints,
fields: block.fields,
joinAliases,
joins,
locale,
pathSegments: pathSegments.slice(1),
@@ -400,10 +402,16 @@ export const getTableColumnFromPath = ({
constraints = constraints.concat(blockConstraints)
selectFields = { ...selectFields, ...blockSelectFields }
if (field.localized && adapter.payload.config.localization) {
joins[newTableName] = and(
eq(adapter.tables[tableName].id, adapter.tables[newTableName]._parentID),
eq(adapter.tables[newTableName]._locale, locale),
)
joins.push({
condition: and(
eq(
(aliasTable || adapter.tables[tableName]).id,
adapter.tables[newTableName]._parentID,
),
eq(adapter.tables[newTableName]._locale, locale),
),
table: adapter.tables[newTableName],
})
if (locale) {
constraints.push({
columnName: '_locale',
@@ -412,10 +420,13 @@ export const getTableColumnFromPath = ({
})
}
} else {
joins[newTableName] = eq(
adapter.tables[tableName].id,
adapter.tables[newTableName]._parentID,
)
joins.push({
condition: eq(
(aliasTable || adapter.tables[tableName]).id,
adapter.tables[newTableName]._parentID,
),
table: adapter.tables[newTableName],
})
}
return true
})
@@ -434,116 +445,178 @@ export const getTableColumnFromPath = ({
case 'relationship':
case 'upload': {
let relationshipFields
const relationTableName = `${rootTableName}${adapter.relationshipsSuffix}`
const newCollectionPath = pathSegments.slice(1).join('.')
const aliasRelationshipTableName = uuid()
const aliasRelationshipTable = alias(
adapter.tables[relationTableName],
aliasRelationshipTableName,
)
if (Array.isArray(field.relationTo) || (field.type === 'relationship' && field.hasMany)) {
let relationshipFields
const relationTableName = `${rootTableName}${adapter.relationshipsSuffix}`
const aliasRelationshipTableName = uuid()
const aliasRelationshipTable = alias(
adapter.tables[relationTableName],
aliasRelationshipTableName,
)
// Join in the relationships table
if (locale && field.localized && adapter.payload.config.localization) {
joinAliases.push({
condition: and(
eq((aliasTable || adapter.tables[rootTableName]).id, aliasRelationshipTable.parent),
eq(aliasRelationshipTable.locale, locale),
like(aliasRelationshipTable.path, `${constraintPath}${field.name}`),
),
table: aliasRelationshipTable,
})
if (locale !== 'all') {
constraints.push({
columnName: 'locale',
// Join in the relationships table
if (locale && field.localized && adapter.payload.config.localization) {
joins.push({
condition: and(
eq((aliasTable || adapter.tables[rootTableName]).id, aliasRelationshipTable.parent),
eq(aliasRelationshipTable.locale, locale),
like(aliasRelationshipTable.path, `${constraintPath}${field.name}`),
),
table: aliasRelationshipTable,
})
if (locale !== 'all') {
constraints.push({
columnName: 'locale',
table: aliasRelationshipTable,
value: locale,
})
}
} else {
// Join in the relationships table
joins.push({
condition: and(
eq((aliasTable || adapter.tables[rootTableName]).id, aliasRelationshipTable.parent),
like(aliasRelationshipTable.path, `${constraintPath}${field.name}`),
),
table: aliasRelationshipTable,
value: locale,
})
}
} else {
// Join in the relationships table
joinAliases.push({
condition: and(
eq((aliasTable || adapter.tables[rootTableName]).id, aliasRelationshipTable.parent),
like(aliasRelationshipTable.path, `${constraintPath}${field.name}`),
),
table: aliasRelationshipTable,
})
}
selectFields[`${relationTableName}.path`] = aliasRelationshipTable.path
selectFields[`${relationTableName}.path`] = aliasRelationshipTable.path
let newAliasTable
let newAliasTable
if (typeof field.relationTo === 'string') {
const relationshipConfig = adapter.payload.collections[field.relationTo].config
if (typeof field.relationTo === 'string') {
const relationshipConfig = adapter.payload.collections[field.relationTo].config
newTableName = adapter.tableNameMap.get(toSnakeCase(relationshipConfig.slug))
newTableName = adapter.tableNameMap.get(toSnakeCase(relationshipConfig.slug))
// parent to relationship join table
relationshipFields = relationshipConfig.fields
// parent to relationship join table
relationshipFields = relationshipConfig.fields
newAliasTable = alias(adapter.tables[newTableName], toSnakeCase(uuid()))
newAliasTable = alias(adapter.tables[newTableName], toSnakeCase(uuid()))
joinAliases.push({
condition: eq(newAliasTable.id, aliasRelationshipTable[`${field.relationTo}ID`]),
table: newAliasTable,
})
joins.push({
condition: eq(newAliasTable.id, aliasRelationshipTable[`${field.relationTo}ID`]),
table: newAliasTable,
})
if (newCollectionPath === '' || newCollectionPath === 'id') {
if (newCollectionPath === '' || newCollectionPath === 'id') {
return {
columnName: `${field.relationTo}ID`,
constraints,
field,
table: aliasRelationshipTable,
}
}
} else if (newCollectionPath === 'value') {
const tableColumnsNames = field.relationTo.map((relationTo) => {
const relationTableName = adapter.tableNameMap.get(
toSnakeCase(adapter.payload.collections[relationTo].config.slug),
)
return `"${aliasRelationshipTableName}"."${relationTableName}_id"`
})
return {
columnName: `${field.relationTo}ID`,
constraints,
field,
rawColumn: sql.raw(`COALESCE(${tableColumnsNames.join(', ')})`),
table: aliasRelationshipTable,
}
}
} else if (newCollectionPath === 'value') {
const tableColumnsNames = field.relationTo.map((relationTo) => {
const relationTableName = adapter.tableNameMap.get(
toSnakeCase(adapter.payload.collections[relationTo].config.slug),
)
} else if (newCollectionPath === 'relationTo') {
const relationTo = Array.isArray(field.relationTo)
? field.relationTo
: [field.relationTo]
return `"${aliasRelationshipTableName}"."${relationTableName}_id"`
return {
constraints,
field,
getNotNullColumnByValue: (val) => {
const matchedRelation = relationTo.find((relation) => relation === val)
if (matchedRelation) return `${matchedRelation}ID`
return undefined
},
table: aliasRelationshipTable,
}
} else {
throw new APIError('Not supported')
}
return getTableColumnFromPath({
adapter,
aliasTable: newAliasTable,
collectionPath: newCollectionPath,
constraints,
fields: relationshipFields,
joins,
locale,
pathSegments: pathSegments.slice(1),
rootTableName: newTableName,
selectFields,
tableName: newTableName,
value,
})
return {
constraints,
field,
rawColumn: sql.raw(`COALESCE(${tableColumnsNames.join(', ')})`),
table: aliasRelationshipTable,
}
} else if (newCollectionPath === 'relationTo') {
const relationTo = Array.isArray(field.relationTo) ? field.relationTo : [field.relationTo]
} else if (
pathSegments.length > 1 &&
!(pathSegments.length === 2 && pathSegments[1] === 'id')
) {
// simple relationships
const columnName = `${columnPrefix}${field.name}`
const newTableName = adapter.tableNameMap.get(
toSnakeCase(adapter.payload.collections[field.relationTo].config.slug),
)
const aliasTableName = uuid()
const newAliasTable = alias(adapter.tables[newTableName], aliasTableName)
return {
constraints,
field,
getNotNullColumnByValue: (val) => {
const matchedRelation = relationTo.find((relation) => relation === val)
if (matchedRelation) return `${matchedRelation}ID`
return undefined
},
table: aliasRelationshipTable,
if (field.localized && adapter.payload.config.localization) {
const aliasLocaleTableName = uuid()
const aliasLocaleTable = alias(
adapter.tables[`${rootTableName}${adapter.localesSuffix}`],
aliasLocaleTableName,
)
joins.push({
condition: and(
eq(aliasLocaleTable._parentID, adapter.tables[rootTableName].id),
eq(aliasLocaleTable._locale, locale),
),
table: aliasLocaleTable,
})
joins.push({
condition: eq(aliasLocaleTable[columnName], newAliasTable.id),
table: newAliasTable,
})
} else {
joins.push({
condition: eq(
newAliasTable.id,
aliasTable ? aliasTable[columnName] : adapter.tables[tableName][columnName],
),
table: newAliasTable,
})
}
} else {
throw new APIError('Not supported')
return getTableColumnFromPath({
adapter,
aliasTable: newAliasTable,
collectionPath: newCollectionPath,
constraintPath: '',
constraints,
fields: adapter.payload.collections[field.relationTo].config.fields,
joins,
locale,
pathSegments: pathSegments.slice(1),
selectFields,
tableName: newTableName,
value,
})
}
break
}
return getTableColumnFromPath({
adapter,
aliasTable: newAliasTable,
collectionPath: newCollectionPath,
constraints,
fields: relationshipFields,
joinAliases,
joins,
locale,
pathSegments: pathSegments.slice(1),
rootTableName: newTableName,
selectFields,
tableName: newTableName,
value,
})
default: {
// fall through
break
}
}
@@ -551,11 +624,13 @@ export const getTableColumnFromPath = ({
if (field.localized && adapter.payload.config.localization) {
// If localized, we go to localized table and set aliasTable to undefined
// so it is not picked up below to be used as targetTable
const parentTable = aliasTable || adapter.tables[tableName]
newTableName = `${tableName}${adapter.localesSuffix}`
const parentTable = aliasTable || adapter.tables[tableName]
joins[newTableName] = eq(parentTable.id, adapter.tables[newTableName]._parentID)
joins.push({
condition: eq(parentTable.id, adapter.tables[newTableName]._parentID),
table: adapter.tables[newTableName],
})
aliasTable = undefined

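The hunks above route has-one relationship queries through a plain foreign-key column join instead of the `_rels` table. A minimal Drizzle sketch of that query pattern, using hypothetical `posts` and `categories` tables rather than Payload's generated schema:

```ts
import { eq } from 'drizzle-orm'
import { drizzle } from 'drizzle-orm/node-postgres'
import { integer, pgTable, serial, varchar } from 'drizzle-orm/pg-core'
import pg from 'pg'

// Hypothetical tables for illustration only.
const categories = pgTable('categories', {
  id: serial('id').primaryKey(),
  slug: varchar('slug'),
})

const posts = pgTable('posts', {
  id: serial('id').primaryKey(),
  title: varchar('title'),
  // the has-one relationship is stored as a simple foreign-key column
  category: integer('category_id').references(() => categories.id, { onDelete: 'set null' }),
})

const db = drizzle(new pg.Pool({ connectionString: process.env.DATABASE_URI }))

// Filtering posts by a field on the related row is a single left join on the
// foreign-key column, with no detour through a `posts_rels` path row.
export const postsInNews = () =>
  db
    .select({ id: posts.id, title: posts.title })
    .from(posts)
    .leftJoin(categories, eq(posts.category, categories.id))
    .where(eq(categories.slug, 'news'))
```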
View File

@@ -7,7 +7,7 @@ import { QueryError } from 'payload/errors'
import { validOperators } from 'payload/types'
import type { GenericColumn, PostgresAdapter } from '../types.js'
import type { BuildQueryJoinAliases, BuildQueryJoins } from './buildQuery.js'
import type { BuildQueryJoinAliases } from './buildQuery.js'
import { buildAndOrConditions } from './buildAndOrConditions.js'
import { convertPathToJSONTraversal } from './createJSONQuery/convertPathToJSONTraversal.js'
@@ -19,8 +19,7 @@ import { sanitizeQueryValue } from './sanitizeQueryValue.js'
type Args = {
adapter: PostgresAdapter
fields: Field[]
joinAliases: BuildQueryJoinAliases
joins: BuildQueryJoins
joins: BuildQueryJoinAliases
locale: string
selectFields: Record<string, GenericColumn>
tableName: string
@@ -30,7 +29,6 @@ type Args = {
export async function parseParams({
adapter,
fields,
joinAliases,
joins,
locale,
selectFields,
@@ -55,7 +53,6 @@ export async function parseParams({
const builtConditions = await buildAndOrConditions({
adapter,
fields,
joinAliases,
joins,
locale,
selectFields,
@@ -86,7 +83,6 @@ export async function parseParams({
adapter,
collectionPath: relationOrPath,
fields,
joinAliases,
joins,
locale,
pathSegments: relationOrPath.replace(/__/g, '.').split('.'),

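For the signature changes above: the separate `joins` object and `joinAliases` array appear to be folded into a single ordered list of `{ table, condition }` entries. A rough sketch of the shape implied by this diff (the `GenericTable` stand-in is a placeholder):

```ts
import type { SQL } from 'drizzle-orm'

// Placeholder for the adapter's own generic table type.
type GenericTable = Record<string, unknown>

// One flat list: each entry carries the (possibly aliased) table and its ON condition.
type BuildQueryJoinAliases = { condition: SQL; table: GenericTable }[]

const joins: BuildQueryJoinAliases = []

// Producers push joins while walking the field path...
export const addJoin = (table: GenericTable, condition: SQL): void => {
  joins.push({ condition, table })
}

// ...and consumers replay them in order as chained leftJoin calls.
export const applyJoins = (query: {
  leftJoin: (table: GenericTable, on: SQL) => unknown
}): void => {
  joins.forEach(({ condition, table }) => query.leftJoin(table, condition))
}
```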
View File

@@ -2,7 +2,7 @@ import type { QueryPromise, SQL } from 'drizzle-orm'
import type { ChainedMethods } from '../find/chainMethods.js'
import type { DrizzleDB, PostgresAdapter } from '../types.js'
import type { BuildQueryJoinAliases, BuildQueryJoins } from './buildQuery.js'
import type { BuildQueryJoinAliases } from './buildQuery.js'
import { chainMethods } from '../find/chainMethods.js'
import { type GenericColumn } from '../types.js'
@@ -11,8 +11,7 @@ type Args = {
adapter: PostgresAdapter
chainedMethods?: ChainedMethods
db: DrizzleDB
joinAliases: BuildQueryJoinAliases
joins: BuildQueryJoins
joins: BuildQueryJoinAliases
selectFields: Record<string, GenericColumn>
tableName: string
where: SQL
@@ -25,33 +24,23 @@ export const selectDistinct = ({
adapter,
chainedMethods = [],
db,
joinAliases,
joins,
selectFields,
tableName,
where,
}: Args): QueryPromise<Record<string, GenericColumn> & { id: number | string }[]> => {
if (Object.keys(joins).length > 0 || joinAliases.length > 0) {
if (Object.keys(joins).length > 0) {
if (where) {
chainedMethods.push({ args: [where], method: 'where' })
}
joinAliases.forEach(({ condition, table }) => {
joins.forEach(({ condition, table }) => {
chainedMethods.push({
args: [table, condition],
method: 'leftJoin',
})
})
Object.entries(joins).forEach(([joinTable, condition]) => {
if (joinTable) {
chainedMethods.push({
args: [adapter.tables[joinTable], condition],
method: 'leftJoin',
})
}
})
return chainMethods({
methods: chainedMethods,
query: db.selectDistinct(selectFields).from(adapter.tables[tableName]),

View File

@@ -34,17 +34,18 @@ export type BaseExtraConfig = Record<
(cols: GenericColumns) => ForeignKeyBuilder | IndexBuilder | UniqueConstraintBuilder
>
export type RelationMap = Map<string, { localized: boolean; target: string; type: 'many' | 'one' }>
type Args = {
adapter: PostgresAdapter
baseColumns?: Record<string, PgColumnBuilder>
baseExtraConfig?: BaseExtraConfig
buildNumbers?: boolean
buildRelationships?: boolean
buildTexts?: boolean
disableNotNull: boolean
disableUnique: boolean
fields: Field[]
rootRelationsToBuild?: Map<string, string>
rootRelationsToBuild?: RelationMap
rootRelationships?: Set<string>
rootTableIDColType?: string
rootTableName?: string
@@ -56,16 +57,13 @@ type Args = {
type Result = {
hasManyNumberField: 'index' | boolean
hasManyTextField: 'index' | boolean
relationsToBuild: Map<string, string>
relationsToBuild: RelationMap
}
export const buildTable = ({
adapter,
baseColumns = {},
baseExtraConfig = {},
buildNumbers,
buildRelationships,
buildTexts,
disableNotNull,
disableUnique = false,
fields,
@@ -77,6 +75,7 @@ export const buildTable = ({
timestamps,
versions,
}: Args): Result => {
const isRoot = !incomingRootTableName
const rootTableName = incomingRootTableName || tableName
const columns: Record<string, PgColumnBuilder> = baseColumns
const indexes: Record<string, (cols: GenericColumns) => IndexBuilder> = {}
@@ -93,7 +92,7 @@ export const buildTable = ({
let relationshipsTable: GenericTable | PgTableWithColumns<any>
// Drizzle relations
const relationsToBuild: Map<string, string> = new Map()
const relationsToBuild: RelationMap = new Map()
const idColType: IDType = setColumnID({ adapter, columns, fields })
@@ -106,9 +105,6 @@ export const buildTable = ({
hasManyTextField,
} = traverseFields({
adapter,
buildNumbers,
buildRelationships,
buildTexts,
columns,
disableNotNull,
disableUnique,
@@ -126,6 +122,15 @@ export const buildTable = ({
versions,
})
// split the relationsToBuild by localized and non-localized
const localizedRelations = new Map()
const nonLocalizedRelations = new Map()
relationsToBuild.forEach(({ type, localized, target }, key) => {
const map = localized ? localizedRelations : nonLocalizedRelations
map.set(key, { type, target })
})
if (timestamps) {
columns.createdAt = timestamp('created_at', {
mode: 'string',
@@ -159,7 +164,7 @@ export const buildTable = ({
adapter.tables[tableName] = table
if (hasLocalizedField) {
if (hasLocalizedField || localizedRelations.size) {
const localeTableName = `${tableName}${adapter.localesSuffix}`
localesColumns.id = serial('id').primaryKey()
localesColumns._locale = adapter.enums.enum__locales('_locale').notNull()
@@ -187,114 +192,134 @@ export const buildTable = ({
adapter.tables[localeTableName] = localesTable
const localesTableRelations = relations(localesTable, ({ one }) => ({
_parentID: one(table, {
adapter.relations[`relations_${localeTableName}`] = relations(localesTable, ({ many, one }) => {
const result: Record<string, Relation<string>> = {}
result._parentID = one(table, {
fields: [localesTable._parentID],
references: [table.id],
}),
}))
// name the relationship by what the many() relationName is
relationName: '_locales',
})
adapter.relations[`relations_${localeTableName}`] = localesTableRelations
localizedRelations.forEach(({ type, target }, key) => {
if (type === 'one') {
result[key] = one(adapter.tables[target], {
fields: [localesTable[key]],
references: [adapter.tables[target].id],
relationName: key,
})
}
if (type === 'many') {
result[key] = many(adapter.tables[target], {
relationName: key,
})
}
})
return result
})
}
if (hasManyTextField && buildTexts) {
const textsTableName = `${rootTableName}_texts`
const columns: Record<string, PgColumnBuilder> = {
id: serial('id').primaryKey(),
order: integer('order').notNull(),
parent: parentIDColumnMap[idColType]('parent_id').notNull(),
path: varchar('path').notNull(),
text: varchar('text'),
}
if (hasLocalizedManyTextField) {
columns.locale = adapter.enums.enum__locales('locale')
}
textsTable = adapter.pgSchema.table(textsTableName, columns, (cols) => {
const config: Record<string, ForeignKeyBuilder | IndexBuilder> = {
orderParentIdx: index(`${textsTableName}_order_parent_idx`).on(cols.order, cols.parent),
parentFk: foreignKey({
name: `${textsTableName}_parent_fk`,
columns: [cols.parent],
foreignColumns: [table.id],
}).onDelete('cascade'),
}
if (hasManyTextField === 'index') {
config.text_idx = index(`${textsTableName}_text_idx`).on(cols.text)
if (isRoot) {
if (hasManyTextField) {
const textsTableName = `${rootTableName}_texts`
const columns: Record<string, PgColumnBuilder> = {
id: serial('id').primaryKey(),
order: integer('order').notNull(),
parent: parentIDColumnMap[idColType]('parent_id').notNull(),
path: varchar('path').notNull(),
text: varchar('text'),
}
if (hasLocalizedManyTextField) {
config.localeParent = index(`${textsTableName}_locale_parent`).on(cols.locale, cols.parent)
columns.locale = adapter.enums.enum__locales('locale')
}
return config
})
textsTable = adapter.pgSchema.table(textsTableName, columns, (cols) => {
const config: Record<string, ForeignKeyBuilder | IndexBuilder> = {
orderParentIdx: index(`${textsTableName}_order_parent_idx`).on(cols.order, cols.parent),
parentFk: foreignKey({
name: `${textsTableName}_parent_fk`,
columns: [cols.parent],
foreignColumns: [table.id],
}).onDelete('cascade'),
}
adapter.tables[textsTableName] = textsTable
if (hasManyTextField === 'index') {
config.text_idx = index(`${textsTableName}_text_idx`).on(cols.text)
}
const textsTableRelations = relations(textsTable, ({ one }) => ({
parent: one(table, {
fields: [textsTable.parent],
references: [table.id],
}),
}))
if (hasLocalizedManyTextField) {
config.localeParent = index(`${textsTableName}_locale_parent`).on(
cols.locale,
cols.parent,
)
}
adapter.relations[`relations_${textsTableName}`] = textsTableRelations
}
return config
})
if (hasManyNumberField && buildNumbers) {
const numbersTableName = `${rootTableName}_numbers`
const columns: Record<string, PgColumnBuilder> = {
id: serial('id').primaryKey(),
number: numeric('number'),
order: integer('order').notNull(),
parent: parentIDColumnMap[idColType]('parent_id').notNull(),
path: varchar('path').notNull(),
adapter.tables[textsTableName] = textsTable
adapter.relations[`relations_${textsTableName}`] = relations(textsTable, ({ one }) => ({
parent: one(table, {
fields: [textsTable.parent],
references: [table.id],
relationName: '_texts',
}),
}))
}
if (hasLocalizedManyNumberField) {
columns.locale = adapter.enums.enum__locales('locale')
}
numbersTable = adapter.pgSchema.table(numbersTableName, columns, (cols) => {
const config: Record<string, ForeignKeyBuilder | IndexBuilder> = {
orderParentIdx: index(`${numbersTableName}_order_parent_idx`).on(cols.order, cols.parent),
parentFk: foreignKey({
name: `${numbersTableName}_parent_fk`,
columns: [cols.parent],
foreignColumns: [table.id],
}).onDelete('cascade'),
}
if (hasManyNumberField === 'index') {
config.numberIdx = index(`${numbersTableName}_number_idx`).on(cols.number)
if (hasManyNumberField) {
const numbersTableName = `${rootTableName}_numbers`
const columns: Record<string, PgColumnBuilder> = {
id: serial('id').primaryKey(),
number: numeric('number'),
order: integer('order').notNull(),
parent: parentIDColumnMap[idColType]('parent_id').notNull(),
path: varchar('path').notNull(),
}
if (hasLocalizedManyNumberField) {
config.localeParent = index(`${numbersTableName}_locale_parent`).on(
cols.locale,
cols.parent,
)
columns.locale = adapter.enums.enum__locales('locale')
}
return config
})
numbersTable = adapter.pgSchema.table(numbersTableName, columns, (cols) => {
const config: Record<string, ForeignKeyBuilder | IndexBuilder> = {
orderParentIdx: index(`${numbersTableName}_order_parent_idx`).on(cols.order, cols.parent),
parentFk: foreignKey({
name: `${numbersTableName}_parent_fk`,
columns: [cols.parent],
foreignColumns: [table.id],
}).onDelete('cascade'),
}
adapter.tables[numbersTableName] = numbersTable
if (hasManyNumberField === 'index') {
config.numberIdx = index(`${numbersTableName}_number_idx`).on(cols.number)
}
const numbersTableRelations = relations(numbersTable, ({ one }) => ({
parent: one(table, {
fields: [numbersTable.parent],
references: [table.id],
}),
}))
if (hasLocalizedManyNumberField) {
config.localeParent = index(`${numbersTableName}_locale_parent`).on(
cols.locale,
cols.parent,
)
}
adapter.relations[`relations_${numbersTableName}`] = numbersTableRelations
}
return config
})
adapter.tables[numbersTableName] = numbersTable
adapter.relations[`relations_${numbersTableName}`] = relations(numbersTable, ({ one }) => ({
parent: one(table, {
fields: [numbersTable.parent],
references: [table.id],
relationName: '_numbers',
}),
}))
}
if (buildRelationships) {
if (relationships.size) {
const relationshipColumns: Record<string, PgColumnBuilder> = {
id: serial('id').primaryKey(),
@@ -308,7 +333,6 @@ export const buildTable = ({
}
const relationExtraConfig: BaseExtraConfig = {}
const relationshipsTableName = `${tableName}${adapter.relationshipsSuffix}`
relationships.forEach((relationTo) => {
@@ -319,7 +343,6 @@ export const buildTable = ({
throwValidationError: true,
})
let colType = adapter.idType === 'uuid' ? 'uuid' : 'integer'
const relatedCollectionCustomIDType =
adapter.payload.collections[relationshipConfig.slug]?.customIDType
@@ -371,51 +394,63 @@ export const buildTable = ({
adapter.tables[relationshipsTableName] = relationshipsTable
const relationshipsTableRelations = relations(relationshipsTable, ({ one }) => {
const result: Record<string, Relation<string>> = {
parent: one(table, {
fields: [relationshipsTable.parent],
references: [table.id],
relationName: '_rels',
}),
}
adapter.relations[`relations_${relationshipsTableName}`] = relations(
relationshipsTable,
({ one }) => {
const result: Record<string, Relation<string>> = {
parent: one(table, {
fields: [relationshipsTable.parent],
references: [table.id],
relationName: '_rels',
}),
}
relationships.forEach((relationTo) => {
const relatedTableName = createTableName({
adapter,
config: adapter.payload.collections[relationTo].config,
throwValidationError: true,
relationships.forEach((relationTo) => {
const relatedTableName = createTableName({
adapter,
config: adapter.payload.collections[relationTo].config,
throwValidationError: true,
})
const idColumnName = `${relationTo}ID`
result[idColumnName] = one(adapter.tables[relatedTableName], {
fields: [relationshipsTable[idColumnName]],
references: [adapter.tables[relatedTableName].id],
relationName: relationTo,
})
})
const idColumnName = `${relationTo}ID`
result[idColumnName] = one(adapter.tables[relatedTableName], {
fields: [relationshipsTable[idColumnName]],
references: [adapter.tables[relatedTableName].id],
})
})
return result
})
adapter.relations[`relations_${relationshipsTableName}`] = relationshipsTableRelations
return result
},
)
}
}
const tableRelations = relations(table, ({ many }) => {
adapter.relations[`relations_${tableName}`] = relations(table, ({ many, one }) => {
const result: Record<string, Relation<string>> = {}
relationsToBuild.forEach((val, key) => {
result[key] = many(adapter.tables[val])
nonLocalizedRelations.forEach(({ type, target }, key) => {
if (type === 'one') {
result[key] = one(adapter.tables[target], {
fields: [table[key]],
references: [adapter.tables[target].id],
relationName: key,
})
}
if (type === 'many') {
result[key] = many(adapter.tables[target], { relationName: key })
}
})
if (hasLocalizedField) {
result._locales = many(localesTable)
result._locales = many(localesTable, { relationName: '_locales' })
}
if (hasManyTextField) {
result._texts = many(textsTable)
result._texts = many(textsTable, { relationName: '_texts' })
}
if (hasManyNumberField) {
result._numbers = many(numbersTable)
result._numbers = many(numbersTable, { relationName: '_numbers' })
}
if (relationships.size && relationshipsTable) {
@@ -427,7 +462,5 @@ export const buildTable = ({
return result
})
adapter.relations[`relations_${tableName}`] = tableRelations
return { hasManyNumberField, hasManyTextField, relationsToBuild }
}

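The `relationName` values added above are what let Drizzle pair each `one()` with its matching `many()` side unambiguously. A self-contained sketch of that pairing, again with hypothetical `posts` and `categories` tables rather than the generated schema:

```ts
import { relations } from 'drizzle-orm'
import { drizzle } from 'drizzle-orm/node-postgres'
import { integer, pgTable, serial, varchar } from 'drizzle-orm/pg-core'
import pg from 'pg'

const categories = pgTable('categories', {
  id: serial('id').primaryKey(),
  title: varchar('title'),
})

const posts = pgTable('posts', {
  id: serial('id').primaryKey(),
  title: varchar('title'),
  category: integer('category_id').references(() => categories.id),
})

// one() is declared on the table that owns the foreign key...
const postsRelations = relations(posts, ({ one }) => ({
  category: one(categories, {
    fields: [posts.category],
    references: [categories.id],
    relationName: 'category',
  }),
}))

// ...and the other side declares a many() with the same relationName.
const categoriesRelations = relations(categories, ({ many }) => ({
  posts: many(posts, { relationName: 'category' }),
}))

const db = drizzle(new pg.Pool({ connectionString: process.env.DATABASE_URI }), {
  schema: { categories, categoriesRelations, posts, postsRelations },
})

// With the relations registered, related rows can be populated in one relational query.
export const findPostsWithCategory = () =>
  db.query.posts.findMany({ with: { category: true } })
```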
View File

@@ -24,7 +24,7 @@ import { fieldAffectsData, optionIsObject } from 'payload/types'
import toSnakeCase from 'to-snake-case'
import type { GenericColumns, IDType, PostgresAdapter } from '../types.js'
import type { BaseExtraConfig } from './build.js'
import type { BaseExtraConfig, RelationMap } from './build.js'
import { hasLocalesTable } from '../utilities/hasLocalesTable.js'
import { buildTable } from './build.js'
@@ -36,9 +36,6 @@ import { validateExistingBlockIsIdentical } from './validateExistingBlockIsIdent
type Args = {
adapter: PostgresAdapter
buildNumbers: boolean
buildRelationships: boolean
buildTexts: boolean
columnPrefix?: string
columns: Record<string, PgColumnBuilder>
disableNotNull: boolean
@@ -51,9 +48,9 @@ type Args = {
localesIndexes: Record<string, (cols: GenericColumns) => IndexBuilder>
newTableName: string
parentTableName: string
relationsToBuild: Map<string, string>
relationsToBuild: RelationMap
relationships: Set<string>
rootRelationsToBuild?: Map<string, string>
rootRelationsToBuild?: RelationMap
rootTableIDColType: string
rootTableName: string
versions: boolean
@@ -70,9 +67,6 @@ type Result = {
export const traverseFields = ({
adapter,
buildNumbers,
buildRelationships,
buildTexts,
columnPrefix,
columns,
disableNotNull,
@@ -121,7 +115,13 @@ export const traverseFields = ({
// If field is localized,
// add the column to the locale table instead of main table
if (adapter.payload.config.localization && (field.localized || forceLocalized)) {
if (
adapter.payload.config.localization &&
(field.localized || forceLocalized) &&
field.type !== 'array' &&
field.type !== 'blocks' &&
(('hasMany' in field && field.hasMany !== true) || !('hasMany' in field))
) {
hasLocalizedField = true
targetTable = localesColumns
targetIndexes = localesIndexes
@@ -250,6 +250,7 @@ export const traverseFields = ({
parentTableName: newTableName,
prefix: `${newTableName}_`,
throwValidationError,
versionsCustomName: versions,
})
const baseColumns: Record<string, PgColumnBuilder> = {
order: integer('order').notNull(),
@@ -264,7 +265,7 @@ export const traverseFields = ({
name: `${selectTableName}_parent_fk`,
columns: [cols.parent],
foreignColumns: [adapter.tables[parentTableName].id],
}),
}).onDelete('cascade'),
parentIdx: (cols) => index(`${selectTableName}_parent_idx`).on(cols.parent),
}
@@ -285,24 +286,28 @@ export const traverseFields = ({
disableNotNull,
disableUnique,
fields: [],
rootTableName,
tableName: selectTableName,
versions,
})
relationsToBuild.set(fieldName, selectTableName)
relationsToBuild.set(fieldName, {
type: 'many',
// selects have their own localized table, independent of the base table.
localized: false,
target: selectTableName,
})
const selectTableRelations = relations(adapter.tables[selectTableName], ({ one }) => {
const result: Record<string, Relation<string>> = {
adapter.relations[`relations_${selectTableName}`] = relations(
adapter.tables[selectTableName],
({ one }) => ({
parent: one(adapter.tables[parentTableName], {
fields: [adapter.tables[selectTableName].parent],
references: [adapter.tables[parentTableName].id],
relationName: fieldName,
}),
}
return result
})
adapter.relations[`relation_${selectTableName}`] = selectTableRelations
}),
)
} else {
targetTable[fieldName] = adapter.enums[enumName](fieldName)
}
@@ -376,28 +381,49 @@ export const traverseFields = ({
hasManyNumberField = subHasManyNumberField
}
relationsToBuild.set(fieldName, arrayTableName)
const arrayTableRelations = relations(adapter.tables[arrayTableName], ({ many, one }) => {
const result: Record<string, Relation<string>> = {
_parentID: one(adapter.tables[parentTableName], {
fields: [adapter.tables[arrayTableName]._parentID],
references: [adapter.tables[parentTableName].id],
}),
}
if (hasLocalesTable(field.fields)) {
result._locales = many(adapter.tables[`${arrayTableName}${adapter.localesSuffix}`])
}
subRelationsToBuild.forEach((val, key) => {
result[key] = many(adapter.tables[val])
})
return result
relationsToBuild.set(fieldName, {
type: 'many',
// arrays have their own localized table, independent of the base table.
localized: false,
target: arrayTableName,
})
adapter.relations[`relations_${arrayTableName}`] = arrayTableRelations
adapter.relations[`relations_${arrayTableName}`] = relations(
adapter.tables[arrayTableName],
({ many, one }) => {
const result: Record<string, Relation<string>> = {
_parentID: one(adapter.tables[parentTableName], {
fields: [adapter.tables[arrayTableName]._parentID],
references: [adapter.tables[parentTableName].id],
relationName: fieldName,
}),
}
if (hasLocalesTable(field.fields)) {
result._locales = many(adapter.tables[`${arrayTableName}${adapter.localesSuffix}`], {
relationName: '_locales',
})
}
subRelationsToBuild.forEach(({ type, localized, target }, key) => {
if (type === 'one') {
const arrayWithLocalized = localized
? `${arrayTableName}${adapter.localesSuffix}`
: arrayTableName
result[key] = one(adapter.tables[target], {
fields: [adapter.tables[arrayWithLocalized][key]],
references: [adapter.tables[target].id],
relationName: key,
})
}
if (type === 'many') {
result[key] = many(adapter.tables[target], { relationName: key })
}
})
return result
},
)
break
}
@@ -468,31 +494,43 @@ export const traverseFields = ({
hasManyNumberField = subHasManyNumberField
}
const blockTableRelations = relations(
adapter.relations[`relations_${blockTableName}`] = relations(
adapter.tables[blockTableName],
({ many, one }) => {
const result: Record<string, Relation<string>> = {
_parentID: one(adapter.tables[rootTableName], {
fields: [adapter.tables[blockTableName]._parentID],
references: [adapter.tables[rootTableName].id],
relationName: `_blocks_${block.slug}`,
}),
}
if (hasLocalesTable(block.fields)) {
result._locales = many(
adapter.tables[`${blockTableName}${adapter.localesSuffix}`],
{ relationName: '_locales' },
)
}
subRelationsToBuild.forEach((val, key) => {
result[key] = many(adapter.tables[val])
subRelationsToBuild.forEach(({ type, localized, target }, key) => {
if (type === 'one') {
const blockWithLocalized = localized
? `${blockTableName}${adapter.localesSuffix}`
: blockTableName
result[key] = one(adapter.tables[target], {
fields: [adapter.tables[blockWithLocalized][key]],
references: [adapter.tables[target].id],
relationName: key,
})
}
if (type === 'many') {
result[key] = many(adapter.tables[target], { relationName: key })
}
})
return result
},
)
adapter.relations[`relations_${blockTableName}`] = blockTableRelations
} else if (process.env.NODE_ENV !== 'production' && !versions) {
validateExistingBlockIsIdentical({
block,
@@ -502,7 +540,13 @@ export const traverseFields = ({
tableLocales: adapter.tables[`${blockTableName}${adapter.localesSuffix}`],
})
}
rootRelationsToBuild.set(`_blocks_${block.slug}`, blockTableName)
// blocks relationships are defined from the collection or globals table down to the block, bypassing any subBlocks
rootRelationsToBuild.set(`_blocks_${block.slug}`, {
type: 'many',
// blocks are not localized on the parent table
localized: false,
target: blockTableName,
})
})
break
@@ -520,9 +564,6 @@ export const traverseFields = ({
hasManyTextField: groupHasManyTextField,
} = traverseFields({
adapter,
buildNumbers,
buildRelationships,
buildTexts,
columnPrefix,
columns,
disableNotNull,
@@ -563,9 +604,6 @@ export const traverseFields = ({
hasManyTextField: groupHasManyTextField,
} = traverseFields({
adapter,
buildNumbers,
buildRelationships,
buildTexts,
columnPrefix: `${columnName}_`,
columns,
disableNotNull: disableNotNullFromHere,
@@ -607,9 +645,6 @@ export const traverseFields = ({
hasManyTextField: tabHasManyTextField,
} = traverseFields({
adapter,
buildNumbers,
buildRelationships,
buildTexts,
columnPrefix,
columns,
disableNotNull: disableNotNullFromHere,
@@ -651,9 +686,6 @@ export const traverseFields = ({
hasManyTextField: rowHasManyTextField,
} = traverseFields({
adapter,
buildNumbers,
buildRelationships,
buildTexts,
columnPrefix,
columns,
disableNotNull: disableNotNullFromHere,
@@ -687,13 +719,45 @@ export const traverseFields = ({
case 'upload':
if (Array.isArray(field.relationTo)) {
field.relationTo.forEach((relation) => relationships.add(relation))
} else {
} else if (field.type === 'relationship' && field.hasMany) {
relationships.add(field.relationTo)
}
} else {
// simple relationships get a column on the targetTable with a foreign key to the relationTo table
const relationshipConfig = adapter.payload.collections[field.relationTo].config
if (field.localized && adapter.payload.config.localization) {
const tableName = adapter.tableNameMap.get(toSnakeCase(field.relationTo))
// get the id type of the related collection
let colType = adapter.idType === 'uuid' ? 'uuid' : 'integer'
const relatedCollectionCustomID = relationshipConfig.fields.find(
(field) => fieldAffectsData(field) && field.name === 'id',
)
if (relatedCollectionCustomID?.type === 'number') colType = 'numeric'
if (relatedCollectionCustomID?.type === 'text') colType = 'varchar'
// make the foreign key column for relationship using the correct id column type
targetTable[fieldName] = parentIDColumnMap[colType](`${columnName}_id`).references(
() => adapter.tables[tableName].id,
{ onDelete: 'set null' },
)
// add relationship to table
relationsToBuild.set(fieldName, {
type: 'one',
localized: adapter.payload.config.localization && field.localized,
target: tableName,
})
// add notNull when the field is required and notNull is not disabled
if (!disableNotNull && field.required && !field.admin?.condition) {
targetTable[fieldName].notNull()
}
break
}
if (adapter.payload.config.localization && field.localized) {
hasLocalizedRelationshipField = true
}
break
default:

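One detail from the relationship branch above: when a simple relationship is localized, its foreign-key column is added to the `_locales` table rather than the main table, so each locale can point at a different related row. A rough illustration, with plain `text` standing in for the adapter's locale enum:

```ts
import { integer, pgTable, serial, text, varchar } from 'drizzle-orm/pg-core'

export const categories = pgTable('categories', {
  id: serial('id').primaryKey(),
  title: varchar('title'),
})

export const posts = pgTable('posts', {
  id: serial('id').primaryKey(),
  title: varchar('title'),
})

// Localized has-one: one locales row per locale, each carrying its own category_id.
export const postsLocales = pgTable('posts_locales', {
  id: serial('id').primaryKey(),
  _locale: text('_locale').notNull(),
  _parentID: integer('_parent_id')
    .references(() => posts.id, { onDelete: 'cascade' })
    .notNull(),
  category: integer('category_id').references(() => categories.id, { onDelete: 'set null' }),
})
```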
View File

@@ -2,11 +2,14 @@
import type { SanitizedConfig } from 'payload/config'
import type { Field, TypeWithID } from 'payload/types'
import type { PostgresAdapter } from '../../types.js'
import { createBlocksMap } from '../../utilities/createBlocksMap.js'
import { createPathMap } from '../../utilities/createRelationshipMap.js'
import { traverseFields } from './traverseFields.js'
type TransformArgs = {
adapter: PostgresAdapter
config: SanitizedConfig
data: Record<string, unknown>
fallbackLocale?: false | string
@@ -16,7 +19,12 @@ type TransformArgs = {
// This is the entry point to transform Drizzle output data
// into the shape Payload expects based on field schema
export const transform = <T extends TypeWithID>({ config, data, fields }: TransformArgs): T => {
export const transform = <T extends TypeWithID>({
adapter,
config,
data,
fields,
}: TransformArgs): T => {
let relationships: Record<string, Record<string, unknown>[]> = {}
let texts: Record<string, Record<string, unknown>[]> = {}
let numbers: Record<string, Record<string, unknown>[]> = {}
@@ -40,6 +48,7 @@ export const transform = <T extends TypeWithID>({ config, data, fields }: Transf
const deletions = []
const result = traverseFields<T>({
adapter,
blocks,
config,
dataRef: {

View File

@@ -30,10 +30,6 @@ export const transformRelationship = ({ field, locale, ref, relations }: Args) =
value: matchedRelation[1],
}
}
} else {
// Handle hasOne
const relatedData = relation[`${field.relationTo}ID`]
result = relatedData
}
}
} else {

View File

@@ -4,6 +4,7 @@ import type { Field, TabAsField } from 'payload/types'
import { fieldAffectsData } from 'payload/types'
import type { PostgresAdapter } from '../../types.js'
import type { BlocksMap } from '../../utilities/createBlocksMap.js'
import { transformHasManyNumber } from './hasManyNumber.js'
@@ -11,6 +12,10 @@ import { transformHasManyText } from './hasManyText.js'
import { transformRelationship } from './relationship.js'
type TraverseFieldsArgs = {
/**
* The DB adapter
*/
adapter: PostgresAdapter
/**
* Pre-formatted blocks map
*/
@@ -60,6 +65,7 @@ type TraverseFieldsArgs = {
// Traverse fields recursively, transforming data
// for each field type into required Payload shape
export const traverseFields = <T extends Record<string, unknown>>({
adapter,
blocks,
config,
dataRef,
@@ -77,6 +83,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
const formatted = fields.reduce((result, field) => {
if (field.type === 'tabs') {
traverseFields({
adapter,
blocks,
config,
dataRef,
@@ -97,6 +104,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
(field.type === 'tab' && !('name' in field))
) {
traverseFields({
adapter,
blocks,
config,
dataRef,
@@ -114,6 +122,11 @@ export const traverseFields = <T extends Record<string, unknown>>({
if (fieldAffectsData(field)) {
const fieldName = `${fieldPrefix || ''}${field.name}`
const fieldData = table[fieldName]
const localizedFieldData = {}
const valuesToTransform: {
ref: Record<string, unknown>
table: Record<string, unknown>
}[] = []
if (fieldPrefix) {
deletions.push(() => delete table[fieldName])
@@ -134,6 +147,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
}
const rowResult = traverseFields<T>({
adapter,
blocks,
config,
dataRef: data,
@@ -168,6 +182,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
}
return traverseFields<T>({
adapter,
blocks,
config,
dataRef: row,
@@ -212,6 +227,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
if (block) {
const blockResult = traverseFields<T>({
adapter,
blocks,
config,
dataRef: row,
@@ -243,6 +259,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
if (block) {
return traverseFields<T>({
adapter,
blocks,
config,
dataRef: row,
@@ -266,49 +283,63 @@ export const traverseFields = <T extends Record<string, unknown>>({
}
if (field.type === 'relationship' || field.type === 'upload') {
const relationPathMatch = relationships[`${sanitizedPath}${field.name}`]
if (!relationPathMatch) {
if ('hasMany' in field && field.hasMany) {
if (field.localized && config.localization && config.localization.locales) {
result[field.name] = {
[config.localization.defaultLocale]: [],
if (typeof field.relationTo === 'string' && !('hasMany' in field && field.hasMany)) {
if (
field.localized &&
config.localization &&
config.localization.locales &&
Array.isArray(table?._locales)
) {
table._locales.forEach((localeRow) => {
result[field.name] = { [localeRow._locale]: localeRow[fieldName] }
})
} else {
valuesToTransform.push({ ref: result, table })
}
} else {
const relationPathMatch = relationships[`${sanitizedPath}${field.name}`]
if (!relationPathMatch) {
if ('hasMany' in field && field.hasMany) {
if (field.localized && config.localization && config.localization.locales) {
result[field.name] = {
[config.localization.defaultLocale]: [],
}
} else {
result[field.name] = []
}
} else {
result[field.name] = []
}
return result
}
return result
}
if (field.localized) {
result[field.name] = {}
const relationsByLocale: Record<string, Record<string, unknown>[]> = {}
if (field.localized) {
result[field.name] = {}
const relationsByLocale: Record<string, Record<string, unknown>[]> = {}
relationPathMatch.forEach((row) => {
if (typeof row.locale === 'string') {
if (!relationsByLocale[row.locale]) relationsByLocale[row.locale] = []
relationsByLocale[row.locale].push(row)
}
})
relationPathMatch.forEach((row) => {
if (typeof row.locale === 'string') {
if (!relationsByLocale[row.locale]) relationsByLocale[row.locale] = []
relationsByLocale[row.locale].push(row)
}
})
Object.entries(relationsByLocale).forEach(([locale, relations]) => {
Object.entries(relationsByLocale).forEach(([locale, relations]) => {
transformRelationship({
field,
locale,
ref: result,
relations,
})
})
} else {
transformRelationship({
field,
locale,
ref: result,
relations,
relations: relationPathMatch,
})
})
} else {
transformRelationship({
field,
ref: result,
relations: relationPathMatch,
})
}
return result
}
return result
}
if (field.type === 'text' && field?.hasMany) {
@@ -397,12 +428,6 @@ export const traverseFields = <T extends Record<string, unknown>>({
return result
}
const localizedFieldData = {}
const valuesToTransform: {
ref: Record<string, unknown>
table: Record<string, unknown>
}[] = []
if (field.localized && Array.isArray(table._locales)) {
table._locales.forEach((localeRow) => {
valuesToTransform.push({ ref: localizedFieldData, table: localeRow })
@@ -414,6 +439,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
valuesToTransform.forEach(({ ref, table }) => {
const fieldData = table[`${fieldPrefix || ''}${field.name}`]
const locale = table?._locale
let val = fieldData
switch (field.type) {
case 'tab':
@@ -428,6 +454,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
Object.entries(ref).forEach(([groupLocale, groupLocaleData]) => {
ref[groupLocale] = traverseFields<Record<string, unknown>>({
adapter,
blocks,
config,
dataRef: groupLocaleData as Record<string, unknown>,
@@ -448,6 +475,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
const groupData = {}
ref[field.name] = traverseFields<Record<string, unknown>>({
adapter,
blocks,
config,
dataRef: groupData as Record<string, unknown>,
@@ -465,65 +493,55 @@ export const traverseFields = <T extends Record<string, unknown>>({
}
}
break
return
}
case 'text': {
let val = fieldData
if (typeof fieldData === 'string') {
val = String(fieldData)
}
if (typeof locale === 'string') {
ref[locale] = val
} else {
result[field.name] = val
}
break
}
case 'number': {
let val = fieldData
if (typeof fieldData === 'string') {
val = Number.parseFloat(fieldData)
}
if (typeof locale === 'string') {
ref[locale] = val
} else {
result[field.name] = val
}
break
}
case 'date': {
let val = fieldData
if (typeof fieldData === 'string') {
val = new Date(fieldData).toISOString()
}
if (typeof locale === 'string') {
ref[locale] = val
} else {
result[field.name] = val
break
}
case 'relationship':
case 'upload': {
if (
val &&
typeof field.relationTo === 'string' &&
adapter.payload.collections[field.relationTo].customIDType === 'number'
) {
val = Number(val)
}
break
}
default: {
if (typeof locale === 'string') {
ref[locale] = fieldData
} else {
result[field.name] = fieldData
}
break
}
}
if (typeof locale === 'string') {
ref[locale] = val
} else {
result[field.name] = val
}
})
if (Object.keys(localizedFieldData).length > 0) {

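Read-side, the practical effect of the hunks above is that a has-one value arrives as a plain `*_id` column on the row (or on a `_locales` row) and only needs to be renamed and, for numeric custom IDs, cast. A simplified stand-alone sketch of that step, not the adapter's actual implementation:

```ts
type RawPostRow = {
  id: number
  title: string | null
  category_id: number | string | null
  _locales?: { _locale: string; category_id: number | string | null }[]
}

// Illustrative only: rename the column and coerce numeric custom IDs.
export const transformHasOne = (row: RawPostRow, customIDType?: 'number' | 'text') => {
  const coerce = (val: number | string | null) =>
    val !== null && customIDType === 'number' ? Number(val) : val

  if (Array.isArray(row._locales)) {
    // Localized: build a { [locale]: id } map from the locales rows.
    const category: Record<string, number | string | null> = {}
    row._locales.forEach((localeRow) => {
      category[localeRow._locale] = coerce(localeRow.category_id)
    })
    return { id: row.id, title: row.title, category }
  }

  return { id: row.id, title: row.title, category: coerce(row.category_id) }
}

// transformHasOne({ id: 1, title: 'Hello', category_id: '5' }, 'number')
// -> { id: 1, title: 'Hello', category: 5 }
```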
View File

@@ -354,7 +354,10 @@ export const traverseFields = ({
if (field.type === 'relationship' || field.type === 'upload') {
const relationshipPath = `${path || ''}${field.name}`
if (field.localized) {
if (
field.localized &&
(Array.isArray(field.relationTo) || ('hasMany' in field && field.hasMany))
) {
if (typeof fieldData === 'object') {
Object.entries(fieldData).forEach(([localeKey, localeData]) => {
if (localeData === null) {
@@ -376,7 +379,8 @@ export const traverseFields = ({
})
})
}
} else {
return
} else if (Array.isArray(field.relationTo) || ('hasMany' in field && field.hasMany)) {
if (fieldData === null || (Array.isArray(fieldData) && fieldData.length === 0)) {
relationshipsToDelete.push({ path: relationshipPath })
return
@@ -390,9 +394,30 @@ export const traverseFields = ({
field,
relationships,
})
return
} else {
if (
!field.localized &&
fieldData &&
typeof fieldData === 'object' &&
'id' in fieldData &&
fieldData?.id
) {
fieldData = fieldData.id
} else if (field.localized) {
if (typeof fieldData === 'object') {
Object.entries(fieldData).forEach(([localeKey, localeData]) => {
if (typeof localeData === 'object') {
if (localeData && 'id' in localeData && localeData?.id) {
fieldData[localeKey] = localeData.id
}
} else {
fieldData[localeKey] = localeData
}
})
}
}
}
return
}
if (field.type === 'text' && field.hasMany) {

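On the write side, the mirror-image step shown above reduces whatever the API received for a has-one field (an ID, a populated document, or a locale-keyed object of either) down to the bare ID that belongs in the `*_id` column. A simplified sketch, separate from the adapter's real traversal:

```ts
type HasOneInput =
  | null
  | number
  | string
  | { id: number | string; [key: string]: unknown }
  | Record<string, null | number | string | { id: number | string }>

// Illustrative only: collapse populated docs (and localized objects) to IDs.
export const normalizeHasOne = (value: HasOneInput, localized = false) => {
  if (value === null || typeof value !== 'object') return value

  if (!localized) {
    // A populated document: store just its id.
    return 'id' in value ? (value as { id: number | string }).id : value
  }

  // Localized: { en: { id: 3 }, de: 7 } -> { en: 3, de: 7 }
  const result: Record<string, unknown> = {}
  Object.entries(value).forEach(([locale, localeValue]) => {
    result[locale] =
      localeValue && typeof localeValue === 'object' && 'id' in localeValue
        ? localeValue.id
        : localeValue
  })
  return result
}

// normalizeHasOne({ id: 3, title: 'News' }) -> 3
```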
View File

@@ -18,7 +18,7 @@ export const updateOne: UpdateOne = async function updateOne(
const whereToUse = whereArg || { id: { equals: id } }
let idToUpdate = id
const { joinAliases, joins, selectFields, where } = await buildQuery({
const { joins, selectFields, where } = await buildQuery({
adapter: this,
fields: collection.fields,
locale,
@@ -30,7 +30,6 @@ export const updateOne: UpdateOne = async function updateOne(
adapter: this,
chainedMethods: [{ args: [1], method: 'limit' }],
db,
joinAliases,
joins,
selectFields,
tableName,

View File

@@ -20,6 +20,7 @@ export const upsertRow = async <T extends TypeWithID>({
data,
db,
fields,
ignoreResult,
operation,
path = '',
req,
@@ -323,6 +324,8 @@ export const upsertRow = async <T extends TypeWithID>({
: error
}
if (ignoreResult) return data as T
// //////////////////////////////////
// RETRIEVE NEWLY UPDATED ROW
// //////////////////////////////////
@@ -343,6 +346,7 @@ export const upsertRow = async <T extends TypeWithID>({
// //////////////////////////////////
const result = transform<T>({
adapter,
config: adapter.payload.config,
data: doc,
fields,

View File

@@ -8,6 +8,11 @@ type BaseArgs = {
data: Record<string, unknown>
db: DrizzleDB
fields: Field[]
/**
* When true, skips reading the data back from the database and returns the input data
* @default false
*/
ignoreResult?: boolean
path?: string
req: PayloadRequestWithData
tableName: string

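The new `ignoreResult` flag documented above short-circuits the read-back query that normally follows a write, handing the caller its input data instead. A generic illustration of that trade-off (not the adapter's actual helper):

```ts
type UpsertArgs<T> = {
  data: T
  /** When true, skip the post-write read and return the input data. */
  ignoreResult?: boolean
  readBack: () => Promise<T>
  write: (data: T) => Promise<void>
}

// Illustrative pattern only: callers that do not need the transformed result
// (e.g. fire-and-forget nested writes) can avoid one extra SELECT per upsert.
export const upsert = async <T>({
  data,
  ignoreResult = false,
  readBack,
  write,
}: UpsertArgs<T>): Promise<T> => {
  await write(data)
  if (ignoreResult) return data
  return readBack()
}
```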
View File

@@ -7,6 +7,15 @@
"syntax": "typescript",
"tsx": true,
"dts": true
},
"transform": {
"react": {
"runtime": "automatic",
"pragmaFrag": "React.Fragment",
"throwIfNamespace": true,
"development": false,
"useBuiltins": true
}
}
},
"module": {

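For reference, `runtime: "automatic"` switches SWC to the React 17+ JSX transform, so emitted files import the JSX factory from `react/jsx-runtime` instead of relying on a `React` identifier being in scope. A trivial component that compiles cleanly under that setting:

```tsx
// No `import React from 'react'` is needed with the automatic JSX runtime.
export const Badge = ({ label }: { label: string }) => <span className="badge">{label}</span>
```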
View File

@@ -9,7 +9,7 @@ It abstracts all of the email functionality that was in Payload by default in 2.
## Installation
```sh
pnpm add @payloadcms/email-nodemailer` nodemailer
pnpm add @payloadcms/email-nodemailer nodemailer
```
## Usage

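The usage section itself is not included in this hunk. For orientation only, configuration would look roughly like the following; the `nodemailerAdapter` export and its option names are assumptions here, so check them against the package before use:

```ts
import { nodemailerAdapter } from '@payloadcms/email-nodemailer'

// Option names are assumed, not taken from this diff.
export const email = nodemailerAdapter({
  defaultFromAddress: 'info@example.com',
  defaultFromName: 'Example',
  transportOptions: {
    auth: {
      pass: process.env.SMTP_PASS,
      user: process.env.SMTP_USER,
    },
    host: process.env.SMTP_HOST,
    port: 587,
  },
})
```

The resulting adapter would then be passed as the `email` property of the Payload config.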
View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/email-nodemailer",
"version": "3.0.0-beta.33",
"version": "3.0.0-beta.39",
"description": "Payload Nodemailer Email Adapter",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -7,6 +7,15 @@
"syntax": "typescript",
"tsx": true,
"dts": true
},
"transform": {
"react": {
"runtime": "automatic",
"pragmaFrag": "React.Fragment",
"throwIfNamespace": true,
"development": false,
"useBuiltins": true
}
}
},
"module": {

View File

@@ -11,6 +11,15 @@
"syntax": "typescript",
"tsx": true,
"dts": true
},
"transform": {
"react": {
"runtime": "automatic",
"pragmaFrag": "React.Fragment",
"throwIfNamespace": true,
"development": false,
"useBuiltins": true
}
}
},
"module": {

View File

@@ -5,7 +5,7 @@ This adapter allows you to send emails using the [Resend](https://resend.com) RE
## Installation
```sh
pnpm add @payloadcms/email-resend`
pnpm add @payloadcms/email-resend
```
## Usage

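The same caveat applies here: the `resendAdapter` export and options below are assumptions for orientation, not taken from this diff:

```ts
import { resendAdapter } from '@payloadcms/email-resend'

// Option names are assumed, not taken from this diff.
export const email = resendAdapter({
  apiKey: process.env.RESEND_API_KEY || '',
  defaultFromAddress: 'info@example.com',
  defaultFromName: 'Example',
})
```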
View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/email-resend",
"version": "3.0.0-beta.33",
"version": "3.0.0-beta.39",
"description": "Payload Resend Email Adapter",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/graphql",
"version": "3.0.0-beta.33",
"version": "3.0.0-beta.39",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",

View File

@@ -0,0 +1,23 @@
MIT License
Copyright (c) 2017 Ivo Meißner
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
[Package Link](https://github.com/slicknode/graphql-query-complexity)

View File

@@ -0,0 +1,23 @@
The MIT License (MIT)
Copyright (c) 2016 Jimmy Jia
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
[Package Link](https://github.com/taion/graphql-type-json/tree/master)

View File

@@ -1,5 +1,5 @@
import { GraphQLScalarType } from 'graphql'
import { Kind, print } from 'graphql/language'
import { Kind, print } from 'graphql/language/index.js'
function identity(value) {
return value

View File

@@ -7,6 +7,15 @@
"syntax": "typescript",
"tsx": true,
"dts": true
},
"transform": {
"react": {
"runtime": "automatic",
"pragmaFrag": "React.Fragment",
"throwIfNamespace": true,
"development": false,
"useBuiltins": true
}
}
},
"module": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview-react",
"version": "3.0.0-beta.33",
"version": "3.0.0-beta.39",
"description": "The official live preview React SDK for Payload",
"homepage": "https://payloadcms.com",
"repository": {
@@ -35,12 +35,13 @@
},
"devDependencies": {
"@payloadcms/eslint-config": "workspace:*",
"@types/react": "18.3.2",
"@types/react": "npm:types-react@19.0.0-beta.2",
"@types/react-dom": "npm:types-react-dom@19.0.0-beta.2",
"payload": "workspace:*"
},
"peerDependencies": {
"react": "^18.2.0 || ^19.0.0",
"react-dom": "^18.2.0 || ^19.0.0"
"react": "^19.0.0 || ^19.0.0-rc-f994737d14-20240522",
"react-dom": "^19.0.0 || ^19.0.0-rc-f994737d14-20240522"
},
"publishConfig": {
"exports": {
@@ -53,5 +54,9 @@
"main": "./dist/index.js",
"registry": "https://registry.npmjs.org/",
"types": "./dist/index.d.ts"
},
"overrides": {
"@types/react": "npm:types-react@19.0.0-beta.2",
"@types/react-dom": "npm:types-react-dom@19.0.0-beta.2"
}
}

View File

@@ -6,7 +6,7 @@
"emitDeclarationOnly": true,
"outDir": "./dist" /* Specify an output folder for all emitted files. */,
"rootDir": "./src" /* Specify the root folder within your source files. */,
"jsx": "react"
"jsx": "react-jsx"
},
"exclude": [
"dist",

View File

@@ -1,37 +0,0 @@
/** @type {import('prettier').Config} */
module.exports = {
extends: ['@payloadcms'],
overrides: [
{
extends: ['plugin:@typescript-eslint/disable-type-checked'],
files: ['*.js', '*.cjs', '*.json', '*.md', '*.yml', '*.yaml'],
},
{
files: ['package.json', 'tsconfig.json'],
rules: {
'perfectionist/sort-array-includes': 'off',
'perfectionist/sort-astro-attributes': 'off',
'perfectionist/sort-classes': 'off',
'perfectionist/sort-enums': 'off',
'perfectionist/sort-exports': 'off',
'perfectionist/sort-imports': 'off',
'perfectionist/sort-interfaces': 'off',
'perfectionist/sort-jsx-props': 'off',
'perfectionist/sort-keys': 'off',
'perfectionist/sort-maps': 'off',
'perfectionist/sort-named-exports': 'off',
'perfectionist/sort-named-imports': 'off',
'perfectionist/sort-object-types': 'off',
'perfectionist/sort-objects': 'off',
'perfectionist/sort-svelte-attributes': 'off',
'perfectionist/sort-union-types': 'off',
'perfectionist/sort-vue-attributes': 'off',
},
},
],
parserOptions: {
project: ['./tsconfig.json'],
tsconfigRootDir: __dirname,
},
root: true,
}

View File

@@ -6,7 +6,7 @@
"emitDeclarationOnly": true,
"outDir": "./dist" /* Specify an output folder for all emitted files. */,
"rootDir": "./src" /* Specify the root folder within your source files. */,
"jsx": "react"
"jsx": "react-jsx"
},
"exclude": [
"dist",

View File

@@ -7,6 +7,15 @@
"syntax": "typescript",
"tsx": true,
"dts": true
},
"transform": {
"react": {
"runtime": "automatic",
"pragmaFrag": "React.Fragment",
"throwIfNamespace": true,
"development": false,
"useBuiltins": true
}
}
},
"module": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview",
"version": "3.0.0-beta.33",
"version": "3.0.0-beta.39",
"description": "The official live preview JavaScript SDK for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -8,6 +8,15 @@
"tsx": true,
"dts": true
},
"transform": {
"react": {
"runtime": "automatic",
"pragmaFrag": "React.Fragment",
"throwIfNamespace": true,
"development": false,
"useBuiltins": true
}
},
"experimental": {
"plugins": [
[

View File

@@ -7,6 +7,15 @@
"syntax": "typescript",
"tsx": true,
"dts": true
},
"transform": {
"react": {
"runtime": "automatic",
"pragmaFrag": "React.Fragment",
"throwIfNamespace": true,
"development": false,
"useBuiltins": true
}
}
},
"module": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/next",
"version": "3.0.0-beta.33",
"version": "3.0.0-beta.39",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",
@@ -49,7 +49,7 @@
"@types/busboy": "^1.5.3",
"busboy": "^1.6.0",
"deep-equal": "2.2.2",
"file-type": "19.0.0",
"file-type": "19.0.0 || 19.0.0-rc-f994737d14-20240522",
"graphql-http": "^1.22.0",
"graphql-playground-html": "1.6.30",
"http-status": "1.6.2",
@@ -63,8 +63,8 @@
"devDependencies": {
"@next/eslint-plugin-next": "^14.1.0",
"@payloadcms/eslint-config": "workspace:*",
"@types/react": "18.3.2",
"@types/react-dom": "18.3.0",
"@types/react": "npm:types-react@19.0.0-beta.2",
"@types/react-dom": "npm:types-react-dom@19.0.0-beta.2",
"@types/ws": "^8.5.10",
"css-loader": "^6.10.0",
"css-minimizer-webpack-plugin": "^6.0.0",
@@ -81,7 +81,7 @@
},
"peerDependencies": {
"graphql": "^16.8.1",
"next": "^14.3.0-canary.68",
"next": "^15.0.0-rc.0",
"payload": "workspace:*"
},
"engines": {
@@ -107,5 +107,9 @@
"main": "./dist/index.js",
"registry": "https://registry.npmjs.org/",
"types": "./dist/index.d.ts"
},
"overrides": {
"@types/react": "npm:types-react@19.0.0-beta.2",
"@types/react-dom": "npm:types-react-dom@19.0.0-beta.2"
}
}

View File

@@ -1,6 +1,5 @@
export { addDataAndFileToRequest } from '../utilities/addDataAndFileToRequest.js'
export { addLocalesToRequestFromData, sanitizeLocales } from '../utilities/addLocalesToRequest.js'
export { traverseFields } from '../utilities/buildFieldSchemaMap/traverseFields.js'
export { createPayloadRequest } from '../utilities/createPayloadRequest.js'
export { getNextRequestI18n } from '../utilities/getNextRequestI18n.js'
export { getPayloadHMR, reload } from '../utilities/getPayloadHMR.js'

View File

@@ -1,3 +1,4 @@
export { EditView } from '../views/Edit/index.js'
export { DefaultEditView as EditView } from '../views/Edit/Default/index.js'
export { DefaultListView as ListView } from '../views/List/Default/index.js'
export { NotFoundPage } from '../views/NotFound/index.js'
export { type GenerateViewMetadata, RootPage, generatePageMetadata } from '../views/Root/index.js'

View File

@@ -152,7 +152,7 @@ type FetchAPIFileUpload = (args: {
request: Request
}) => Promise<FetchAPIFileUploadResponse>
export const fetchAPIFileUpload: FetchAPIFileUpload = async ({ options, request }) => {
const uploadOptions = { ...DEFAULT_OPTIONS, ...options }
const uploadOptions: FetchAPIFileUploadOptions = { ...DEFAULT_OPTIONS, ...options }
if (!isEligibleRequest(request)) {
debugLog(uploadOptions, 'Request is not eligible for file upload!')
return {

View File

@@ -15,6 +15,7 @@ import 'react-toastify/dist/ReactToastify.css'
import { getPayloadHMR } from '../../utilities/getPayloadHMR.js'
import { getRequestLanguage } from '../../utilities/getRequestLanguage.js'
import { getRequestTheme } from '../../utilities/getRequestTheme.js'
import { DefaultEditView } from '../../views/Edit/Default/index.js'
import { DefaultListView } from '../../views/List/Default/index.js'
@@ -49,12 +50,20 @@ export const RootLayout = async ({
headers,
})
const theme = getRequestTheme({
config,
cookies,
headers,
})
const payload = await getPayloadHMR({ config })
const i18n: I18nClient = await initI18n({
config: config.i18n,
context: 'client',
language: languageCode,
})
const clientConfig = await createClientConfig({ config, t: i18n.t })
const dir = (rtlLanguages as unknown as AcceptedLanguages[]).includes(languageCode)
@@ -94,7 +103,7 @@ export const RootLayout = async ({
})
return (
<html className={merriweather.variable} dir={dir} lang={languageCode}>
<html className={merriweather.variable} data-theme={theme} dir={dir} lang={languageCode}>
<body>
<RootProvider
componentMap={componentMap}
@@ -105,6 +114,7 @@ export const RootLayout = async ({
languageOptions={languageOptions}
// eslint-disable-next-line react/jsx-no-bind
switchLanguageServerAction={switchLanguageServerAction}
theme={theme}
translations={i18n.translations}
>
{wrappedChildren}

View File

@@ -6,15 +6,29 @@ import type { BaseRouteHandler } from '../types.js'
import { headersWithCors } from '../../../utilities/headersWithCors.js'
export const access: BaseRouteHandler = async ({ req }) => {
const results = await accessOperation({
const headers = headersWithCors({
headers: new Headers(),
req,
})
return Response.json(results, {
headers: headersWithCors({
headers: new Headers(),
try {
const results = await accessOperation({
req,
}),
status: httpStatus.OK,
})
})
return Response.json(results, {
headers,
status: httpStatus.OK,
})
} catch (e: unknown) {
return Response.json(
{
error: e,
},
{
headers,
status: httpStatus.INTERNAL_SERVER_ERROR,
},
)
}
}

View File

@@ -1,33 +1,11 @@
import type { BuildFormStateArgs } from '@payloadcms/ui/forms/buildStateFromSchema'
import type { DocumentPreferences, Field, PayloadRequestWithData, TypeWithID } from 'payload/types'
import type { PayloadRequestWithData } from 'payload/types'
import { buildStateFromSchema } from '@payloadcms/ui/forms/buildStateFromSchema'
import { reduceFieldsToValues } from '@payloadcms/ui/utilities/reduceFieldsToValues'
import { buildFormState as buildFormStateFn } from '@payloadcms/ui/utilities/buildFormState'
import httpStatus from 'http-status'
import type { FieldSchemaMap } from '../../utilities/buildFieldSchemaMap/types.js'
import { buildFieldSchemaMap } from '../../utilities/buildFieldSchemaMap/index.js'
import { headersWithCors } from '../../utilities/headersWithCors.js'
import { routeError } from './routeError.js'
let cached = global._payload_fieldSchemaMap
if (!cached) {
// eslint-disable-next-line no-multi-assign
cached = global._payload_fieldSchemaMap = null
}
export const getFieldSchemaMap = (req: PayloadRequestWithData): FieldSchemaMap => {
if (cached && process.env.NODE_ENV !== 'development') {
return cached
}
cached = buildFieldSchemaMap(req)
return cached
}
export const buildFormState = async ({ req }: { req: PayloadRequestWithData }) => {
const headers = headersWithCors({
headers: new Headers(),
@@ -35,72 +13,17 @@ export const buildFormState = async ({ req }: { req: PayloadRequestWithData }) =
})
try {
const reqData: BuildFormStateArgs = req.data as BuildFormStateArgs
const { collectionSlug, formState, globalSlug, locale, operation, schemaPath } = reqData
const result = await buildFormStateFn({ req })
const incomingUserSlug = req.user?.collection
const adminUserSlug = req.payload.config.admin.user
// If we have a user slug, test it against the functions
if (incomingUserSlug) {
const adminAccessFunction = req.payload.collections[incomingUserSlug].config.access?.admin
// Run the admin access function from the config if it exists
if (adminAccessFunction) {
const canAccessAdmin = await adminAccessFunction({ req })
if (!canAccessAdmin) {
return Response.json(null, {
headers,
status: httpStatus.UNAUTHORIZED,
})
}
// Match the user collection to the global admin config
} else if (adminUserSlug !== incomingUserSlug) {
return Response.json(null, {
headers,
status: httpStatus.UNAUTHORIZED,
})
}
} else {
const hasUsers = await req.payload.find({
collection: adminUserSlug,
depth: 0,
limit: 1,
pagination: false,
})
// If there are users, we should not allow access because of /create-first-user
if (hasUsers.docs.length) {
return Response.json(null, {
headers,
status: httpStatus.UNAUTHORIZED,
})
}
}
const fieldSchemaMap = getFieldSchemaMap(req)
const id = collectionSlug ? reqData.id : undefined
const schemaPathSegments = schemaPath.split('.')
let fieldSchema: Field[]
if (schemaPathSegments.length === 1) {
if (req.payload.collections[schemaPath]) {
fieldSchema = req.payload.collections[schemaPath].config.fields
} else {
fieldSchema = req.payload.config.globals.find(
(global) => global.slug === schemaPath,
)?.fields
}
} else if (fieldSchemaMap.has(schemaPath)) {
fieldSchema = fieldSchemaMap.get(schemaPath)
}
if (!fieldSchema) {
return Response.json(result, {
headers,
status: httpStatus.OK,
})
} catch (err) {
if (err.message === 'Could not find field schema for given path') {
return Response.json(
{
message: 'Could not find field schema for given path',
message: err.message,
},
{
headers,
@@ -109,126 +32,13 @@ export const buildFormState = async ({ req }: { req: PayloadRequestWithData }) =
)
}
let docPreferences = reqData.docPreferences
let data = reqData.data
const promises: {
data?: Promise<void>
preferences?: Promise<void>
} = {}
// If the request does not include doc preferences,
// we should fetch them. This is useful for DocumentInfoProvider
// as it reduces the amount of client-side fetches necessary
// when we fetch data for the Edit view
if (!docPreferences) {
let preferencesKey
if (collectionSlug && id) {
preferencesKey = `collection-${collectionSlug}-${id}`
}
if (globalSlug) {
preferencesKey = `global-${globalSlug}`
}
if (preferencesKey) {
const fetchPreferences = async () => {
const preferencesResult = (await req.payload.find({
collection: 'payload-preferences',
depth: 0,
limit: 1,
where: {
key: {
equals: preferencesKey,
},
},
})) as unknown as { docs: { value: DocumentPreferences }[] }
if (preferencesResult?.docs?.[0]?.value) docPreferences = preferencesResult.docs[0].value
}
promises.preferences = fetchPreferences()
}
if (err.message === 'Unauthorized') {
return Response.json(null, {
headers,
status: httpStatus.UNAUTHORIZED,
})
}
// If there is a form state,
// then we can deduce data from that form state
if (formState) data = reduceFieldsToValues(formState, true)
// If we do not have data at this point,
// we can fetch it. This is useful for DocumentInfoProvider
// to reduce the amount of fetches required
if (!data) {
const fetchData = async () => {
let resolvedData: TypeWithID
if (collectionSlug && id) {
resolvedData = await req.payload.findByID({
id,
collection: collectionSlug,
depth: 0,
draft: true,
fallbackLocale: null,
locale,
overrideAccess: false,
user: req.user,
})
}
if (globalSlug && schemaPath === globalSlug) {
resolvedData = await req.payload.findGlobal({
slug: globalSlug,
depth: 0,
draft: true,
fallbackLocale: null,
locale,
overrideAccess: false,
user: req.user,
})
}
data = resolvedData
}
promises.data = fetchData()
}
if (Object.keys(promises).length > 0) {
await Promise.all(Object.values(promises))
}
const result = await buildStateFromSchema({
id,
data,
fieldSchema,
operation,
preferences: docPreferences || { fields: {} },
req,
})
// Maintain form state of auth / upload fields
if (collectionSlug && formState) {
if (req.payload.collections[collectionSlug]?.config?.upload && formState.file) {
result.file = formState.file
}
if (
req.payload.collections[collectionSlug]?.config?.auth &&
!req.payload.collections[collectionSlug].config.auth.disableLocalStrategy
) {
if (formState.password) result.password = formState.password
if (formState['confirm-password'])
result['confirm-password'] = formState['confirm-password']
if (formState.email) result.email = formState.email
}
}
return Response.json(result, {
headers,
status: httpStatus.OK,
})
} catch (err) {
req.payload.logger.error({ err, msg: `There was an error building form state` })
return routeError({

View File
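Taken together, the hunks above slim this route handler down to a thin wrapper around buildFormStateFn. A rough sketch of the resulting shape, reconstructed only from the lines shown here; the CORS header arguments, the 400 status for the field-schema error, and the routeError arguments are assumptions, since the hunks are truncated:

```ts
// Sketch only: reconstructed from the hunks above, not the verbatim source.
// Uses the imports shown at the top of this file's hunk
// (buildFormStateFn, httpStatus, headersWithCors, routeError).
export const buildFormState = async ({ req }: { req: PayloadRequestWithData }) => {
  const headers = headersWithCors({
    headers: new Headers(),
    req, // assumed: the original call is cut off at the hunk boundary
  })

  try {
    // Schema resolution, preference fetching, and state building now live in buildFormStateFn.
    const result = await buildFormStateFn({ req })

    return Response.json(result, {
      headers,
      status: httpStatus.OK,
    })
  } catch (err) {
    if (err.message === 'Could not find field schema for given path') {
      return Response.json(
        { message: err.message },
        { headers, status: httpStatus.BAD_REQUEST }, // status assumed; truncated above
      )
    }

    if (err.message === 'Unauthorized') {
      return Response.json(null, {
        headers,
        status: httpStatus.UNAUTHORIZED,
      })
    }

    req.payload.logger.error({ err, msg: `There was an error building form state` })
    return routeError({ err, req }) // argument shape assumed; truncated above
  }
}
```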

@@ -37,7 +37,9 @@ export const reload = async (config: SanitizedConfig, payload: Payload): Promise
// TODO: support HMR for other props in the future (see payload/src/index init()) that may change on the Payload singleton
await payload.db.init()
if (payload.db.connect) {
await payload.db.connect({ hotReload: true })
}
}
export const getPayloadHMR = async (options: InitOptions): Promise<Payload> => {

View File
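The only change here is that the connect call is now guarded, presumably because connect is optional on the database adapter interface. A minimal sketch of that pattern in isolation; the Adapter type below is illustrative, not Payload's actual adapter type:

```ts
// Illustrative only: call an adapter method during hot reload only if the adapter defines it.
type Adapter = {
  connect?: (args: { hotReload: boolean }) => Promise<void>
  init: () => Promise<void>
}

const reloadDatabase = async (db: Adapter): Promise<void> => {
  await db.init()

  if (db.connect) {
    await db.connect({ hotReload: true })
  }
}
```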

@@ -18,17 +18,19 @@ export const getRequestLanguage = ({
}: GetRequestLanguageArgs): AcceptedLanguages => {
const supportedLanguageKeys = <AcceptedLanguages[]>Object.keys(config.i18n.supportedLanguages)
const langCookie = cookies.get(`${config.cookiePrefix || 'payload'}-lng`)
const languageFromCookie: AcceptedLanguages = (
typeof langCookie === 'string' ? langCookie : langCookie?.value
) as AcceptedLanguages
if (languageFromCookie && supportedLanguageKeys.includes(languageFromCookie)) {
return languageFromCookie
}
const languageFromHeader = headers.get('Accept-Language')
? extractHeaderLanguage(headers.get('Accept-Language'))
: undefined
if (languageFromHeader && supportedLanguageKeys.includes(languageFromHeader)) {
return languageFromHeader
}

View File
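The reordering above means the Accept-Language header is only consulted when the language cookie is missing or not in the supported set. A stripped-down illustration of that precedence; the two-language setup and resolver below are stand-ins, not Payload's API:

```ts
// Illustrative only: cookie first, then the Accept-Language header.
const supportedLanguages = ['en', 'es']

const resolveLanguage = (cookieValue?: string, headerValue?: string): string | undefined => {
  if (cookieValue && supportedLanguages.includes(cookieValue)) {
    return cookieValue
  }
  if (headerValue && supportedLanguages.includes(headerValue)) {
    return headerValue
  }
  // The real function's fallback is outside the hunk shown above.
  return undefined
}

resolveLanguage('es', 'en') // 'es' (a supported cookie wins)
resolveLanguage('fr', 'es') // 'es' (unsupported cookie, so the header is used)
```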

@@ -0,0 +1,33 @@
import type { Theme } from '@payloadcms/ui/providers/Theme'
import type { ReadonlyRequestCookies } from 'next/dist/server/web/spec-extension/adapters/request-cookies.js'
import type { SanitizedConfig } from 'payload/config'
import { defaultTheme } from '@payloadcms/ui/providers/Theme'
type GetRequestLanguageArgs = {
config: SanitizedConfig
cookies: Map<string, string> | ReadonlyRequestCookies
headers: Request['headers']
}
const acceptedThemes: Theme[] = ['dark', 'light']
export const getRequestTheme = ({ config, cookies, headers }: GetRequestLanguageArgs): Theme => {
const themeCookie = cookies.get(`${config.cookiePrefix || 'payload'}-theme`)
const themeFromCookie: Theme = (
typeof themeCookie === 'string' ? themeCookie : themeCookie?.value
) as Theme
if (themeFromCookie && acceptedThemes.includes(themeFromCookie)) {
return themeFromCookie
}
const themeFromHeader = headers.get('Sec-CH-Prefers-Color-Scheme') as Theme
if (themeFromHeader && acceptedThemes.includes(themeFromHeader)) {
return themeFromHeader
}
return defaultTheme
}

View File
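Since getRequestTheme is a new file, a short usage sketch may help. It assumes a SanitizedConfig named config with cookiePrefix 'payload', and uses a plain Map for the cookie store, which the args type above allows:

```ts
// Illustrative calls, assuming `config.cookiePrefix` is 'payload'.
declare const config: SanitizedConfig // as imported above from 'payload/config'

const cookies = new Map<string, string>([['payload-theme', 'dark']])
const headers = new Headers({ 'Sec-CH-Prefers-Color-Scheme': 'light' })

getRequestTheme({ config, cookies, headers }) // 'dark': a valid theme cookie wins

getRequestTheme({ config, cookies: new Map(), headers }) // 'light': falls back to the client hint

getRequestTheme({ config, cookies: new Map(), headers: new Headers() }) // defaultTheme
```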

@@ -47,11 +47,26 @@ export const initPage = async ({
language,
})
const languageOptions = Object.entries(payload.config.i18n.supportedLanguages || {}).reduce(
(acc, [language, languageConfig]) => {
if (Object.keys(payload.config.i18n.supportedLanguages).includes(language)) {
acc.push({
label: languageConfig.translations.general.thisLanguage,
value: language,
})
}
return acc
},
[],
)
const req = await createLocalReq(
{
fallbackLocale: null,
locale: locale.code,
req: {
host: headers.get('host'),
i18n,
query: qs.parse(queryString, {
depth: 10,
@@ -97,6 +112,7 @@ export const initPage = async ({
cookies,
docID,
globalConfig,
languageOptions,
locale,
permissions,
req,

View File
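The new languageOptions value passed through initPage is just a label/value array derived from config.i18n.supportedLanguages. With English and Spanish configured it would look roughly like this; the labels come from each language pack's translations.general.thisLanguage string, so the exact strings depend on the packs in use:

```ts
// Approximate shape of `languageOptions` for a config supporting English and Spanish.
const languageOptions: { label: string; value: string }[] = [
  { label: 'English', value: 'en' },
  { label: 'Español', value: 'es' },
]
```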

@@ -1,7 +1,6 @@
import type { SanitizedConfig } from 'payload/types'
const authRouteKeys: (keyof SanitizedConfig['admin']['routes'])[] = [
'account',
'createFirstUser',
'forgot',
'login',

Some files were not shown because too many files have changed in this diff.