Compare commits


33 Commits

Author SHA1 Message Date
Elliot DeNolf
e5d6cdae38 chore(release): db-postgres/0.8.9 [skip ci] 2024-10-18 15:37:39 -04:00
Elliot DeNolf
218f2ead03 chore(release): payload/2.30.3 [skip ci] 2024-10-18 15:36:55 -04:00
Sasha
e9c1222182 fix(db-postgres): migrate:create errors with previous schemas (#8786)
Fixes https://github.com/payloadcms/payload/issues/8782
2024-10-18 14:04:23 -04:00
Sasha
c8ed6454a7 fix: duplicate with select hasMany fields (#8734)
Fixes https://github.com/payloadcms/payload/issues/6522 by not sending
the `id` of the _current_ document in the `post` / `patch` payload. Sending it
caused issues with Postgres and select fields with `hasMany: true`.
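A minimal sketch of the idea behind the fix (illustrative only — the endpoint and collection are hypothetical, and this is not the actual admin duplicate logic): the source document's `id` is stripped before the duplicate is POSTed, so no identifier from the current document is reused.

```ts
// Hypothetical sketch: duplicate a document without forwarding its id.
const duplicateDoc = async (doc: Record<string, unknown>): Promise<unknown> => {
  const { id, ...data } = doc // drop the current document's id
  const res = await fetch('/api/posts', {
    // '/api/posts' is a made-up endpoint for illustration
    method: 'POST',
    credentials: 'include',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(data),
  })
  return res.json()
}
```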
2024-10-17 16:31:39 -04:00
Elliot DeNolf
4077598777 chore(release): richtext-lexical/0.11.4 [skip ci] 2024-10-17 09:18:09 -04:00
Elliot DeNolf
65d7d54ba3 chore(release): db-postgres/0.8.8 [skip ci] 2024-10-17 09:17:59 -04:00
Elliot DeNolf
6690c37c4e chore(release): payload/2.30.2 [skip ci] 2024-10-17 09:16:23 -04:00
Sasha
0efc610210 fix(db-postgres): select hasMany nested to array + tab/group (#8739) 2024-10-16 21:57:44 -04:00
Jarrod Flesch
cc99c3a619 chore: improves getLatestCollectionVersion where constraints (#8747) 2024-10-16 14:12:38 -04:00
Elliot DeNolf
24a8dc7aa3 ci: disable nissuer until can be reworked 2024-10-16 09:04:24 -04:00
Elliot DeNolf
90764efa9a ci: auto-close issues without repro, auto-label (#8725)
Implements Nissuer to auto-close issues without a valid reproduction and
to auto-label them based on the issue form selections.

**NOTE:** This does not exempt Payload team members from having a valid
reproduction link.
2024-10-15 23:37:24 -04:00
Sasha
d05e3b0411 fix(db-postgres): build indexes for relationships (#8446)
Fixes https://github.com/payloadcms/payload/issues/8413 for 2.0; builds
indexes for `_rels` tables by default.
Does not port `unique: true` from
https://github.com/payloadcms/payload/pull/8432 because it could be a
breaking change if someone has inconsistent unique data in the database.
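For illustration only (the table and column names are hypothetical, not the schema Payload actually generates), a relationships table with indexes built by default could look like this in drizzle-orm:

```ts
import { index, integer, pgTable, serial, varchar } from 'drizzle-orm/pg-core'

// Hypothetical `_rels` table sketch with default indexes on its key columns.
export const postsRels = pgTable(
  'posts_rels',
  {
    id: serial('id').primaryKey(),
    order: integer('order'),
    parent: integer('parent_id').notNull(),
    path: varchar('path').notNull(),
  },
  (table) => ({
    orderIdx: index('posts_rels_order_idx').on(table.order),
    parentIdx: index('posts_rels_parent_idx').on(table.parent),
    pathIdx: index('posts_rels_path_idx').on(table.path),
  }),
)
```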
2024-10-10 15:26:54 -04:00
Germán Jabloñski
e4bc281fc2 chore: add instructions to run the examples to the readme (#8622) 2024-10-10 09:50:36 -04:00
Patrik
9d05b82dc6 fix: calculates correct aspect ratio dimensions on sharp based files (#8510)
Fixes #8317 

Sharp-based images are auto-oriented based on their EXIF data, i.e. via
`.rotate()`.

This can be problematic when resizing images, as the
`originalAspectRatio` calculation we do in the `imageResizer` can become
incorrect if the file's dimensions are rotated by sharp.

For example, uploading an iOS image with dimensions of 3024 x
4032 will be auto-rotated to 4032 x 3024 because the EXIF data gives the
image an orientation of `6` - which means it needs to be rotated 90
degrees clockwise.

As a result, the original aspect ratio goes from being `0.75` to
`1.3333` - which is incorrect.

This PR preserves the original aspect ratio so images are resized
based on the original dimensions - not the sharp-rotated dimensions.
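As a rough sketch of the underlying idea (not the PR's actual `imageResizer` code), the aspect ratio can be computed consistently by taking the EXIF orientation into account when reading dimensions with sharp:

```ts
import sharp from 'sharp'

// Sketch: EXIF orientations 5-8 imply a 90°/270° rotation, so the stored
// width/height are swapped relative to how the image is displayed.
const getUprightAspectRatio = async (file: Buffer): Promise<number> => {
  const { width = 0, height = 0, orientation = 1 } = await sharp(file).metadata()
  const swapped = orientation >= 5
  const [w, h] = swapped ? [height, width] : [width, height]
  return w / h
}
```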
2024-10-08 14:45:07 -04:00
Patrik
f2284f3d1b fix: applies resize after cropping if resizeOptions are defined (#8535)
V3 PR [here](https://github.com/payloadcms/payload/pull/8528)
2024-10-08 14:42:10 -04:00
Sasha
1347b6cc36 fix(db-postgres): port many various fixes from 3.0 (#8468)
This ports a number of fixes that are already in 3.0. It also updates
Drizzle to match the beta in order to fix several issues:
https://github.com/payloadcms/payload/issues/4673
https://github.com/payloadcms/payload/issues/6845
https://github.com/payloadcms/payload/issues/6266 (previous Drizzle update
PR: https://github.com/payloadcms/payload/pull/7460/)

Ported PRs:
- https://github.com/payloadcms/payload/pull/6158
- https://github.com/payloadcms/payload/pull/7900
- https://github.com/payloadcms/payload/pull/7962 (includes duplication
fixes for blocks / arrays, using a 2.0-specific approach)
- https://github.com/payloadcms/payload/pull/8355
- https://github.com/payloadcms/payload/pull/8456
- https://github.com/payloadcms/payload/pull/8331 (not in the commits
list, as it was a clean merge)
- https://github.com/payloadcms/payload/pull/8369
- https://github.com/payloadcms/payload/pull/7749
- https://github.com/payloadcms/payload/pull/8539

---------

Co-authored-by: Dan Ribbens <dan.ribbens@gmail.com>
Co-authored-by: James Mikrut <james@payloadcms.com>
2024-10-08 10:57:42 -04:00
Elliot DeNolf
0a56d50334 chore(release): plugin-cloud-storage/1.2.0 [skip ci] 2024-10-08 10:54:04 -04:00
Dan Ribbens
02999a5659 feat(plugin-cloud-storage): add credentials to connect to azure (#7781)
Co-authored-by: Elliot DeNolf <denolfe@gmail.com>
2024-10-08 10:43:59 -04:00
Patrik
365127bee4 docs: clarifies distinction between official and community plugins in docs (#8584)
Updated the plugins overview page to better differentiate between
official Payload plugins and community plugins.

Clarified that only official plugins are maintained and supported by the
Payload team, while community plugins may have varying levels of
support.
2024-10-07 12:24:49 -04:00
Patrik
b67e97aa7f docs: specifies defaultLocale as a required property for localization (#8586) 2024-10-07 12:07:15 -04:00
Thomas Mills
61e8ce1743 fix(richtext-lexical): add target _blank for new-tab in linkFeature (#8571)
Fixes #8569 

Matches the fixes in commit 23df60dba5
(plugin-form-builder) and e0b201c810 (v3)

Adds target="_blank" where the link should be in a new tab
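Illustrative only (the field names are assumptions, not the actual lexical serializer): the attribute is added conditionally when the link is marked to open in a new tab.

```ts
// Hypothetical helper showing the conditional attribute, not the real linkFeature code.
type LinkFields = { url: string; newTab?: boolean }

const linkAttributes = (fields: LinkFields): Record<string, string> => ({
  href: fields.url,
  ...(fields.newTab ? { target: '_blank' } : {}),
})

// linkAttributes({ url: 'https://payloadcms.com', newTab: true })
// -> { href: 'https://payloadcms.com', target: '_blank' }
```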
2024-10-06 23:17:24 -03:00
Chris Bailey
034aa68cd4 docs: fixes typo in website template README (#8565) 2024-10-06 21:32:51 -04:00
Elliot DeNolf
268e6c485e chore(release): db-mongodb/1.7.3 [skip ci] 2024-10-01 23:21:47 -04:00
Elliot DeNolf
4c1a5dca44 chore(release): payload/2.30.1 [skip ci] 2024-10-01 23:20:20 -04:00
dependabot[bot]
a12d1f4755 chore(deps): bump the production-deps group with 16 updates (#8492)
Bumps the production-deps group with 16 updates:

| Package | From | To |
| --- | --- | --- |
| [postcss](https://github.com/postcss/postcss) | `8.4.31` | `8.4.47` |
| [swc-loader](https://github.com/swc-project/pkgs) | `0.2.3` | `0.2.6` |
| [swc-minify-webpack-plugin](https://github.com/guoyunhe/swc-minify-webpack-plugin) | `2.1.2` | `2.1.3` |
| [terser-webpack-plugin](https://github.com/webpack-contrib/terser-webpack-plugin) | `5.3.9` | `5.3.10` |
| [eslint-plugin-react-hooks](https://github.com/facebook/react/tree/HEAD/packages/eslint-plugin-react-hooks) | `4.6.0` | `4.6.2` |
| [@faceless-ui/modal](https://github.com/faceless-ui/modal) | `2.0.1` | `2.0.2` |
| [@faceless-ui/window-info](https://github.com/faceless-ui/window-info) | `2.1.1` | `2.1.2` |
| [body-parser](https://github.com/expressjs/body-parser) | `1.20.2` | `1.20.3` |
| [@types/body-parser](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/body-parser) | `1.19.2` | `1.19.5` |
| [deep-equal](https://github.com/inspect-js/node-deep-equal) | `2.2.2` | `2.2.3` |
| [jsonwebtoken](https://github.com/auth0/node-jsonwebtoken) | `9.0.1` | `9.0.2` |
| [@types/jsonwebtoken](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/jsonwebtoken) | `8.5.9` | `9.0.7` |
| [nodemailer](https://github.com/nodemailer/nodemailer) | `6.9.9` | `6.9.15` |
| [@types/nodemailer](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/nodemailer) | `6.4.14` | `6.4.16` |
| [scheduler](https://github.com/facebook/react/tree/HEAD/packages/scheduler) | `0.23.0` | `0.23.2` |
| [react-error-boundary](https://github.com/bvaughn/react-error-boundary) | `4.0.12` | `4.0.13` |

Updates `postcss` from 8.4.31 to 8.4.47
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/postcss/postcss/releases">postcss's
releases</a>.</em></p>
<blockquote>
<h2>8.4.47</h2>
<ul>
<li>Removed debug code.</li>
</ul>
<h2>8.4.46</h2>
<ul>
<li>Fixed <code>Cannot read properties of undefined (reading
'before')</code>.</li>
</ul>
<h2>8.4.45</h2>
<ul>
<li>Removed unnecessary fix which could lead to infinite loop.</li>
</ul>
<h2>8.4.44</h2>
<ul>
<li>Another way to fix <code>markClean is not a function</code>
error.</li>
</ul>
<h2>8.4.43</h2>
<ul>
<li>Fixed <code>markClean is not a function</code> error.</li>
</ul>
<h2>8.4.42</h2>
<ul>
<li>Fixed CSS syntax error on long minified files (by <a
href="https://github.com/varpstar"><code>@​varpstar</code></a>).</li>
</ul>
<h2>8.4.41</h2>
<ul>
<li>Fixed types (by <a
href="https://github.com/nex3"><code>@​nex3</code></a> and <a
href="https://github.com/querkmachine"><code>@​querkmachine</code></a>).</li>
<li>Cleaned up RegExps (by <a
href="https://github.com/bluwy"><code>@​bluwy</code></a>).</li>
</ul>
<h2>8.4.40</h2>
<ul>
<li>Moved to getter/setter in nodes types to help Sass team (by <a
href="https://github.com/nex3"><code>@​nex3</code></a>).</li>
</ul>
<h2>8.4.39</h2>
<ul>
<li>Fixed <code>CssSyntaxError</code> types (by <a
href="https://github.com/romainmenke"><code>@​romainmenke</code></a>).</li>
</ul>
<h2>8.4.38</h2>
<ul>
<li>Fixed <code>endIndex: 0</code> in errors and warnings (by <a
href="https://github.com/romainmenke"><code>@​romainmenke</code></a>).</li>
</ul>
<h2>8.4.37</h2>
<ul>
<li>Fixed <code>original.column are not numbers</code> error in another
case.</li>
</ul>
<h2>8.4.36</h2>
<ul>
<li>Fixed <code>original.column are not numbers</code> error on broken
previous source map.</li>
</ul>
<h2>8.4.35</h2>
<ul>
<li>Avoid <code>!</code> in <code>node.parent.nodes</code> type.</li>
<li>Allow to pass <code>undefined</code> to node adding method to
simplify types.</li>
</ul>
<h2>8.4.34</h2>
<ul>
<li>Fixed <code>AtRule#nodes</code> type (by <a
href="https://github.com/tim-we"><code>@​tim-we</code></a>).</li>
<li>Cleaned up code (by <a
href="https://github.com/DrKiraDmitry"><code>@​DrKiraDmitry</code></a>).</li>
</ul>
<h2>8.4.33</h2>
<ul>
<li>Fixed <code>NoWorkResult</code> behavior difference with normal mode
(by <a
href="https://github.com/romainmenke"><code>@​romainmenke</code></a>).</li>
<li>Fixed <code>NoWorkResult</code> usage conditions (by <a
href="https://github.com/ahmdammarr"><code>@​ahmdammarr</code></a>).</li>
</ul>
<h2>8.4.32</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/postcss/postcss/blob/main/CHANGELOG.md">postcss's
changelog</a>.</em></p>
<blockquote>
<h2>8.4.47</h2>
<ul>
<li>Removed debug code.</li>
</ul>
<h2>8.4.46</h2>
<ul>
<li>Fixed <code>Cannot read properties of undefined (reading
'before')</code>.</li>
</ul>
<h2>8.4.45</h2>
<ul>
<li>Removed unnecessary fix which could lead to infinite loop.</li>
</ul>
<h2>8.4.44</h2>
<ul>
<li>Another way to fix <code>markClean is not a function</code>
error.</li>
</ul>
<h2>8.4.43</h2>
<ul>
<li>Fixed <code>markClean is not a function</code> error.</li>
</ul>
<h2>8.4.42</h2>
<ul>
<li>Fixed CSS syntax error on long minified files (by <a
href="https://github.com/varpstar"><code>@​varpstar</code></a>).</li>
</ul>
<h2>8.4.41</h2>
<ul>
<li>Fixed types (by <a
href="https://github.com/nex3"><code>@​nex3</code></a> and <a
href="https://github.com/querkmachine"><code>@​querkmachine</code></a>).</li>
<li>Cleaned up RegExps (by <a
href="https://github.com/bluwy"><code>@​bluwy</code></a>).</li>
</ul>
<h2>8.4.40</h2>
<ul>
<li>Moved to getter/setter in nodes types to help Sass team (by <a
href="https://github.com/nex3"><code>@​nex3</code></a>).</li>
</ul>
<h2>8.4.39</h2>
<ul>
<li>Fixed <code>CssSyntaxError</code> types (by <a
href="https://github.com/romainmenke"><code>@​romainmenke</code></a>).</li>
</ul>
<h2>8.4.38</h2>
<ul>
<li>Fixed <code>endIndex: 0</code> in errors and warnings (by <a
href="https://github.com/romainmenke"><code>@​romainmenke</code></a>).</li>
</ul>
<h2>8.4.37</h2>
<ul>
<li>Fixed <code>original.column are not numbers</code> error in another
case.</li>
</ul>
<h2>8.4.36</h2>
<ul>
<li>Fixed <code>original.column are not numbers</code> error on broken
previous source map.</li>
</ul>
<h2>8.4.35</h2>
<ul>
<li>Avoid <code>!</code> in <code>node.parent.nodes</code> type.</li>
<li>Allow to pass <code>undefined</code> to node adding method to
simplify types.</li>
</ul>
<h2>8.4.34</h2>
<ul>
<li>Fixed <code>AtRule#nodes</code> type (by Tim Weißenfels).</li>
<li>Cleaned up code (by Dmitry Kirillov).</li>
</ul>
<h2>8.4.33</h2>
<ul>
<li>Fixed <code>NoWorkResult</code> behavior difference with normal mode
(by Romain Menke).</li>
<li>Fixed <code>NoWorkResult</code> usage conditions (by <a
href="https://github.com/ahmdammarr"><code>@​ahmdammarr</code></a>).</li>
</ul>
<h2>8.4.32</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="5e6fd1302d"><code>5e6fd13</code></a>
Release 8.4.47 version</li>
<li><a
href="714bc10258"><code>714bc10</code></a>
Typo</li>
<li><a
href="439d20e651"><code>439d20e</code></a>
Release 8.4.46 version</li>
<li><a
href="b93582f68e"><code>b93582f</code></a>
Update dependencies</li>
<li><a
href="c51e46767d"><code>c51e467</code></a>
Fix error on inserting node without raws in some cases</li>
<li><a
href="829ae47d6b"><code>829ae47</code></a>
Update dependencies</li>
<li><a
href="5aaaec2214"><code>5aaaec2</code></a>
Update remaining workflow jobs to use latest version of actions (<a
href="https://redirect.github.com/postcss/postcss/issues/1968">#1968</a>)</li>
<li><a
href="448c4f34d6"><code>448c4f3</code></a>
Release 8.4.45 version</li>
<li><a
href="1c77d2e333"><code>1c77d2e</code></a>
Update unnecessary check</li>
<li><a
href="f38b329323"><code>f38b329</code></a>
Try to fix CI</li>
<li>Additional commits viewable in <a
href="https://github.com/postcss/postcss/compare/8.4.31...8.4.47">compare
view</a></li>
</ul>
</details>
<br />

Updates `swc-loader` from 0.2.3 to 0.2.6
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/swc-project/pkgs/commits">compare view</a></li>
</ul>
</details>
<br />

Updates `swc-minify-webpack-plugin` from 2.1.2 to 2.1.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/guoyunhe/swc-minify-webpack-plugin/releases">swc-minify-webpack-plugin's
releases</a>.</em></p>
<blockquote>
<h2>2.1.3</h2>
<ul>
<li>Fixed Buffer data handling</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/guoyunhe/swc-minify-webpack-plugin/blob/main/CHANGELOG.md">swc-minify-webpack-plugin's
changelog</a>.</em></p>
<blockquote>
<h2>2.1.3 - 2024-08-22</h2>
<ul>
<li>Fixed Buffer data handling</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="83ad732d1a"><code>83ad732</code></a>
2.1.3</li>
<li><a
href="d59a7963a6"><code>d59a796</code></a>
changelog</li>
<li><a
href="1a2cea1a82"><code>1a2cea1</code></a>
Merge pull request <a
href="https://redirect.github.com/guoyunhe/swc-minify-webpack-plugin/issues/12">#12</a>
from martinjlowm/fix/pass-string-to-swc</li>
<li><a
href="60e294f610"><code>60e294f</code></a>
Ensure a string is passed to SWC</li>
<li>See full diff in <a
href="https://github.com/guoyunhe/swc-minify-webpack-plugin/compare/v2.1.2...v2.1.3">compare
view</a></li>
</ul>
</details>
<br />

Updates `terser-webpack-plugin` from 5.3.9 to 5.3.10
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/webpack-contrib/terser-webpack-plugin/releases">terser-webpack-plugin's
releases</a>.</em></p>
<blockquote>
<h2>v5.3.10</h2>
<h3><a
href="https://github.com/webpack-contrib/terser-webpack-plugin/compare/v5.3.9...v5.3.10">5.3.10</a>
(2023-12-28)</h3>
<h3>Bug Fixes</h3>
<ul>
<li>bump terser to the latest stable version (<a
href="https://redirect.github.com/webpack-contrib/terser-webpack-plugin/issues/587">#587</a>)
(<a
href="f650fa3ca7">f650fa3</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/webpack-contrib/terser-webpack-plugin/blob/master/CHANGELOG.md">terser-webpack-plugin's
changelog</a>.</em></p>
<blockquote>
<h3><a
href="https://github.com/webpack-contrib/terser-webpack-plugin/compare/v5.3.9...v5.3.10">5.3.10</a>
(2023-12-28)</h3>
<h3>Bug Fixes</h3>
<ul>
<li>bump terser to the latest stable version (<a
href="https://redirect.github.com/webpack-contrib/terser-webpack-plugin/issues/587">#587</a>)
(<a
href="f650fa3ca7">f650fa3</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="c87ade2a08"><code>c87ade2</code></a>
chore(release): 5.3.10</li>
<li><a
href="f650fa3ca7"><code>f650fa3</code></a>
fix: bump terser to the latest stable version (<a
href="https://redirect.github.com/webpack-contrib/terser-webpack-plugin/issues/587">#587</a>)</li>
<li><a
href="0403c772ef"><code>0403c77</code></a>
chore(deps-dev): bump <code>@​babel/traverse</code> from 7.22.17 to
7.23.6 (<a
href="https://redirect.github.com/webpack-contrib/terser-webpack-plugin/issues/586">#586</a>)</li>
<li><a
href="174d197ba8"><code>174d197</code></a>
chore: update dependencies to the latest version (<a
href="https://redirect.github.com/webpack-contrib/terser-webpack-plugin/issues/577">#577</a>)</li>
<li><a
href="1831a49183"><code>1831a49</code></a>
chore: update github action/setup-node (<a
href="https://redirect.github.com/webpack-contrib/terser-webpack-plugin/issues/584">#584</a>)</li>
<li><a
href="25d014707a"><code>25d0147</code></a>
chore: update github actions/checkout (<a
href="https://redirect.github.com/webpack-contrib/terser-webpack-plugin/issues/576">#576</a>)</li>
<li><a
href="fa86955aeb"><code>fa86955</code></a>
chore(deps-dev): bump word-wrap from 1.2.3 to 1.2.5 (<a
href="https://redirect.github.com/webpack-contrib/terser-webpack-plugin/issues/575">#575</a>)</li>
<li><a
href="086767314b"><code>0867673</code></a>
chore: update dependencies to the latest version (<a
href="https://redirect.github.com/webpack-contrib/terser-webpack-plugin/issues/574">#574</a>)</li>
<li><a
href="b8cfb07910"><code>b8cfb07</code></a>
chore: upgrade dependencies to the latest version (<a
href="https://redirect.github.com/webpack-contrib/terser-webpack-plugin/issues/572">#572</a>)</li>
<li><a
href="ce5a518fb0"><code>ce5a518</code></a>
refactor: code (<a
href="https://redirect.github.com/webpack-contrib/terser-webpack-plugin/issues/569">#569</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/webpack-contrib/terser-webpack-plugin/compare/v5.3.9...v5.3.10">compare
view</a></li>
</ul>
</details>
<br />

Updates `eslint-plugin-react-hooks` from 4.6.0 to 4.6.2
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/facebook/react/blob/main/packages/eslint-plugin-react-hooks/CHANGELOG.md">eslint-plugin-react-hooks's
changelog</a>.</em></p>
<blockquote>
<h2>5.0.0 (next release)</h2>
<ul>
<li><strong>New Violations:</strong> Component names now need to start
with an uppercase letter instead of a non-lowercase letter. This means
<code>_Button</code> or <code>_component</code> are no longer valid. (<a
href="https://github.com/kassens"><code>@​kassens</code></a>) in <a
href="https://redirect.github.com/facebook/react/pull/25162">#25162</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/facebook/react/commits/HEAD/packages/eslint-plugin-react-hooks">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~react-bot">react-bot</a>, a new releaser
for eslint-plugin-react-hooks since your current version.</p>
</details>
<br />

Updates `@faceless-ui/modal` from 2.0.1 to 2.0.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/faceless-ui/modal/releases"><code>@​faceless-ui/modal</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v2.0.2</h2>
<h2>What's Changed</h2>
<ul>
<li>chore: adds use client directive by <a
href="https://github.com/jacobsfletch"><code>@​jacobsfletch</code></a>
in <a
href="https://redirect.github.com/faceless-ui/modal/pull/54">faceless-ui/modal#54</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/faceless-ui/modal/compare/v2.0.1...v2.0.2">https://github.com/faceless-ui/modal/compare/v2.0.1...v2.0.2</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="64d66fb950"><code>64d66fb</code></a>
chore: adds use client directive (<a
href="https://redirect.github.com/faceless-ui/modal/issues/54">#54</a>)</li>
<li>See full diff in <a
href="https://github.com/faceless-ui/modal/compare/v2.0.1...v2.0.2">compare
view</a></li>
</ul>
</details>
<br />

Updates `@faceless-ui/window-info` from 2.1.1 to 2.1.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/faceless-ui/window-info/releases"><code>@​faceless-ui/window-info</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v2.1.2</h2>
<h2>What's Changed</h2>
<ul>
<li>chore: adds use client directive by <a
href="https://github.com/jacobsfletch"><code>@​jacobsfletch</code></a>
in <a
href="https://redirect.github.com/faceless-ui/window-info/pull/28">faceless-ui/window-info#28</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/faceless-ui/window-info/compare/v2.1.1...v2.1.2">https://github.com/faceless-ui/window-info/compare/v2.1.1...v2.1.2</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="89eba57c5f"><code>89eba57</code></a>
2.1.2</li>
<li><a
href="caa30698f5"><code>caa3069</code></a>
chore: adds use client directive (<a
href="https://redirect.github.com/faceless-ui/window-info/issues/28">#28</a>)</li>
<li>See full diff in <a
href="https://github.com/faceless-ui/window-info/compare/v2.1.1...v2.1.2">compare
view</a></li>
</ul>
</details>
<br />

Updates `body-parser` from 1.20.2 to 1.20.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/expressjs/body-parser/releases">body-parser's
releases</a>.</em></p>
<blockquote>
<h2>1.20.3</h2>
<h2>What's Changed</h2>
<h3>Important</h3>
<ul>
<li>deps: qs@6.13.0</li>
<li>add <code>depth</code> option to customize the depth level in the
parser</li>
<li><strong>IMPORTANT:</strong> The default <code>depth</code> level for
parsing URL-encoded data is now <code>32</code> (previously was
<code>Infinity</code>). <a
href="1752951367/README.md (depth)">Documentation</a></li>
</ul>
<h3>Other changes</h3>
<ul>
<li>chore: add support for OSSF scorecard reporting by <a
href="https://github.com/inigomarquinez"><code>@​inigomarquinez</code></a>
in <a
href="https://redirect.github.com/expressjs/body-parser/pull/522">expressjs/body-parser#522</a></li>
<li>ci: fix errors in ci github action for node 8 and 9 by <a
href="https://github.com/inigomarquinez"><code>@​inigomarquinez</code></a>
in <a
href="https://redirect.github.com/expressjs/body-parser/pull/523">expressjs/body-parser#523</a></li>
<li>fix: pin to node@22.4.1 by <a
href="https://github.com/wesleytodd"><code>@​wesleytodd</code></a> in <a
href="https://redirect.github.com/expressjs/body-parser/pull/527">expressjs/body-parser#527</a></li>
<li>deps: qs@6.12.3 by <a
href="https://github.com/melikhov-dev"><code>@​melikhov-dev</code></a>
in <a
href="https://redirect.github.com/expressjs/body-parser/pull/521">expressjs/body-parser#521</a></li>
<li>Add OSSF Scorecard badge by <a
href="https://github.com/bjohansebas"><code>@​bjohansebas</code></a> in
<a
href="https://redirect.github.com/expressjs/body-parser/pull/531">expressjs/body-parser#531</a></li>
<li>Linter by <a
href="https://github.com/UlisesGascon"><code>@​UlisesGascon</code></a>
in <a
href="https://redirect.github.com/expressjs/body-parser/pull/534">expressjs/body-parser#534</a></li>
<li>Release: 1.20.3 by <a
href="https://github.com/UlisesGascon"><code>@​UlisesGascon</code></a>
in <a
href="https://redirect.github.com/expressjs/body-parser/pull/535">expressjs/body-parser#535</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/inigomarquinez"><code>@​inigomarquinez</code></a>
made their first contribution in <a
href="https://redirect.github.com/expressjs/body-parser/pull/522">expressjs/body-parser#522</a></li>
<li><a
href="https://github.com/melikhov-dev"><code>@​melikhov-dev</code></a>
made their first contribution in <a
href="https://redirect.github.com/expressjs/body-parser/pull/521">expressjs/body-parser#521</a></li>
<li><a
href="https://github.com/bjohansebas"><code>@​bjohansebas</code></a>
made their first contribution in <a
href="https://redirect.github.com/expressjs/body-parser/pull/531">expressjs/body-parser#531</a></li>
<li><a
href="https://github.com/UlisesGascon"><code>@​UlisesGascon</code></a>
made their first contribution in <a
href="https://redirect.github.com/expressjs/body-parser/pull/534">expressjs/body-parser#534</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/expressjs/body-parser/compare/1.20.2...1.20.3">https://github.com/expressjs/body-parser/compare/1.20.2...1.20.3</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/expressjs/body-parser/blob/master/HISTORY.md">body-parser's
changelog</a>.</em></p>
<blockquote>
<h1>1.20.3 / 2024-09-10</h1>
<ul>
<li>deps: qs@6.13.0</li>
<li>add <code>depth</code> option to customize the depth level in the
parser</li>
<li>IMPORTANT: The default <code>depth</code> level for parsing
URL-encoded data is now <code>32</code> (previously was
<code>Infinity</code>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="1752951367"><code>1752951</code></a>
1.20.3</li>
<li><a
href="39744cfe2a"><code>39744cf</code></a>
chore: linter (<a
href="https://redirect.github.com/expressjs/body-parser/issues/534">#534</a>)</li>
<li><a
href="b2695c4450"><code>b2695c4</code></a>
Merge commit from fork</li>
<li><a
href="ade0f3f82f"><code>ade0f3f</code></a>
add scorecard to readme (<a
href="https://redirect.github.com/expressjs/body-parser/issues/531">#531</a>)</li>
<li><a
href="99a1bd6245"><code>99a1bd6</code></a>
deps: qs@6.12.3 (<a
href="https://redirect.github.com/expressjs/body-parser/issues/521">#521</a>)</li>
<li><a
href="9478591605"><code>9478591</code></a>
fix: pin to node@22.4.1</li>
<li><a
href="83db46a1e5"><code>83db46a</code></a>
ci: fix errors in ci github action for node 8 and 9 (<a
href="https://redirect.github.com/expressjs/body-parser/issues/523">#523</a>)</li>
<li><a
href="9d4e2125b5"><code>9d4e212</code></a>
chore: add support for OSSF scorecard reporting (<a
href="https://redirect.github.com/expressjs/body-parser/issues/522">#522</a>)</li>
<li>See full diff in <a
href="https://github.com/expressjs/body-parser/compare/1.20.2...1.20.3">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~ulisesgascon">ulisesgascon</a>, a new
releaser for body-parser since your current version.</p>
</details>
<br />

Updates `@types/body-parser` from 1.19.2 to 1.19.5
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/body-parser">compare
view</a></li>
</ul>
</details>
<br />

Updates `deep-equal` from 2.2.2 to 2.2.3
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/inspect-js/node-deep-equal/blob/main/CHANGELOG.md">deep-equal's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/inspect-js/node-deep-equal/compare/v2.2.2...v2.2.3">v2.2.3</a>
- 2023-11-08</h2>
<h3>Fixed</h3>
<ul>
<li>[readme] remove performance comment and highlight robustness instead
<a
href="https://redirect.github.com/inspect-js/node-deep-equal/issues/76"><code>[#76](https://github.com/inspect-js/node-deep-equal/issues/76)</code></a>
<a
href="https://redirect.github.com/inspect-js/node-deep-equal/issues/106"><code>[#106](https://github.com/inspect-js/node-deep-equal/issues/106)</code></a></li>
</ul>
<h3>Commits</h3>
<ul>
<li>Merge tag 'v1.1.2' <a
href="c90525fe83"><code>c90525f</code></a></li>
<li>[Tests] port tests from main; only diff should be true/falses <a
href="e02cadb650"><code>e02cadb</code></a></li>
<li>[Dev Deps] update <code>@ljharb/eslint-config</code>,
<code>auto-changelog</code>, <code>aud</code>, <code>eslint</code>,
<code>set-publish-latest</code>, <code>tape</code> <a
href="11bd45b639"><code>11bd45b</code></a></li>
<li>[Tests] update <code>.github</code> from default branch <a
href="58885d3280"><code>58885d3</code></a></li>
<li>[readme] update readme from default branch <a
href="b0bca9a115"><code>b0bca9a</code></a></li>
<li>[Tests] add <code>nyc</code> for coverage <a
href="e25bc3716c"><code>e25bc37</code></a></li>
<li>[readme] update badge URLs, fix line breaking <a
href="1d58c6ecba"><code>1d58c6e</code></a></li>
<li>[Tests] use <code>Buffer.from</code> when available <a
href="f0d4a42fb8"><code>f0d4a42</code></a></li>
<li>[Tests] use <code>has-proto</code> <a
href="0263fb9170"><code>0263fb9</code></a></li>
<li>[Deps] update <code>is-arguments</code>,
<code>is-date-object</code>, <code>is-regex</code>,
<code>object-is</code>, <code>regexp.prototype.flags</code> <a
href="80c15cae82"><code>80c15ca</code></a></li>
<li>[meta] add missing <code>engines.node</code> <a
href="e1d08a818f"><code>e1d08a8</code></a></li>
<li>[meta] use <code>npmignore</code> to autogenerate an npmignore file
<a
href="e0770e594e"><code>e0770e5</code></a></li>
<li>[Deps] update <code>is-date-object</code>, <code>is-regex</code>,
<code>object-is</code>, <code>regexp.prototype.flags</code> <a
href="e4fb8c6459"><code>e4fb8c6</code></a></li>
<li>[Tests] handle ported test failures in iojs v2 <a
href="3798ff4902"><code>3798ff4</code></a></li>
<li>[Deps] update <code>call-bind</code>,
<code>regexp.prototype.flags</code>, <code>which-typed-array</code> <a
href="540e3a119d"><code>540e3a1</code></a></li>
<li>[Dev Deps] update <code>eslint</code>,
<code>@ljharb/eslint-config</code>, <code>tape</code> <a
href="0f8ca7575e"><code>0f8ca75</code></a></li>
<li>[Tests] handle some additional test differences in node &lt;= 0.10
<a
href="197a2203f0"><code>197a220</code></a></li>
<li>[Dev Deps] update <code>object.getownpropertydescriptors</code>,
<code>tape</code> <a
href="21851a62cd"><code>21851a6</code></a></li>
<li>[Dev Deps] update <code>semver</code>, <code>tape</code> <a
href="dd440b2267"><code>dd440b2</code></a></li>
<li>[meta] add missing <code>engines.node</code> <a
href="e158993fcf"><code>e158993</code></a></li>
<li>[meta] update <code>.gitignore</code> from default branch <a
href="6ee186bd39"><code>6ee186b</code></a></li>
<li>[Deps] update <code>get-intrinsic</code> <a
href="6da4b86e4d"><code>6da4b86</code></a></li>
<li>[Dev Deps] update <code>tape</code> <a
href="6ada1ab7f9"><code>6ada1ab</code></a></li>
<li>[Dev Deps] update <code>tape</code> <a
href="270d34b484"><code>270d34b</code></a></li>
<li>[meta] fix URLs <a
href="a269c183bc"><code>a269c18</code></a></li>
<li>[readme] update default branch name <a
href="030a63f40a"><code>030a63f</code></a></li>
<li>[Deps] update <code>which-typed-array</code> <a
href="2f0c327eaa"><code>2f0c327</code></a></li>
<li>[Tests] only use <code>Buffer.from</code> when it has a length of
&gt; 1 <a
href="f7e577622d"><code>f7e5776</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="48d3bb5b7f"><code>48d3bb5</code></a>
v2.2.3</li>
<li><a
href="c90525fe83"><code>c90525f</code></a>
Merge tag 'v1.1.2'</li>
<li><a
href="be5f0362c9"><code>be5f036</code></a>
v1.1.2</li>
<li><a
href="197a2203f0"><code>197a220</code></a>
[Tests] handle some additional test differences in node &lt;= 0.10</li>
<li><a
href="e1d08a818f"><code>e1d08a8</code></a>
[meta] add missing <code>engines.node</code></li>
<li><a
href="e158993fcf"><code>e158993</code></a>
[meta] add missing <code>engines.node</code></li>
<li><a
href="3798ff4902"><code>3798ff4</code></a>
[Tests] handle ported test failures in iojs v2</li>
<li><a
href="6da4b86e4d"><code>6da4b86</code></a>
[Deps] update <code>get-intrinsic</code></li>
<li><a
href="6ada1ab7f9"><code>6ada1ab</code></a>
[Dev Deps] update <code>tape</code></li>
<li><a
href="e02cadb650"><code>e02cadb</code></a>
[Tests] port tests from main; only diff should be true/falses</li>
<li>Additional commits viewable in <a
href="https://github.com/inspect-js/node-deep-equal/compare/v2.2.2...v2.2.3">compare
view</a></li>
</ul>
</details>
<br />

Updates `jsonwebtoken` from 9.0.1 to 9.0.2
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/auth0/node-jsonwebtoken/blob/master/CHANGELOG.md">jsonwebtoken's
changelog</a>.</em></p>
<blockquote>
<h2>9.0.2 - 2023-08-30</h2>
<ul>
<li>security: updating semver to 7.5.4 to resolve CVE-2022-25883, closes
<a
href="https://redirect.github.com/auth0/node-jsonwebtoken/issues/921">#921</a>.</li>
<li>refactor: reduce library size by using lodash specific dependencies,
closes <a
href="https://redirect.github.com/auth0/node-jsonwebtoken/issues/878">#878</a>.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="bc28861f1f"><code>bc28861</code></a>
Release 9.0.2 (<a
href="https://redirect.github.com/auth0/node-jsonwebtoken/issues/935">#935</a>)</li>
<li><a
href="96b89060cf"><code>96b8906</code></a>
refactor: use specific lodash packages (<a
href="https://redirect.github.com/auth0/node-jsonwebtoken/issues/933">#933</a>)</li>
<li><a
href="ed35062239"><code>ed35062</code></a>
security: Updating semver to 7.5.4 to resolve CVE-2022-25883 (<a
href="https://redirect.github.com/auth0/node-jsonwebtoken/issues/932">#932</a>)</li>
<li>See full diff in <a
href="https://github.com/auth0/node-jsonwebtoken/compare/v9.0.1...v9.0.2">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~charlesrea">charlesrea</a>, a new releaser
for jsonwebtoken since your current version.</p>
</details>
<br />

Updates `@types/jsonwebtoken` from 8.5.9 to 9.0.7
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/jsonwebtoken">compare
view</a></li>
</ul>
</details>
<br />

Updates `nodemailer` from 6.9.9 to 6.9.15
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/nodemailer/nodemailer/releases">nodemailer's
releases</a>.</em></p>
<blockquote>
<h2>v6.9.15</h2>
<h2><a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.14...v6.9.15">6.9.15</a>
(2024-08-08)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Fix memory leak (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1667">#1667</a>)
(<a
href="baa28f6596">baa28f6</a>)</li>
<li><strong>mime:</strong> Added GeoJSON closes <a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1637">#1637</a>
(<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1665">#1665</a>)
(<a
href="79b8293ad5">79b8293</a>)</li>
</ul>
<h2>v6.9.14</h2>
<h2><a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.13...v6.9.14">6.9.14</a>
(2024-06-19)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>api:</strong> Added support for Ethereal authentication (<a
href="56b22052a9">56b2205</a>)</li>
<li><strong>services.json:</strong> Add Email Services Provider Feishu
Mail (CN) (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1648">#1648</a>)
(<a
href="e9e9ecc99b">e9e9ecc</a>)</li>
<li><strong>services.json:</strong> update Mailtrap host and port in
well known (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1652">#1652</a>)
(<a
href="fc2c9ea0b4">fc2c9ea</a>)</li>
<li><strong>well-known-services:</strong> Add Loopia in well known
services (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1655">#1655</a>)
(<a
href="21a28a18fc">21a28a1</a>)</li>
</ul>
<h2>v6.9.13</h2>
<h2><a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.12...v6.9.13">6.9.13</a>
(2024-03-20)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>tls:</strong> Ensure servername for SMTP (<a
href="d66fdd3dcc">d66fdd3</a>)</li>
</ul>
<h2>v6.9.12</h2>
<h2><a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.11...v6.9.12">6.9.12</a>
(2024-03-08)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>message-generation:</strong> Escape single quote in address
names (<a
href="4ae5fadeaa">4ae5fad</a>)</li>
</ul>
<h2>v6.9.11</h2>
<h2><a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.10...v6.9.11">6.9.11</a>
(2024-02-29)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>headers:</strong> Ensure that Content-type is the bottom
header (<a
href="c7cf97e5ec">c7cf97e</a>)</li>
</ul>
<h2>v6.9.10</h2>
<h2><a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.9...v6.9.10">6.9.10</a>
(2024-02-22)</h2>
<h3>Bug Fixes</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/nodemailer/nodemailer/blob/master/CHANGELOG.md">nodemailer's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.14...v6.9.15">6.9.15</a>
(2024-08-08)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Fix memory leak (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1667">#1667</a>)
(<a
href="baa28f6596">baa28f6</a>)</li>
<li><strong>mime:</strong> Added GeoJSON closes <a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1637">#1637</a>
(<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1665">#1665</a>)
(<a
href="79b8293ad5">79b8293</a>)</li>
</ul>
<h2><a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.13...v6.9.14">6.9.14</a>
(2024-06-19)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>api:</strong> Added support for Ethereal authentication (<a
href="56b22052a9">56b2205</a>)</li>
<li><strong>services.json:</strong> Add Email Services Provider Feishu
Mail (CN) (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1648">#1648</a>)
(<a
href="e9e9ecc99b">e9e9ecc</a>)</li>
<li><strong>services.json:</strong> update Mailtrap host and port in
well known (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1652">#1652</a>)
(<a
href="fc2c9ea0b4">fc2c9ea</a>)</li>
<li><strong>well-known-services:</strong> Add Loopia in well known
services (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1655">#1655</a>)
(<a
href="21a28a18fc">21a28a1</a>)</li>
</ul>
<h2><a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.12...v6.9.13">6.9.13</a>
(2024-03-20)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>tls:</strong> Ensure servername for SMTP (<a
href="d66fdd3dcc">d66fdd3</a>)</li>
</ul>
<h2><a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.11...v6.9.12">6.9.12</a>
(2024-03-08)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>message-generation:</strong> Escape single quote in address
names (<a
href="4ae5fadeaa">4ae5fad</a>)</li>
</ul>
<h2><a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.10...v6.9.11">6.9.11</a>
(2024-02-29)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>headers:</strong> Ensure that Content-type is the bottom
header (<a
href="c7cf97e5ec">c7cf97e</a>)</li>
</ul>
<h2><a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.9...v6.9.10">6.9.10</a>
(2024-02-22)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>data-uri:</strong> Do not use regular expressions for
parsing data URI schemes (<a
href="12e65e975d">12e65e9</a>)</li>
<li><strong>data-uri:</strong> Moved all data-uri regexes to use the
non-regex parseDataUri method (<a
href="edd5dfe5ce">edd5dfe</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="81de9ebaeb"><code>81de9eb</code></a>
chore(master): release 6.9.15 [skip-ci] (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1668">#1668</a>)</li>
<li><a
href="79b8293ad5"><code>79b8293</code></a>
fix(mime): Added GeoJSON closes <a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1637">#1637</a>
(<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1665">#1665</a>)</li>
<li><a
href="baa28f6596"><code>baa28f6</code></a>
fix: Fix memory leak (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1667">#1667</a>)</li>
<li><a
href="f9a92ed5cb"><code>f9a92ed</code></a>
chore(master): release 6.9.14 [skip-ci] (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1649">#1649</a>)</li>
<li><a
href="56b22052a9"><code>56b2205</code></a>
fix(api): Added support for Ethereal authentication</li>
<li><a
href="21a28a18fc"><code>21a28a1</code></a>
fix(well-known-services): Add Loopia in well known services (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1655">#1655</a>)</li>
<li><a
href="fc2c9ea0b4"><code>fc2c9ea</code></a>
fix(services.json): update Mailtrap host and port in well known (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1652">#1652</a>)</li>
<li><a
href="e9e9ecc99b"><code>e9e9ecc</code></a>
fix(services.json): Add Email Services Provider Feishu Mail (CN) (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1648">#1648</a>)</li>
<li><a
href="fa63b52d8a"><code>fa63b52</code></a>
chore(master): release 6.9.13 [skip-ci] (<a
href="https://redirect.github.com/nodemailer/nodemailer/issues/1635">#1635</a>)</li>
<li><a
href="ea0d32f114"><code>ea0d32f</code></a>
Merge branch 'master' of github.com:nodemailer/nodemailer</li>
<li>Additional commits viewable in <a
href="https://github.com/nodemailer/nodemailer/compare/v6.9.9...v6.9.15">compare
view</a></li>
</ul>
</details>
<br />

Updates `@types/nodemailer` from 6.4.14 to 6.4.16
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/nodemailer">compare
view</a></li>
</ul>
</details>
<br />

Updates `scheduler` from 0.23.0 to 0.23.2
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/facebook/react/commits/HEAD/packages/scheduler">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~react-bot">react-bot</a>, a new releaser
for scheduler since your current version.</p>
</details>
<br />

Updates `react-error-boundary` from 4.0.12 to 4.0.13
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/bvaughn/react-error-boundary/releases">react-error-boundary's
releases</a>.</em></p>
<blockquote>
<h2>4.0.13</h2>
<p>Removed references to ESLint config <code>kcd-scripts</code> from
<code>package.json</code></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="15f1ba2868"><code>15f1ba2</code></a>
Update README.md (<a
href="https://redirect.github.com/bvaughn/react-error-boundary/issues/180">#180</a>)</li>
<li><a
href="ed6d112ce8"><code>ed6d112</code></a>
ci(eslint): use eslint+prettier with ci strictly (<a
href="https://redirect.github.com/bvaughn/react-error-boundary/issues/165">#165</a>)</li>
<li>See full diff in <a
href="https://github.com/bvaughn/react-error-boundary/compare/4.0.12...4.0.13">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Elliot DeNolf <denolfe@gmail.com>
2024-09-30 21:14:00 -04:00
Elliot DeNolf
d55be73992 chore(dependabot): exclude drizzle packages 2024-09-30 16:06:43 -04:00
Elliot DeNolf
b9f236ae50 chore(deps): bump nodemailer (#8453)
Bumped nodemailer to latest
2024-09-30 16:00:18 -04:00
Dan Ribbens
1d38e6d5d5 fix: sorting by id incorrectly orders by version.id (#8450)
Same as fix in beta https://github.com/payloadcms/payload/pull/8442
2024-09-30 13:24:15 -04:00
Elliot DeNolf
2f3c994cea chore(dependabot): add weekly bump for patch versions on main 2024-09-27 23:17:06 -04:00
Patrik
0586f236bb fix: properly filters out number field values with the exists operator filter (#8415)
Fixes #8181
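A usage sketch of the query shape this affects (the collection and field are made up; assumes `payload` has already been initialized via `payload.init()`):

```ts
import payload from 'payload' // assumes payload.init() has already run

// Hypothetical query: `exists: true` should match number fields that hold a
// value (including 0) and exclude documents where the field is null/undefined.
const results = await payload.find({
  collection: 'products', // made-up collection slug
  where: {
    price: { exists: true }, // `price` is a hypothetical number field
  },
})
console.log(results.totalDocs)
```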
2024-09-27 21:53:46 -04:00
Elliot DeNolf
d582619ead chore(release): payload/2.30.0 [skip ci] 2024-09-27 12:33:19 -04:00
Paul
17fc2d13d0 chore: export toast from react toastify in payload (#8438)
Export `toast` from `react-toastify` directly, to avoid situations
where there could be a module mismatch when trying to use `toast` in
custom components.

This will make toast usable from

```ts
import { toast } from 'payload/components/elements'
```
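For example, a custom admin component could then call it like this (a usage sketch, assuming the export above):

```ts
import { toast } from 'payload/components/elements'

// Usage sketch inside a custom admin component's event handler.
const onSaved = (): void => {
  toast.success('Document saved')
}
```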
2024-09-27 09:44:58 -06:00
Elliot DeNolf
800ffd2611 ci: exclude 'status: awaiting-reply' from issue locking 2024-09-25 13:22:58 -04:00
78 changed files with 6086 additions and 1132 deletions


@@ -2,25 +2,6 @@ name: Bug Report v3
 description: Create a bug report for Payload v3 (beta)
 labels: ['status: needs-triage', 'v3']
 body:
-  - type: input
-    id: reproduction-link
-    attributes:
-      label: Link to reproduction
-      description: Want us to look into your issue faster? Follow the [reproduction-guide](https://github.com/payloadcms/payload/blob/main/.github/reproduction-guide.md) for more information.
-    validations:
-      required: false
-  - type: textarea
-    attributes:
-      label: Environment Info
-      description: Paste output from `pnpm payload info` (>= beta.92) _or_ Payload, Node.js, and Next.js versions.
-      render: text
-      placeholder: |
-        Payload:
-        Node.js:
-        Next.js:
-    validations:
-      required: true
   - type: textarea
     attributes:
@@ -28,6 +9,16 @@ body:
     validations:
       required: true
+  - type: input
+    id: reproduction-link
+    attributes:
+      label: Link to the code that reproduces this issue
+      description: >-
+        Required: Please provide a link to your reproduction. Note, if the URL is invalid (404 or a private repository), we may close the issue.
+        Either use `npx create-payload-app@beta -t blank` or follow the [reproduction-guide](https://github.com/payloadcms/payload/blob/main/.github/reproduction-guide.md) for more information.
+    validations:
+      required: true
   - type: textarea
     attributes:
       label: Reproduction Steps
@@ -35,11 +26,44 @@ body:
     validations:
       required: true
-  - type: input
-    id: adapters-plugins
-    attributes:
-      label: Adapters and Plugins
-      description: What adapters and plugins are you using if relevant? ie. db-mongodb, db-postgres, storage-vercel-blob, etc.
+  - type: dropdown
+    attributes:
+      label: Which area(s) are affected? (Select all that apply)
+      multiple: true
+      options:
+        - 'Not sure'
+        - 'area: core'
+        - 'area: templates'
+        - 'area: ui'
+        - 'db-mongodb'
+        - 'db-postgres'
+        - 'db-sqlite'
+        - 'db-vercel-postgres'
+        - 'plugin: cloud'
+        - 'plugin: cloud-storage'
+        - 'plugin: form-builder'
+        - 'plugin: nested-docs'
+        - 'plugin: richtext-lexical'
+        - 'plugin: richtext-slate'
+        - 'plugin: search'
+        - 'plugin: sentry'
+        - 'plugin: seo'
+        - 'plugin: stripe'
+        - 'plugin: other'
+    validations:
+      required: true
+  - type: textarea
+    attributes:
+      label: Environment Info
+      description: Paste output from `pnpm payload info` (>= beta.92) _or_ Payload, Node.js, and Next.js versions.
+      render: bash
+      placeholder: |
+        Payload:
+        Node.js:
+        Next.js:
+    validations:
+      required: true
   - type: markdown
     attributes:


@@ -0,0 +1,18 @@
We cannot recreate the issue with the provided information. **Please add a reproduction in order for us to be able to investigate.**
### Why was this issue marked with the `invalid-reproduction` label?
To be able to investigate, we need access to a reproduction to identify what triggered the issue. We prefer a link to a public GitHub repository created with `create-payload-app@beta -t blank` or a forked/branched version of this repository with tests added (more info in the [reproduction-guide](https://github.com/payloadcms/payload/blob/main/.github/reproduction-guide.md)).
To make sure the issue is resolved as quickly as possible, please make sure that the reproduction is as **minimal** as possible. This means that you should **remove unnecessary code, files, and dependencies** that do not contribute to the issue. Ensure your reproduction does not depend on secrets, 3rd party registries, private dependencies, or any other data that cannot be made public. Avoid a reproduction including a whole monorepo (unless relevant to the issue). The easier it is to reproduce the issue, the quicker we can help.
Please test your reproduction against the latest version of Payload to make sure your issue has not already been fixed.
### I added a link, why was it still marked?
Ensure the link is pointing to a codebase that is accessible (e.g. not a private repository). "[example.com](http://example.com/)", "n/a", "will add later", etc. are not acceptable links -- we need to see a public codebase. See the above section for accepted links.
### Useful Resources
- [Reproduction Guide](https://github.com/payloadcms/payload/blob/main/.github/reproduction-guide.md)
- [Contributing to Payload](https://www.youtube.com/watch?v=08Qa3ggR9rw)


@@ -31,17 +31,44 @@ updates:
     labels:
       - dependencies
     groups:
-      production:
+      production-deps:
         dependency-type: production
         update-types:
           - minor
           - patch
         patterns:
           - '*'
-      dev:
+        exclude-patterns:
+          - 'drizzle*'
+      dev-deps:
         dependency-type: development
         update-types:
           - minor
           - patch
         patterns:
           - '*'
+        exclude-patterns:
+          - 'drizzle*'
+  # Only bump patch versions for 2.x
+  - package-ecosystem: npm
+    directory: /
+    target-branch: main
+    schedule:
+      interval: weekly
+      day: sunday
+      timezone: America/Detroit
+      time: '06:00'
+    commit-message:
+      prefix: 'chore(deps)'
+    labels:
+      - dependencies
+    groups:
+      production-deps:
+        dependency-type: production
+        update-types:
+          - patch
+        patterns:
+          - '*'
+        exclude-patterns:
+          - 'drizzle*'


@@ -18,6 +18,7 @@ jobs:
     with:
       process-only: 'issues'
       issue-inactive-days: '1'
+      exclude-any-issue-labels: 'status: awaiting-reply'
       log-output: true
       issue-comment: >
         This issue has been automatically locked.

.github/workflows/triage.yml (new file, 29 additions)

@@ -0,0 +1,29 @@
name: triage

on:
  issues:
    types:
      - opened

env:
  GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

permissions:
  issues: write
jobs:
  triage:
    name: nissuer
    if: false # Disable after adjusting scenarios which this should be applied
    runs-on: ubuntu-latest
    steps:
      - uses: balazsorban44/nissuer@1.10.0
        with:
          label-area-prefix: ""
          label-area-match: "name"
          label-area-section: 'Which area\(s\) are affected\? \(Select all that apply\)(.*)### Environment Info'
          reproduction-comment: '.github/comments/invalid-reproduction.md'
          reproduction-blocklist: 'github.com/\\w*/?$,github.com$'
          reproduction-link-section: '### Link to the code that reproduces this issue(.*)### Reproduction Steps'
          reproduction-invalid-label: 'invalid-reproduction'
          reproduction-issue-labels: 'status: needs-triage,'

@@ -1,3 +1,34 @@
## [2.30.3](https://github.com/payloadcms/payload/compare/v2.30.2...v2.30.3) (2024-10-18)
### Bug Fixes
* **db-postgres:** migrate:create errors with previous schemas ([#8786](https://github.com/payloadcms/payload/issues/8786)) ([e9c1222](https://github.com/payloadcms/payload/commit/e9c12221824a9a180991722135d22ff91d07ef11))
* duplicate with select hasMany fields ([#8734](https://github.com/payloadcms/payload/issues/8734)) ([c8ed645](https://github.com/payloadcms/payload/commit/c8ed6454a733bea09ae620517c4894701999e119))
## [2.30.2](https://github.com/payloadcms/payload/compare/v2.30.1...v2.30.2) (2024-10-17)
### Bug Fixes
* applies resize after cropping if `resizeOptions` are defined ([#8535](https://github.com/payloadcms/payload/issues/8535)) ([f2284f3](https://github.com/payloadcms/payload/commit/f2284f3d1b420c543d9eee9929f961db4cbef8a1))
* calculates correct aspect ratio dimensions on sharp based files ([#8510](https://github.com/payloadcms/payload/issues/8510)) ([9d05b82](https://github.com/payloadcms/payload/commit/9d05b82dc67b967e55176b92f7b20d4486883201)), closes [#8317](https://github.com/payloadcms/payload/issues/8317)
* **db-postgres:** build indexes for relationships ([#8446](https://github.com/payloadcms/payload/issues/8446)) ([d05e3b0](https://github.com/payloadcms/payload/commit/d05e3b0411c2e705bd7a26e93ba55f2b7532e41b))
* **db-postgres:** port many various fixes from 3.0 ([#8468](https://github.com/payloadcms/payload/issues/8468)) ([1347b6c](https://github.com/payloadcms/payload/commit/1347b6cc36c33043755b4e31d7731ba19b8c985f))
* **db-postgres:** select hasMany nested to array + tab/group ([#8739](https://github.com/payloadcms/payload/issues/8739)) ([0efc610](https://github.com/payloadcms/payload/commit/0efc6102104729a162912157a7e34293ec9764a5))
* **richtext-lexical:** add target _blank for new-tab in linkFeature ([#8571](https://github.com/payloadcms/payload/issues/8571)) ([61e8ce1](https://github.com/payloadcms/payload/commit/61e8ce17439301cb16aee92887e97a69cac4e044)), closes [#8569](https://github.com/payloadcms/payload/issues/8569)
## [2.30.1](https://github.com/payloadcms/payload/compare/v2.30.0...v2.30.1) (2024-10-02)
### Bug Fixes
* **db-mongodb:** properly filters out `number` field values with the `exists` operator filter ([#8415](https://github.com/payloadcms/payload/issues/8415)) ([0586f23](https://github.com/payloadcms/payload/commit/0586f236bbf04163a0d9b226772849cb3d977864)), closes [#8181](https://github.com/payloadcms/payload/issues/8181)
* sorting by id incorrectly orders by version.id ([#8450](https://github.com/payloadcms/payload/issues/8450)) ([1d38e6d](https://github.com/payloadcms/payload/commit/1d38e6d5d5b56a91aa8f59a461d40f28b1750f8c))
## [2.30.0](https://github.com/payloadcms/payload/compare/v2.29.0...v2.30.0) (2024-09-27)
* export toast from react toastify in payload ([#8438](https://github.com/payloadcms/payload/issues/8438)) ([17fc2d1](https://github.com/payloadcms/payload/commit/17fc2d13d06b6de01f839c27fd706bc0d6a185eb))
## [2.29.0](https://github.com/payloadcms/payload/compare/v2.28.0...v2.29.0) (2024-09-25)


@@ -99,6 +99,10 @@ If you want to add contributions to this repository, please follow the instructi
The [Examples Directory](./examples) is a great resource for learning how to setup Payload in a variety of different ways, but you can also find great examples in our blog and throughout our social media. The [Examples Directory](./examples) is a great resource for learning how to setup Payload in a variety of different ways, but you can also find great examples in our blog and throughout our social media.
If you'd like to run the examples, you can either copy them to a folder outside this repo or run them directly by (1) navigating to the example's subfolder (`cd examples/your-example-folder`) and (2) using the `--ignore-workspace` flag to bypass workspace restrictions (e.g., `pnpm --ignore-workspace install` or `pnpm --ignore-workspace dev`).
You can see more examples at:
- [Examples Directory](./examples)
- [Payload Blog](https://payloadcms.com/blog)
- [Payload YouTube](https://www.youtube.com/@payloadcms)

View File

@@ -24,8 +24,8 @@ export default buildConfig({
    // collections go here
  ],
  localization: {
    locales: ['en', 'es', 'de'], // required
    defaultLocale: 'en', // required
    fallback: true,
  },
})
@@ -54,7 +54,7 @@ export default buildConfig({
        rtl: true,
      },
    ],
    defaultLocale: 'en', // required
    fallback: true,
  },
})
@@ -87,7 +87,7 @@ export default buildConfig({
        code: 'nb',
      },
    ],
    defaultLocale: 'en', // required
    fallback: true,
  },
})

View File

@@ -144,6 +144,10 @@ export default addLastModified
### Available Plugins
You can discover existing plugins by browsing the `payload-plugin` topic on [GitHub](https://github.com/topics/payload-plugin). Payload supports both official plugins, maintained by the Payload team, and community plugins, developed by external contributors.
You can discover existing plugins by browsing the `payload-plugin` topic on [GitHub](https://github.com/topics/payload-plugin). These plugins offer a wide range of functionality. Some are maintained by the Payload team, while others are community-built. While we encourage users to explore them, please note that only official plugins are maintained and supported by the Payload team. For community plugins, support may vary as they are developed and maintained independently.
For maintainers building plugins for others to use, please add the topic to help others find it. If you would like one to be built by the core Payload team, [open a Feature Request](https://github.com/payloadcms/payload/discussions) in our GitHub Discussions board. We would be happy to review your code and maybe feature you and your plugin where appropriate.
For a list of official plugins, check the [Payload monorepo](https://github.com/payloadcms/payload/tree/main/packages).

View File

@@ -68,7 +68,7 @@
"copyfiles": "2.4.1", "copyfiles": "2.4.1",
"cross-env": "7.0.3", "cross-env": "7.0.3",
"dotenv": "8.6.0", "dotenv": "8.6.0",
"drizzle-orm": "0.29.3", "drizzle-orm": "0.32.1",
"express": "4.18.2", "express": "4.18.2",
"form-data": "3.0.1", "form-data": "3.0.1",
"fs-extra": "10.1.0", "fs-extra": "10.1.0",

View File

@@ -33,15 +33,15 @@
"md5": "2.3.0", "md5": "2.3.0",
"mini-css-extract-plugin": "1.6.2", "mini-css-extract-plugin": "1.6.2",
"path-browserify": "1.0.1", "path-browserify": "1.0.1",
"postcss": "8.4.31", "postcss": "8.4.47",
"postcss-loader": "6.2.1", "postcss-loader": "6.2.1",
"postcss-preset-env": "9.0.0", "postcss-preset-env": "9.0.0",
"process": "0.11.10", "process": "0.11.10",
"sass-loader": "12.6.0", "sass-loader": "12.6.0",
"style-loader": "^2.0.0", "style-loader": "^2.0.0",
"swc-loader": "^0.2.3", "swc-loader": "^0.2.6",
"swc-minify-webpack-plugin": "^2.1.0", "swc-minify-webpack-plugin": "^2.1.3",
"terser-webpack-plugin": "^5.3.6", "terser-webpack-plugin": "^5.3.10",
"url-loader": "4.1.1", "url-loader": "4.1.1",
"webpack": "^5.78.0", "webpack": "^5.78.0",
"webpack-bundle-analyzer": "^4.8.0", "webpack-bundle-analyzer": "^4.8.0",

View File

@@ -1,6 +1,6 @@
{ {
"name": "@payloadcms/db-mongodb", "name": "@payloadcms/db-mongodb",
"version": "1.7.2", "version": "1.7.3",
"description": "The officially supported MongoDB database adapter for Payload", "description": "The officially supported MongoDB database adapter for Payload",
"repository": { "repository": {
"type": "git", "type": "git",

View File

@@ -55,6 +55,30 @@ const handleNonHasManyValues = (formattedValue, operator, path) => {
} }
} }
const buildExistsQuery = (formattedValue, path) => {
if (formattedValue) {
return {
rawQuery: {
$and: [
{ [path]: { $exists: true } },
{ [path]: { $ne: null } },
{ [path]: { $ne: '' } }, // Exclude null and empty string
],
},
}
} else {
return {
rawQuery: {
$or: [
{ [path]: { $exists: false } },
{ [path]: { $eq: null } },
{ [path]: { $eq: '' } }, // Treat empty string as null / undefined
],
},
}
}
}
export const sanitizeQueryValue = ({ export const sanitizeQueryValue = ({
field, field,
hasCustomID, hasCustomID,
@@ -102,8 +126,16 @@ export const sanitizeQueryValue = ({
} }
} }
if (field.type === 'number' && typeof formattedValue === 'string') { if (field.type === 'number') {
formattedValue = Number(val) if (typeof formattedValue === 'string' && operator !== 'exists') {
formattedValue = Number(val)
}
if (operator === 'exists') {
formattedValue = val === 'true' ? true : val === 'false' ? false : Boolean(val)
return buildExistsQuery(formattedValue, path)
}
} }
if (field.type === 'date' && typeof val === 'string' && operator !== 'exists') { if (field.type === 'date' && typeof val === 'string' && operator !== 'exists') {
@@ -193,27 +225,7 @@ export const sanitizeQueryValue = ({
if (operator === 'exists') { if (operator === 'exists') {
formattedValue = formattedValue === 'true' || formattedValue === true formattedValue = formattedValue === 'true' || formattedValue === true
if (formattedValue) { return buildExistsQuery(formattedValue, path)
return {
rawQuery: {
$and: [
{ [path]: { $exists: true } },
{ [path]: { $ne: null } },
{ [path]: { $ne: '' } },
],
},
}
} else {
return {
rawQuery: {
$or: [
{ [path]: { $exists: false } },
{ [path]: { $eq: null } },
{ [path]: { $eq: '' } }, // Treat empty string as null / undefined
],
},
}
}
} }
} }
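For context, a quick illustration of what the shared `buildExistsQuery` helper above returns; `price` is an assumed field path used only for this example. The `true` branch also rules out `null` and empty strings, while the `false` branch treats them as missing values.

```ts
// Hypothetical usage; `price` is an assumed field path on a collection.
const existsTrue = buildExistsQuery(true, 'price')
// -> { rawQuery: { $and: [ { price: { $exists: true } }, { price: { $ne: null } }, { price: { $ne: '' } } ] } }

const existsFalse = buildExistsQuery(false, 'price')
// -> { rawQuery: { $or: [ { price: { $exists: false } }, { price: { $eq: null } }, { price: { $eq: '' } } ] } }
```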

View File

@@ -1,6 +1,6 @@
{ {
"name": "@payloadcms/db-postgres", "name": "@payloadcms/db-postgres",
"version": "0.8.7", "version": "0.8.9",
"description": "The officially supported Postgres database adapter for Payload", "description": "The officially supported Postgres database adapter for Payload",
"repository": { "repository": {
"type": "git", "type": "git",
@@ -26,12 +26,12 @@
"dependencies": { "dependencies": {
"@libsql/client": "^0.3.1", "@libsql/client": "^0.3.1",
"console-table-printer": "2.11.2", "console-table-printer": "2.11.2",
"drizzle-kit": "0.20.14-1f2c838", "drizzle-kit": "0.23.2-df9e596",
"drizzle-orm": "0.29.3", "drizzle-orm": "0.32.1",
"pg": "8.11.3", "pg": "8.11.3",
"prompts": "2.4.2", "prompts": "2.4.2",
"to-snake-case": "1.0.0", "to-snake-case": "1.0.0",
"uuid": "9.0.0" "uuid": "10.0.0"
}, },
"devDependencies": { "devDependencies": {
"@payloadcms/eslint-config": "workspace:*", "@payloadcms/eslint-config": "workspace:*",

View File

@@ -1,14 +1,14 @@
import type { Payload } from 'payload' import type { Payload } from 'payload'
import type { Connect } from 'payload/database' import type { Connect } from 'payload/database'
import { eq, sql } from 'drizzle-orm' import { sql } from 'drizzle-orm'
import { drizzle } from 'drizzle-orm/node-postgres' import { drizzle } from 'drizzle-orm/node-postgres'
import { numeric, timestamp, varchar } from 'drizzle-orm/pg-core'
import { Pool } from 'pg' import { Pool } from 'pg'
import prompts from 'prompts'
import type { PostgresAdapter } from './types' import type { PostgresAdapter } from './types'
import { pushDevSchema } from './utilities/pushDevSchema'
const connectWithReconnect = async function ({ const connectWithReconnect = async function ({
adapter, adapter,
payload, payload,
@@ -48,6 +48,7 @@ const connectWithReconnect = async function ({
export const connect: Connect = async function connect(this: PostgresAdapter, payload) { export const connect: Connect = async function connect(this: PostgresAdapter, payload) {
this.schema = { this.schema = {
pgSchema: this.pgSchema,
...this.tables, ...this.tables,
...this.relations, ...this.relations,
...this.enums, ...this.enums,
@@ -77,76 +78,10 @@ export const connect: Connect = async function connect(this: PostgresAdapter, pa
// Only push schema if not in production // Only push schema if not in production
if ( if (
process.env.NODE_ENV === 'production' || process.env.NODE_ENV !== 'production' &&
process.env.PAYLOAD_MIGRATING === 'true' || process.env.PAYLOAD_MIGRATING !== 'true' &&
this.push === false this.push !== false
) ) {
return await pushDevSchema(this)
const { pushSchema } = require('drizzle-kit/payload')
// This will prompt if clarifications are needed for Drizzle to push new schema
const { apply, hasDataLoss, statementsToExecute, warnings } = await pushSchema(
this.schema,
this.drizzle,
)
if (warnings.length) {
let message = `Warnings detected during schema push: \n\n${warnings.join('\n')}\n\n`
if (hasDataLoss) {
message += `DATA LOSS WARNING: Possible data loss detected if schema is pushed.\n\n`
}
message += `Accept warnings and push schema to database?`
const { confirm: acceptWarnings } = await prompts(
{
name: 'confirm',
type: 'confirm',
initial: false,
message,
},
{
onCancel: () => {
process.exit(0)
},
},
)
// Exit if user does not accept warnings.
// Q: Is this the right type of exit for this interaction?
if (!acceptWarnings) {
process.exit(0)
}
}
await apply()
// Migration table def in order to use query using drizzle
const migrationsSchema = this.pgSchema.table('payload_migrations', {
name: varchar('name'),
batch: numeric('batch'),
created_at: timestamp('created_at'),
updated_at: timestamp('updated_at'),
})
const devPush = await this.drizzle
.select()
.from(migrationsSchema)
.where(eq(migrationsSchema.batch, '-1'))
if (!devPush.length) {
await this.drizzle.insert(migrationsSchema).values({
name: 'dev',
batch: '-1',
})
} else {
await this.drizzle
.update(migrationsSchema)
.set({
updated_at: new Date(),
})
.where(eq(migrationsSchema.batch, '-1'))
} }
} }

View File

@@ -1,7 +1,7 @@
import type { Count } from 'payload/database' import type { Count } from 'payload/database'
import type { SanitizedCollectionConfig } from 'payload/types' import type { SanitizedCollectionConfig } from 'payload/types'
import { sql } from 'drizzle-orm' import { sql, count as sqlCount } from 'drizzle-orm'
import toSnakeCase from 'to-snake-case' import toSnakeCase from 'to-snake-case'
import type { ChainedMethods } from './find/chainMethods' import type { ChainedMethods } from './find/chainMethods'
@@ -51,8 +51,11 @@ export const count: Count = async function count(
methods: selectCountMethods, methods: selectCountMethods,
query: db query: db
.select({ .select({
count: sql<number>`count count:
(DISTINCT ${this.tables[tableName].id})`, selectCountMethods.length > 0
? sql<number>`count
(DISTINCT ${this.tables[tableName].id})`
: sqlCount(),
}) })
.from(table) .from(table)
.where(where), .where(where),
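A rough sketch of the counting pattern introduced above, using a made-up `posts` table and connection for illustration: when the `where` clause needs no joined tables, a plain `count()` (i.e. `COUNT(*)`) is enough; when joins are chained, `COUNT(DISTINCT id)` avoids double-counting rows duplicated by the joins.

```ts
import { count as sqlCount, eq, sql } from 'drizzle-orm'
import { integer, pgTable, varchar } from 'drizzle-orm/pg-core'
import { drizzle } from 'drizzle-orm/node-postgres'
import { Pool } from 'pg'

// Assumed example table and connection, for illustration only.
const posts = pgTable('posts', {
  id: integer('id').primaryKey(),
  title: varchar('title'),
})
const db = drizzle(new Pool({ connectionString: process.env.DATABASE_URL }))

const hasJoins: boolean = false // true when the where clause requires joined tables

const countQuery = db
  .select({
    count: hasJoins
      ? sql<number>`count(DISTINCT ${posts.id})` // joins can duplicate rows
      : sqlCount(), // plain COUNT(*) when querying a single table
  })
  .from(posts)
  .where(eq(posts.title, 'hello'))
```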

View File

@@ -1,5 +1,5 @@
/* eslint-disable no-restricted-syntax, no-await-in-loop */ /* eslint-disable no-restricted-syntax, no-await-in-loop */
import type { DrizzleSnapshotJSON } from 'drizzle-kit/payload' import type { DrizzleSnapshotJSON } from 'drizzle-kit/api'
import type { CreateMigration } from 'payload/database' import type { CreateMigration } from 'payload/database'
import fs from 'fs' import fs from 'fs'
@@ -43,12 +43,13 @@ const getDefaultDrizzleSnapshot = (): DrizzleSnapshotJSON => ({
schemas: {}, schemas: {},
tables: {}, tables: {},
}, },
dialect: 'pg', dialect: 'postgresql',
enums: {}, enums: {},
prevId: '00000000-0000-0000-0000-00000000000', prevId: '00000000-0000-0000-0000-00000000000',
schemas: {}, schemas: {},
sequences: {},
tables: {}, tables: {},
version: '5', version: '7',
}) })
export const createMigration: CreateMigration = async function createMigration( export const createMigration: CreateMigration = async function createMigration(
@@ -60,7 +61,7 @@ export const createMigration: CreateMigration = async function createMigration(
fs.mkdirSync(dir) fs.mkdirSync(dir)
} }
const { generateDrizzleJson, generateMigration } = require('drizzle-kit/payload') const { generateDrizzleJson, generateMigration, upPgSnapshot } = require('drizzle-kit/api')
const [yyymmdd, hhmmss] = new Date().toISOString().split('T') const [yyymmdd, hhmmss] = new Date().toISOString().split('T')
const formattedDate = yyymmdd.replace(/\D/g, '') const formattedDate = yyymmdd.replace(/\D/g, '')
@@ -76,6 +77,12 @@ export const createMigration: CreateMigration = async function createMigration(
let drizzleJsonBefore = getDefaultDrizzleSnapshot() let drizzleJsonBefore = getDefaultDrizzleSnapshot()
if (this.schemaName) {
drizzleJsonBefore.schemas = {
[this.schemaName]: this.schemaName,
}
}
// Get latest migration snapshot // Get latest migration snapshot
const latestSnapshot = fs const latestSnapshot = fs
.readdirSync(dir) .readdirSync(dir)
@@ -92,6 +99,11 @@ export const createMigration: CreateMigration = async function createMigration(
} }
const drizzleJsonAfter = generateDrizzleJson(this.schema) const drizzleJsonAfter = generateDrizzleJson(this.schema)
if (drizzleJsonBefore.version < drizzleJsonAfter.version) {
drizzleJsonBefore = upPgSnapshot(drizzleJsonBefore)
}
const sqlStatementsUp = await generateMigration(drizzleJsonBefore, drizzleJsonAfter) const sqlStatementsUp = await generateMigration(drizzleJsonBefore, drizzleJsonAfter)
const sqlStatementsDown = await generateMigration(drizzleJsonAfter, drizzleJsonBefore) const sqlStatementsDown = await generateMigration(drizzleJsonAfter, drizzleJsonBefore)
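For reference, a sketch of the new default (empty) snapshot shape implied by the diff above. Snapshots written by earlier releases may still carry `dialect: 'pg'` / `version: '5'`, which is why `upPgSnapshot` from `drizzle-kit/api` is applied before the two snapshots are diffed.

```ts
// Fields shown in the diff above; anything else is omitted for brevity.
const defaultDrizzleSnapshot = {
  _meta: { columns: {}, schemas: {}, tables: {} },
  dialect: 'postgresql', // previously 'pg'
  enums: {},
  prevId: '00000000-0000-0000-0000-00000000000',
  schemas: {},
  sequences: {}, // new in the v7 snapshot format
  tables: {},
  version: '7', // previously '5'
}

// Older stored snapshots are upgraded before diffing (sketch, assuming drizzle-kit/api):
// if (storedSnapshot.version < defaultDrizzleSnapshot.version) {
//   storedSnapshot = upPgSnapshot(storedSnapshot)
// }
```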

View File

@@ -1,7 +1,7 @@
import type { FindArgs } from 'payload/database' import type { FindArgs } from 'payload/database'
import type { Field, PayloadRequest, TypeWithID } from 'payload/types' import type { Field, PayloadRequest, TypeWithID } from 'payload/types'
import { inArray, sql } from 'drizzle-orm' import { inArray, sql, count as sqlCount } from 'drizzle-orm'
import type { PostgresAdapter } from '../types' import type { PostgresAdapter } from '../types'
import type { ChainedMethods } from './chainMethods' import type { ChainedMethods } from './chainMethods'
@@ -143,8 +143,11 @@ export const findMany = async function find({
methods: selectCountMethods, methods: selectCountMethods,
query: db query: db
.select({ .select({
count: sql<number>`count count:
(DISTINCT ${adapter.tables[tableName].id})`, selectCountMethods.length > 0
? sql<number>`count
(DISTINCT ${adapter.tables[tableName].id})`
: sqlCount(),
}) })
.from(table) .from(table)
.where(where), .where(where),

View File

@@ -14,11 +14,11 @@ export const init: Init = async function init(this: PostgresAdapter) {
if (this.schemaName) { if (this.schemaName) {
this.pgSchema = pgSchema(this.schemaName) this.pgSchema = pgSchema(this.schemaName)
} else { } else {
this.pgSchema = { table: pgTable } this.pgSchema = { enum: pgEnum, table: pgTable }
} }
if (this.payload.config.localization) { if (this.payload.config.localization) {
this.enums.enum__locales = pgEnum( this.enums.enum__locales = this.pgSchema.enum(
'_locales', '_locales',
this.payload.config.localization.locales.map(({ code }) => code) as [string, ...string[]], this.payload.config.localization.locales.map(({ code }) => code) as [string, ...string[]],
) )
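A compact sketch of what this change enables, assuming Drizzle's `pgSchema` / `pgEnum` APIs and illustrative names: with a custom `schemaName`, the locale enum (like every table) is created inside that schema instead of `public`.

```ts
import { pgEnum, pgSchema } from 'drizzle-orm/pg-core'

// 'payload' and the locale codes are illustrative values.
const locales = ['en', 'es', 'de'] as [string, ...string[]]

// With a schemaName, the enum is namespaced, e.g. "payload"."_locales":
const namespaced = pgSchema('payload')
const namespacedLocalesEnum = namespaced.enum('_locales', locales)

// Without one, the adapter falls back to the default (public) schema:
const publicLocalesEnum = pgEnum('_locales', locales)
```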

View File

@@ -27,7 +27,7 @@ export async function migrate(this: PostgresAdapter): Promise<void> {
let latestBatch = 0 let latestBatch = 0
let migrationsInDB = [] let migrationsInDB = []
const hasMigrationTable = await migrationTableExists(this.drizzle) const hasMigrationTable = await migrationTableExists(this)
if (hasMigrationTable) { if (hasMigrationTable) {
;({ docs: migrationsInDB } = await payload.find({ ;({ docs: migrationsInDB } = await payload.find({
@@ -80,7 +80,7 @@ export async function migrate(this: PostgresAdapter): Promise<void> {
} }
async function runMigrationFile(payload: Payload, migration: Migration, batch: number) { async function runMigrationFile(payload: Payload, migration: Migration, batch: number) {
const { generateDrizzleJson } = require('drizzle-kit/payload') const { generateDrizzleJson } = require('drizzle-kit/api')
const start = Date.now() const start = Date.now()
const req = { payload } as PayloadRequest const req = { payload } as PayloadRequest

View File

@@ -47,7 +47,7 @@ export async function migrateDown(this: PostgresAdapter): Promise<void> {
msg: `Migrated down: ${migrationFile.name} (${Date.now() - start}ms)`, msg: `Migrated down: ${migrationFile.name} (${Date.now() - start}ms)`,
}) })
const tableExists = await migrationTableExists(this.drizzle) const tableExists = await migrationTableExists(this)
if (tableExists) { if (tableExists) {
await payload.delete({ await payload.delete({
id: migration.id, id: migration.id,

View File

@@ -51,7 +51,7 @@ export async function migrateRefresh(this: PostgresAdapter) {
msg: `Migrated down: ${migration.name} (${Date.now() - start}ms)`, msg: `Migrated down: ${migration.name} (${Date.now() - start}ms)`,
}) })
const tableExists = await migrationTableExists(this.drizzle) const tableExists = await migrationTableExists(this)
if (tableExists) { if (tableExists) {
await payload.delete({ await payload.delete({
collection: 'payload-migrations', collection: 'payload-migrations',

View File

@@ -42,7 +42,7 @@ export async function migrateReset(this: PostgresAdapter): Promise<void> {
msg: `Migrated down: ${migrationFile.name} (${Date.now() - start}ms)`, msg: `Migrated down: ${migrationFile.name} (${Date.now() - start}ms)`,
}) })
const tableExists = await migrationTableExists(this.drizzle) const tableExists = await migrationTableExists(this)
if (tableExists) { if (tableExists) {
await payload.delete({ await payload.delete({
id: migration.id, id: migration.id,
@@ -68,7 +68,7 @@ export async function migrateReset(this: PostgresAdapter): Promise<void> {
// Delete dev migration // Delete dev migration
const tableExists = await migrationTableExists(this.drizzle) const tableExists = await migrationTableExists(this)
if (tableExists) { if (tableExists) {
try { try {
await payload.delete({ await payload.delete({

View File

@@ -14,7 +14,7 @@ export async function migrateStatus(this: PostgresAdapter): Promise<void> {
}) })
let existingMigrations = [] let existingMigrations = []
const hasMigrationTable = await migrationTableExists(this.drizzle) const hasMigrationTable = await migrationTableExists(this)
if (hasMigrationTable) { if (hasMigrationTable) {
;({ existingMigrations } = await getMigrations({ payload })) ;({ existingMigrations } = await getMigrations({ payload }))

View File

@@ -3,6 +3,7 @@ import type { SQL } from 'drizzle-orm'
import type { Field, Operator, Where } from 'payload/types' import type { Field, Operator, Where } from 'payload/types'
import { and, ilike, isNotNull, isNull, ne, notInArray, or, sql } from 'drizzle-orm' import { and, ilike, isNotNull, isNull, ne, notInArray, or, sql } from 'drizzle-orm'
import { PgUUID } from 'drizzle-orm/pg-core'
import { QueryError } from 'payload/errors' import { QueryError } from 'payload/errors'
import { validOperators } from 'payload/types' import { validOperators } from 'payload/types'
@@ -174,6 +175,7 @@ export async function parseParams({
const sanitizedQueryValue = sanitizeQueryValue({ const sanitizedQueryValue = sanitizeQueryValue({
adapter, adapter,
field, field,
isUUID: table?.[columnName] instanceof PgUUID,
operator, operator,
relationOrPath, relationOrPath,
val, val,

View File

@@ -1,12 +1,14 @@
import { APIError } from 'payload/errors' import { APIError } from 'payload/errors'
import { type Field, type TabAsField, fieldAffectsData } from 'payload/types' import { type Field, type TabAsField, fieldAffectsData } from 'payload/types'
import { createArrayFromCommaDelineated } from 'payload/utilities' import { createArrayFromCommaDelineated } from 'payload/utilities'
import { validate as uuidValidate } from 'uuid'
import type { PostgresAdapter } from '../types' import type { PostgresAdapter } from '../types'
type SanitizeQueryValueArgs = { type SanitizeQueryValueArgs = {
adapter: PostgresAdapter adapter: PostgresAdapter
field: Field | TabAsField field: Field | TabAsField
isUUID: boolean
operator: string operator: string
relationOrPath: string relationOrPath: string
val: any val: any
@@ -15,6 +17,7 @@ type SanitizeQueryValueArgs = {
export const sanitizeQueryValue = ({ export const sanitizeQueryValue = ({
adapter, adapter,
field, field,
isUUID,
operator: operatorArg, operator: operatorArg,
relationOrPath, relationOrPath,
val, val,
@@ -64,6 +67,16 @@ export const sanitizeQueryValue = ({
if (field.type === 'number' && typeof formattedValue === 'string') { if (field.type === 'number' && typeof formattedValue === 'string') {
formattedValue = Number(val) formattedValue = Number(val)
if (Number.isNaN(formattedValue)) {
formattedValue = null
}
}
if (isUUID && typeof formattedValue === 'string') {
if (!uuidValidate(val)) {
formattedValue = null
}
} }
if (field.type === 'date' && operator !== 'exists') { if (field.type === 'date' && operator !== 'exists') {
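The intent of the two new guards above can be sketched as follows, assuming the `uuid` package's `validate` export: non-numeric strings and malformed UUIDs are neutralized to `null` before querying, instead of being passed to Postgres where they would fail the numeric or `::uuid` cast.

```ts
import { validate as uuidValidate } from 'uuid'

// Illustrative helpers only, mirroring the sanitization logic above.
const sanitizeNumber = (val: string): null | number => {
  const parsed = Number(val)
  return Number.isNaN(parsed) ? null : parsed // 'abc' -> null instead of NaN
}

const sanitizeUUID = (val: string): null | string =>
  uuidValidate(val) ? val : null // 'not-a-uuid' -> null instead of a failed cast

console.log(sanitizeNumber('42')) // 42
console.log(sanitizeUUID('not-a-uuid')) // null
```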

View File

@@ -7,7 +7,7 @@ import type {
PgTableWithColumns, PgTableWithColumns,
UniqueConstraintBuilder, UniqueConstraintBuilder,
} from 'drizzle-orm/pg-core' } from 'drizzle-orm/pg-core'
import { Field, fieldAffectsData } from 'payload/types' import type { Field } from 'payload/types'
import { relations } from 'drizzle-orm' import { relations } from 'drizzle-orm'
import { import {
@@ -20,10 +20,12 @@ import {
unique, unique,
varchar, varchar,
} from 'drizzle-orm/pg-core' } from 'drizzle-orm/pg-core'
import { fieldAffectsData } from 'payload/types'
import toSnakeCase from 'to-snake-case' import toSnakeCase from 'to-snake-case'
import type { GenericColumns, GenericTable, IDType, PostgresAdapter } from '../types' import type { GenericColumns, GenericTable, IDType, PostgresAdapter } from '../types'
import { createIndex } from './createIndex'
import { createTableName } from './createTableName' import { createTableName } from './createTableName'
import { parentIDColumnMap } from './parentIDColumnMap' import { parentIDColumnMap } from './parentIDColumnMap'
import { setColumnID } from './setColumnID' import { setColumnID } from './setColumnID'
@@ -51,9 +53,17 @@ type Args = {
tableName: string tableName: string
timestamps?: boolean timestamps?: boolean
versions: boolean versions: boolean
/**
* Tracks whether or not this table is built
* from the result of a localized array or block field at some point
*/
withinLocalizedArrayOrBlock?: boolean
} }
type Result = { type Result = {
hasLocalizedManyNumberField: boolean
hasLocalizedManyTextField: boolean
hasLocalizedRelationshipField: boolean
hasManyNumberField: 'index' | boolean hasManyNumberField: 'index' | boolean
hasManyTextField: 'index' | boolean hasManyTextField: 'index' | boolean
relationsToBuild: Map<string, string> relationsToBuild: Map<string, string>
@@ -76,6 +86,7 @@ export const buildTable = ({
tableName, tableName,
timestamps, timestamps,
versions, versions,
withinLocalizedArrayOrBlock,
}: Args): Result => { }: Args): Result => {
const rootTableName = incomingRootTableName || tableName const rootTableName = incomingRootTableName || tableName
const columns: Record<string, PgColumnBuilder> = baseColumns const columns: Record<string, PgColumnBuilder> = baseColumns
@@ -124,6 +135,7 @@ export const buildTable = ({
rootTableIDColType: rootTableIDColType || idColType, rootTableIDColType: rootTableIDColType || idColType,
rootTableName, rootTableName,
versions, versions,
withinLocalizedArrayOrBlock,
}) })
if (timestamps) { if (timestamps) {
@@ -328,16 +340,28 @@ export const buildTable = ({
if (relatedCollectionCustomIDType === 'number') colType = 'numeric' if (relatedCollectionCustomIDType === 'number') colType = 'numeric'
if (relatedCollectionCustomIDType === 'text') colType = 'varchar' if (relatedCollectionCustomIDType === 'text') colType = 'varchar'
relationshipColumns[`${relationTo}ID`] = parentIDColumnMap[colType]( const colName = `${relationTo}ID`
`${formattedRelationTo}_id`,
) relationshipColumns[colName] = parentIDColumnMap[colType](`${formattedRelationTo}_id`)
relationExtraConfig[`${relationTo}IdFk`] = (cols) => relationExtraConfig[`${relationTo}IdFk`] = (cols) =>
foreignKey({ foreignKey({
name: `${relationshipsTableName}_${toSnakeCase(relationTo)}_fk`, name: `${relationshipsTableName}_${toSnakeCase(relationTo)}_fk`,
columns: [cols[`${relationTo}ID`]], columns: [cols[colName]],
foreignColumns: [adapter.tables[formattedRelationTo].id], foreignColumns: [adapter.tables[formattedRelationTo].id],
}).onDelete('cascade') }).onDelete('cascade')
const indexName = [colName]
if (hasLocalizedRelationshipField) {
indexName.push('locale')
}
relationExtraConfig[`${relationTo}IdIdx`] = createIndex({
name: indexName,
columnName: `${formattedRelationTo}_id`,
tableName: relationshipsTableName,
})
}) })
relationshipsTable = adapter.pgSchema.table( relationshipsTable = adapter.pgSchema.table(
@@ -431,5 +455,12 @@ export const buildTable = ({
adapter.relations[`relations_${tableName}`] = tableRelations adapter.relations[`relations_${tableName}`] = tableRelations
return { hasManyNumberField, hasManyTextField, relationsToBuild } return {
hasLocalizedManyNumberField,
hasLocalizedManyTextField,
hasLocalizedRelationshipField,
hasManyNumberField,
hasManyTextField,
relationsToBuild,
}
} }
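For reference, a sketch of roughly what the new relationship index amounts to in plain Drizzle terms; the `pages_rels` / `posts_id` names are assumed for illustration, and a localized relationship would additionally include the `locale` column in the index name.

```ts
import { index, integer, pgTable, serial, varchar } from 'drizzle-orm/pg-core'

// Assumed example: a pages_rels table relating pages to posts.
export const pagesRels = pgTable(
  'pages_rels',
  {
    id: serial('id').primaryKey(),
    parent: integer('parent_id').notNull(),
    path: varchar('path').notNull(),
    postsID: integer('posts_id'),
  },
  (cols) => ({
    // roughly what the adapter now emits for every relationship column
    postsIdIdx: index('pages_rels_posts_id_idx').on(cols.postsID),
  }),
)
```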

View File

@@ -14,7 +14,6 @@ import {
integer, integer,
jsonb, jsonb,
numeric, numeric,
pgEnum,
text, text,
timestamp, timestamp,
varchar, varchar,
@@ -57,6 +56,11 @@ type Args = {
rootTableIDColType: string rootTableIDColType: string
rootTableName: string rootTableName: string
versions: boolean versions: boolean
/**
* Tracks whether or not this table is built
* from the result of a localized array or block field at some point
*/
withinLocalizedArrayOrBlock?: boolean
} }
type Result = { type Result = {
@@ -91,6 +95,7 @@ export const traverseFields = ({
rootTableIDColType, rootTableIDColType,
rootTableName, rootTableName,
versions, versions,
withinLocalizedArrayOrBlock,
}: Args): Result => { }: Args): Result => {
const throwValidationError = true const throwValidationError = true
let hasLocalizedField = false let hasLocalizedField = false
@@ -152,7 +157,11 @@ export const traverseFields = ({
switch (field.type) { switch (field.type) {
case 'text': { case 'text': {
if (field.hasMany) { if (field.hasMany) {
if (field.localized) { const isLocalized =
Boolean(field.localized && adapter.payload.config.localization) ||
withinLocalizedArrayOrBlock
if (isLocalized) {
hasLocalizedManyTextField = true hasLocalizedManyTextField = true
} }
@@ -181,7 +190,12 @@ export const traverseFields = ({
case 'number': { case 'number': {
if (field.hasMany) { if (field.hasMany) {
if (field.localized) { const isLocalized =
Boolean(field.localized && adapter.payload.config.localization) ||
withinLocalizedArrayOrBlock ||
forceLocalized
if (isLocalized) {
hasLocalizedManyNumberField = true hasLocalizedManyNumberField = true
} }
@@ -232,7 +246,7 @@ export const traverseFields = ({
throwValidationError, throwValidationError,
}) })
adapter.enums[enumName] = pgEnum( adapter.enums[enumName] = adapter.pgSchema.enum(
enumName, enumName,
field.options.map((option) => { field.options.map((option) => {
if (optionIsObject(option)) { if (optionIsObject(option)) {
@@ -268,7 +282,12 @@ export const traverseFields = ({
parentIdx: (cols) => index(`${selectTableName}_parent_idx`).on(cols.parent), parentIdx: (cols) => index(`${selectTableName}_parent_idx`).on(cols.parent),
} }
if (field.localized) { const isLocalized =
Boolean(field.localized && adapter.payload.config.localization) ||
withinLocalizedArrayOrBlock ||
forceLocalized
if (isLocalized) {
baseColumns.locale = adapter.enums.enum__locales('locale').notNull() baseColumns.locale = adapter.enums.enum__locales('locale').notNull()
baseExtraConfig.localeIdx = (cols) => baseExtraConfig.localeIdx = (cols) =>
index(`${selectTableName}_locale_idx`).on(cols.locale) index(`${selectTableName}_locale_idx`).on(cols.locale)
@@ -342,13 +361,21 @@ export const traverseFields = ({
_parentIDIdx: (cols) => index(`${arrayTableName}_parent_id_idx`).on(cols._parentID), _parentIDIdx: (cols) => index(`${arrayTableName}_parent_id_idx`).on(cols._parentID),
} }
if (field.localized && adapter.payload.config.localization) { const isLocalized =
Boolean(field.localized && adapter.payload.config.localization) ||
withinLocalizedArrayOrBlock ||
forceLocalized
if (isLocalized) {
baseColumns._locale = adapter.enums.enum__locales('_locale').notNull() baseColumns._locale = adapter.enums.enum__locales('_locale').notNull()
baseExtraConfig._localeIdx = (cols) => baseExtraConfig._localeIdx = (cols) =>
index(`${arrayTableName}_locale_idx`).on(cols._locale) index(`${arrayTableName}_locale_idx`).on(cols._locale)
} }
const { const {
hasLocalizedManyNumberField: subHasLocalizedManyNumberField,
hasLocalizedManyTextField: subHasLocalizedManyTextField,
hasLocalizedRelationshipField: subHasLocalizedRelationshipField,
hasManyNumberField: subHasManyNumberField, hasManyNumberField: subHasManyNumberField,
hasManyTextField: subHasManyTextField, hasManyTextField: subHasManyTextField,
relationsToBuild: subRelationsToBuild, relationsToBuild: subRelationsToBuild,
@@ -365,8 +392,21 @@ export const traverseFields = ({
rootTableName, rootTableName,
tableName: arrayTableName, tableName: arrayTableName,
versions, versions,
withinLocalizedArrayOrBlock: isLocalized,
}) })
if (subHasLocalizedManyNumberField) {
hasLocalizedManyNumberField = subHasLocalizedManyNumberField
}
if (subHasLocalizedRelationshipField) {
hasLocalizedRelationshipField = subHasLocalizedRelationshipField
}
if (subHasLocalizedManyTextField) {
hasLocalizedManyTextField = subHasLocalizedManyTextField
}
if (subHasManyTextField) { if (subHasManyTextField) {
if (!hasManyTextField || subHasManyTextField === 'index') if (!hasManyTextField || subHasManyTextField === 'index')
hasManyTextField = subHasManyTextField hasManyTextField = subHasManyTextField
@@ -433,13 +473,21 @@ export const traverseFields = ({
_pathIdx: (cols) => index(`${blockTableName}_path_idx`).on(cols._path), _pathIdx: (cols) => index(`${blockTableName}_path_idx`).on(cols._path),
} }
if (field.localized && adapter.payload.config.localization) { const isLocalized =
Boolean(field.localized && adapter.payload.config.localization) ||
withinLocalizedArrayOrBlock ||
forceLocalized
if (isLocalized) {
baseColumns._locale = adapter.enums.enum__locales('_locale').notNull() baseColumns._locale = adapter.enums.enum__locales('_locale').notNull()
baseExtraConfig._localeIdx = (cols) => baseExtraConfig._localeIdx = (cols) =>
index(`${blockTableName}_locale_idx`).on(cols._locale) index(`${blockTableName}_locale_idx`).on(cols._locale)
} }
const { const {
hasLocalizedManyNumberField: subHasLocalizedManyNumberField,
hasLocalizedManyTextField: subHasLocalizedManyTextField,
hasLocalizedRelationshipField: subHasLocalizedRelationshipField,
hasManyNumberField: subHasManyNumberField, hasManyNumberField: subHasManyNumberField,
hasManyTextField: subHasManyTextField, hasManyTextField: subHasManyTextField,
relationsToBuild: subRelationsToBuild, relationsToBuild: subRelationsToBuild,
@@ -456,8 +504,21 @@ export const traverseFields = ({
rootTableName, rootTableName,
tableName: blockTableName, tableName: blockTableName,
versions, versions,
withinLocalizedArrayOrBlock: isLocalized,
}) })
if (subHasLocalizedManyNumberField) {
hasLocalizedManyNumberField = subHasLocalizedManyNumberField
}
if (subHasLocalizedRelationshipField) {
hasLocalizedRelationshipField = subHasLocalizedRelationshipField
}
if (subHasLocalizedManyTextField) {
hasLocalizedManyTextField = subHasLocalizedManyTextField
}
if (subHasManyTextField) { if (subHasManyTextField) {
if (!hasManyTextField || subHasManyTextField === 'index') if (!hasManyTextField || subHasManyTextField === 'index')
hasManyTextField = subHasManyTextField hasManyTextField = subHasManyTextField
@@ -541,6 +602,7 @@ export const traverseFields = ({
rootTableIDColType, rootTableIDColType,
rootTableName, rootTableName,
versions, versions,
withinLocalizedArrayOrBlock,
}) })
if (groupHasLocalizedField) hasLocalizedField = true if (groupHasLocalizedField) hasLocalizedField = true
@@ -584,6 +646,7 @@ export const traverseFields = ({
rootTableIDColType, rootTableIDColType,
rootTableName, rootTableName,
versions, versions,
withinLocalizedArrayOrBlock: withinLocalizedArrayOrBlock || field.localized,
}) })
if (groupHasLocalizedField) hasLocalizedField = true if (groupHasLocalizedField) hasLocalizedField = true
@@ -628,6 +691,7 @@ export const traverseFields = ({
rootTableIDColType, rootTableIDColType,
rootTableName, rootTableName,
versions, versions,
withinLocalizedArrayOrBlock,
}) })
if (tabHasLocalizedField) hasLocalizedField = true if (tabHasLocalizedField) hasLocalizedField = true
@@ -672,6 +736,7 @@ export const traverseFields = ({
rootTableIDColType, rootTableIDColType,
rootTableName, rootTableName,
versions, versions,
withinLocalizedArrayOrBlock,
}) })
if (rowHasLocalizedField) hasLocalizedField = true if (rowHasLocalizedField) hasLocalizedField = true
@@ -691,7 +756,10 @@ export const traverseFields = ({
relationships.add(field.relationTo) relationships.add(field.relationTo)
} }
if (field.localized && adapter.payload.config.localization) { if (
Boolean(field.localized && adapter.payload.config.localization) ||
withinLocalizedArrayOrBlock
) {
hasLocalizedRelationshipField = true hasLocalizedRelationshipField = true
} }
break break

View File

@@ -1,4 +1,3 @@
/* eslint-disable no-param-reassign */
import type { NumberField } from 'payload/types' import type { NumberField } from 'payload/types'
type Args = { type Args = {
@@ -6,10 +5,29 @@ type Args = {
locale?: string locale?: string
numberRows: Record<string, unknown>[] numberRows: Record<string, unknown>[]
ref: Record<string, unknown> ref: Record<string, unknown>
withinArrayOrBlockLocale?: string
} }
export const transformHasManyNumber = ({ field, locale, numberRows, ref }: Args) => { export const transformHasManyNumber = ({
const result = numberRows.map(({ number }) => number) field,
locale,
numberRows,
ref,
withinArrayOrBlockLocale,
}: Args) => {
let result: unknown[]
if (withinArrayOrBlockLocale) {
result = numberRows.reduce((acc, { locale, number }) => {
if (locale === withinArrayOrBlockLocale) {
acc.push(number)
}
return acc
}, [])
} else {
result = numberRows.map(({ number }) => number)
}
if (locale) { if (locale) {
ref[field.name][locale] = result ref[field.name][locale] = result

View File

@@ -1,4 +1,3 @@
/* eslint-disable no-param-reassign */
import type { TextField } from 'payload/types' import type { TextField } from 'payload/types'
type Args = { type Args = {
@@ -6,10 +5,29 @@ type Args = {
locale?: string locale?: string
ref: Record<string, unknown> ref: Record<string, unknown>
textRows: Record<string, unknown>[] textRows: Record<string, unknown>[]
withinArrayOrBlockLocale?: string
} }
export const transformHasManyText = ({ field, locale, ref, textRows }: Args) => { export const transformHasManyText = ({
const result = textRows.map(({ text }) => text) field,
locale,
ref,
textRows,
withinArrayOrBlockLocale,
}: Args) => {
let result: unknown[]
if (withinArrayOrBlockLocale) {
result = textRows.reduce((acc, { locale, text }) => {
if (locale === withinArrayOrBlockLocale) {
acc.push(text)
}
return acc
}, [])
} else {
result = textRows.map(({ text }) => text)
}
if (locale) { if (locale) {
ref[field.name][locale] = result ref[field.name][locale] = result

View File

@@ -6,21 +6,31 @@ type Args = {
locale?: string locale?: string
ref: Record<string, unknown> ref: Record<string, unknown>
relations: Record<string, unknown>[] relations: Record<string, unknown>[]
withinArrayOrBlockLocale?: string
} }
export const transformRelationship = ({ field, locale, ref, relations }: Args) => { export const transformRelationship = ({
field,
locale,
ref,
relations,
withinArrayOrBlockLocale,
}: Args) => {
let result: unknown let result: unknown
if (!('hasMany' in field) || field.hasMany === false) { if (!('hasMany' in field) || field.hasMany === false) {
const relation = relations[0] let relation = relations[0]
if (withinArrayOrBlockLocale) {
relation = relations.find((rel) => rel.locale === withinArrayOrBlockLocale)
}
if (relation) { if (relation) {
// Handle hasOne Poly // Handle hasOne Poly
if (Array.isArray(field.relationTo)) { if (Array.isArray(field.relationTo)) {
const matchedRelation = Object.entries(relation).find( const matchedRelation = Object.entries(relation).find(([key, val]) => {
([key, val]) => return val !== null && !['id', 'locale', 'order', 'parent', 'path'].includes(key)
val !== null && !['id', 'locale', 'order', 'parent', 'path'].includes(key), })
)
if (matchedRelation) { if (matchedRelation) {
const relationTo = matchedRelation[0].replace('ID', '') const relationTo = matchedRelation[0].replace('ID', '')
@@ -40,18 +50,26 @@ export const transformRelationship = ({ field, locale, ref, relations }: Args) =
const transformedRelations = [] const transformedRelations = []
relations.forEach((relation) => { relations.forEach((relation) => {
let matchedLocale = true
if (withinArrayOrBlockLocale) {
matchedLocale = relation.locale === withinArrayOrBlockLocale
}
// Handle hasMany // Handle hasMany
if (!Array.isArray(field.relationTo)) { if (!Array.isArray(field.relationTo)) {
const relatedData = relation[`${field.relationTo}ID`] const relatedData = relation[`${field.relationTo}ID`]
if (relatedData) { if (relatedData && matchedLocale) {
transformedRelations.push(relatedData) transformedRelations.push(relatedData)
} }
} else { } else {
// Handle hasMany Poly // Handle hasMany Poly
const matchedRelation = Object.entries(relation).find( const matchedRelation = Object.entries(relation).find(
([key, val]) => ([key, val]) =>
val !== null && !['id', 'locale', 'order', 'parent', 'path'].includes(key), val !== null &&
!['id', 'locale', 'order', 'parent', 'path'].includes(key) &&
matchedLocale,
) )
if (matchedRelation) { if (matchedRelation) {

View File

@@ -55,6 +55,10 @@ type TraverseFieldsArgs = {
* All hasMany text fields, as returned by Drizzle, keyed on an object by field path * All hasMany text fields, as returned by Drizzle, keyed on an object by field path
*/ */
texts: Record<string, Record<string, unknown>[]> texts: Record<string, Record<string, unknown>[]>
/**
* Set to a locale if this group of fields is within a localized array or block.
*/
withinArrayOrBlockLocale?: string
} }
// Traverse fields recursively, transforming data // Traverse fields recursively, transforming data
@@ -71,6 +75,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
relationships, relationships,
table, table,
texts, texts,
withinArrayOrBlockLocale,
}: TraverseFieldsArgs): T => { }: TraverseFieldsArgs): T => {
const sanitizedPath = path ? `${path}.` : path const sanitizedPath = path ? `${path}.` : path
@@ -88,6 +93,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
relationships, relationships,
table, table,
texts, texts,
withinArrayOrBlockLocale,
}) })
} }
@@ -108,6 +114,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
relationships, relationships,
table, table,
texts, texts,
withinArrayOrBlockLocale,
}) })
} }
@@ -145,6 +152,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
relationships, relationships,
table: row, table: row,
texts, texts,
withinArrayOrBlockLocale: locale,
}) })
if ('_order' in rowResult) { if ('_order' in rowResult) {
@@ -157,7 +165,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
return arrayResult return arrayResult
}, {}) }, {})
} else { } else {
result[field.name] = fieldData.map((row, i) => { result[field.name] = fieldData.reduce((acc, row, i) => {
if (row._uuid) { if (row._uuid) {
row.id = row._uuid row.id = row._uuid
delete row._uuid delete row._uuid
@@ -167,34 +175,48 @@ export const traverseFields = <T extends Record<string, unknown>>({
delete row._order delete row._order
} }
return traverseFields<T>({ if (
blocks, !withinArrayOrBlockLocale ||
config, (withinArrayOrBlockLocale && withinArrayOrBlockLocale === row._locale)
dataRef: row, ) {
deletions, if (row._locale) {
fieldPrefix: '', delete row._locale
fields: field.fields, }
numbers,
path: `${sanitizedPath}${field.name}.${i}`, acc.push(
relationships, traverseFields<T>({
table: row, blocks,
texts, config,
}) dataRef: row,
}) deletions,
fieldPrefix: '',
fields: field.fields,
numbers,
path: `${sanitizedPath}${field.name}.${i}`,
relationships,
table: row,
texts,
withinArrayOrBlockLocale,
}),
)
}
return acc
}, [])
} }
} }
return result return result
} }
if (field.type === 'blocks') { if (field.type === 'blocks') {
const blockFieldPath = `${sanitizedPath}${field.name}` const blockFieldPath = `${sanitizedPath}${field.name}`
const blocksByPath = blocks[blockFieldPath]
if (Array.isArray(blocks[blockFieldPath])) { if (Array.isArray(blocksByPath)) {
if (field.localized) { if (field.localized) {
result[field.name] = {} result[field.name] = {}
blocks[blockFieldPath].forEach((row) => { blocksByPath.forEach((row) => {
if (row._uuid) { if (row._uuid) {
row.id = row._uuid row.id = row._uuid
delete row._uuid delete row._uuid
@@ -223,6 +245,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
relationships, relationships,
table: row, table: row,
texts, texts,
withinArrayOrBlockLocale: locale,
}) })
delete blockResult._order delete blockResult._order
@@ -233,7 +256,23 @@ export const traverseFields = <T extends Record<string, unknown>>({
}) })
}) })
} else { } else {
result[field.name] = blocks[blockFieldPath].map((row, i) => { // Add locale-specific index to have a proper blockFieldPath for current locale
// because blocks can be in the same array for different locales!
if (withinArrayOrBlockLocale && config.localization) {
for (const locale of config.localization.localeCodes) {
let localeIndex = 0
for (let i = 0; i < blocksByPath.length; i++) {
const row = blocksByPath[i]
if (row._locale === locale) {
row._index = localeIndex
localeIndex++
}
}
}
}
result[field.name] = blocksByPath.reduce((acc, row, i) => {
delete row._order delete row._order
if (row._uuid) { if (row._uuid) {
row.id = row._uuid row.id = row._uuid
@@ -242,23 +281,43 @@ export const traverseFields = <T extends Record<string, unknown>>({
const block = field.blocks.find(({ slug }) => slug === row.blockType) const block = field.blocks.find(({ slug }) => slug === row.blockType)
if (block) { if (block) {
return traverseFields<T>({ if (
blocks, !withinArrayOrBlockLocale ||
config, (withinArrayOrBlockLocale && withinArrayOrBlockLocale === row._locale)
dataRef: row, ) {
deletions, if (row._locale) {
fieldPrefix: '', delete row._locale
fields: block.fields, }
numbers, if (typeof row._index === 'number') {
path: `${blockFieldPath}.${i}`, i = row._index
relationships, delete row._index
table: row, }
texts,
}) acc.push(
traverseFields<T>({
blocks,
config,
dataRef: row,
deletions,
fieldPrefix: '',
fields: block.fields,
numbers,
path: `${blockFieldPath}.${i}`,
relationships,
table: row,
texts,
withinArrayOrBlockLocale,
}),
)
return acc
}
} else {
acc.push({})
} }
return {} return acc
}) }, [])
} }
} }
@@ -305,6 +364,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
field, field,
ref: result, ref: result,
relations: relationPathMatch, relations: relationPathMatch,
withinArrayOrBlockLocale,
}) })
} }
@@ -339,6 +399,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
field, field,
ref: result, ref: result,
textRows: textPathMatch, textRows: textPathMatch,
withinArrayOrBlockLocale,
}) })
} }
@@ -373,6 +434,7 @@ export const traverseFields = <T extends Record<string, unknown>>({
field, field,
numberRows: numberPathMatch, numberRows: numberPathMatch,
ref: result, ref: result,
withinArrayOrBlockLocale,
}) })
} }
@@ -391,7 +453,11 @@ export const traverseFields = <T extends Record<string, unknown>>({
return selectResult return selectResult
}, {}) }, {})
} else { } else {
result[field.name] = fieldData.map(({ value }) => value) let selectData = fieldData
if (withinArrayOrBlockLocale) {
selectData = selectData.filter(({ locale }) => locale === withinArrayOrBlockLocale)
}
result[field.name] = selectData.map(({ value }) => value)
} }
} }
return result return result
@@ -404,8 +470,20 @@ export const traverseFields = <T extends Record<string, unknown>>({
}[] = [] }[] = []
if (field.localized && Array.isArray(table._locales)) { if (field.localized && Array.isArray(table._locales)) {
if (!table._locales.length && config.localization) {
config.localization.localeCodes.forEach((_locale) =>
(table._locales as unknown[]).push({ _locale }),
)
}
table._locales.forEach((localeRow) => { table._locales.forEach((localeRow) => {
valuesToTransform.push({ ref: localizedFieldData, table: localeRow }) valuesToTransform.push({
ref: localizedFieldData,
table: {
...table,
...localeRow,
},
})
}) })
} else { } else {
valuesToTransform.push({ ref: result, table }) valuesToTransform.push({ ref: result, table })
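As a small sketch of the merge above (row shapes are made up): each `_locales` row is spread over its parent row so non-localized sibling columns stay visible while localized columns come from the matching locale row, and when a document has no `_locales` rows yet, one empty row per configured locale is seeded so localized output still gets built.

```ts
// Illustrative shapes only.
const table = { id: 1, slug: 'home', _locales: [] as Record<string, unknown>[] }
const localeCodes = ['en', 'es']

if (!table._locales.length) {
  localeCodes.forEach((_locale) => table._locales.push({ _locale }))
}

const valuesToTransform = table._locales.map((localeRow) => ({
  ref: {} as Record<string, unknown>,
  table: { ...table, ...localeRow }, // parent columns + this locale's columns
}))

console.log(valuesToTransform.map((v) => v.table._locale)) // ['en', 'es']
```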
@@ -419,50 +497,28 @@ export const traverseFields = <T extends Record<string, unknown>>({
case 'tab': case 'tab':
case 'group': { case 'group': {
const groupFieldPrefix = `${fieldPrefix || ''}${field.name}_` const groupFieldPrefix = `${fieldPrefix || ''}${field.name}_`
const groupData = {}
const locale = table._locale as string
const refKey = field.localized && locale ? locale : field.name
if (field.localized) { if (field.localized && locale) delete table._locale
if (typeof locale === 'string' && !ref[locale]) { ref[refKey] = traverseFields<Record<string, unknown>>({
ref[locale] = {} blocks,
delete table._locale config,
} dataRef: groupData as Record<string, unknown>,
deletions,
fieldPrefix: groupFieldPrefix,
fields: field.fields,
numbers,
path: `${sanitizedPath}${field.name}`,
relationships,
table,
texts,
withinArrayOrBlockLocale: locale || withinArrayOrBlockLocale,
})
Object.entries(ref).forEach(([groupLocale, groupLocaleData]) => { if ('_order' in ref) {
ref[groupLocale] = traverseFields<Record<string, unknown>>({ delete ref._order
blocks,
config,
dataRef: groupLocaleData as Record<string, unknown>,
deletions,
fieldPrefix: groupFieldPrefix,
fields: field.fields,
numbers,
path: `${sanitizedPath}${field.name}`,
relationships,
table,
texts,
})
})
if ('_order' in ref) {
delete ref._order
}
} else {
const groupData = {}
ref[field.name] = traverseFields<Record<string, unknown>>({
blocks,
config,
dataRef: groupData as Record<string, unknown>,
deletions,
fieldPrefix: groupFieldPrefix,
fields: field.fields,
numbers,
path: `${sanitizedPath}${field.name}`,
relationships,
table,
texts,
})
if ('_order' in ref) {
delete ref._order
}
} }
break break

View File

@@ -26,6 +26,11 @@ type Args = {
[tableName: string]: Record<string, unknown>[] [tableName: string]: Record<string, unknown>[]
} }
texts: Record<string, unknown>[] texts: Record<string, unknown>[]
/**
* Set to a locale code if this set of fields is traversed within a
* localized array or block field
*/
withinArrayOrBlockLocale?: string
} }
export const transformArray = ({ export const transformArray = ({
@@ -43,6 +48,7 @@ export const transformArray = ({
relationshipsToDelete, relationshipsToDelete,
selects, selects,
texts, texts,
withinArrayOrBlockLocale,
}: Args) => { }: Args) => {
const newRows: ArrayRowToInsert[] = [] const newRows: ArrayRowToInsert[] = []
@@ -78,6 +84,10 @@ export const transformArray = ({
newRow.row._locale = locale newRow.row._locale = locale
} }
if (withinArrayOrBlockLocale) {
newRow.row._locale = withinArrayOrBlockLocale
}
traverseFields({ traverseFields({
adapter, adapter,
arrays: newRow.arrays, arrays: newRow.arrays,
@@ -97,6 +107,7 @@ export const transformArray = ({
row: newRow.row, row: newRow.row,
selects, selects,
texts, texts,
withinArrayOrBlockLocale,
}) })
newRows.push(newRow) newRows.push(newRow)

View File

@@ -26,6 +26,11 @@ type Args = {
[tableName: string]: Record<string, unknown>[] [tableName: string]: Record<string, unknown>[]
} }
texts: Record<string, unknown>[] texts: Record<string, unknown>[]
/**
* Set to a locale code if this set of fields is traversed within a
* localized array or block field
*/
withinArrayOrBlockLocale?: string
} }
export const transformBlocks = ({ export const transformBlocks = ({
adapter, adapter,
@@ -41,6 +46,7 @@ export const transformBlocks = ({
relationshipsToDelete, relationshipsToDelete,
selects, selects,
texts, texts,
withinArrayOrBlockLocale,
}: Args) => { }: Args) => {
data.forEach((blockRow, i) => { data.forEach((blockRow, i) => {
if (typeof blockRow.blockType !== 'string') return if (typeof blockRow.blockType !== 'string') return
@@ -60,6 +66,7 @@ export const transformBlocks = ({
} }
if (field.localized && locale) newRow.row._locale = locale if (field.localized && locale) newRow.row._locale = locale
if (withinArrayOrBlockLocale) newRow.row._locale = withinArrayOrBlockLocale
const blockTableName = adapter.tableNameMap.get(`${baseTableName}_blocks_${blockType}`) const blockTableName = adapter.tableNameMap.get(`${baseTableName}_blocks_${blockType}`)
@@ -94,6 +101,7 @@ export const transformBlocks = ({
row: newRow.row, row: newRow.row,
selects, selects,
texts, texts,
withinArrayOrBlockLocale,
}) })
blocks[blockType].push(newRow) blocks[blockType].push(newRow)

View File

@@ -58,6 +58,11 @@ type Args = {
[tableName: string]: Record<string, unknown>[] [tableName: string]: Record<string, unknown>[]
} }
texts: Record<string, unknown>[] texts: Record<string, unknown>[]
/**
* Set to a locale code if this set of fields is traversed within a
* localized array or block field
*/
withinArrayOrBlockLocale?: string
} }
export const traverseFields = ({ export const traverseFields = ({
@@ -81,6 +86,7 @@ export const traverseFields = ({
row, row,
selects, selects,
texts, texts,
withinArrayOrBlockLocale,
}: Args) => { }: Args) => {
fields.forEach((field) => { fields.forEach((field) => {
let columnName = '' let columnName = ''
@@ -117,6 +123,7 @@ export const traverseFields = ({
relationshipsToDelete, relationshipsToDelete,
selects, selects,
texts, texts,
withinArrayOrBlockLocale: localeKey,
}) })
arrays[arrayTableName] = arrays[arrayTableName].concat(newRows) arrays[arrayTableName] = arrays[arrayTableName].concat(newRows)
@@ -138,6 +145,7 @@ export const traverseFields = ({
relationshipsToDelete, relationshipsToDelete,
selects, selects,
texts, texts,
withinArrayOrBlockLocale,
}) })
arrays[arrayTableName] = arrays[arrayTableName].concat(newRows) arrays[arrayTableName] = arrays[arrayTableName].concat(newRows)
@@ -169,6 +177,7 @@ export const traverseFields = ({
relationshipsToDelete, relationshipsToDelete,
selects, selects,
texts, texts,
withinArrayOrBlockLocale: localeKey,
}) })
} }
}) })
@@ -187,6 +196,7 @@ export const traverseFields = ({
relationshipsToDelete, relationshipsToDelete,
selects, selects,
texts, texts,
withinArrayOrBlockLocale,
}) })
} }
@@ -197,6 +207,9 @@ export const traverseFields = ({
if (typeof data[field.name] === 'object' && data[field.name] !== null) { if (typeof data[field.name] === 'object' && data[field.name] !== null) {
if (field.localized) { if (field.localized) {
Object.entries(data[field.name]).forEach(([localeKey, localeData]) => { Object.entries(data[field.name]).forEach(([localeKey, localeData]) => {
// preserve array ID if there is
localeData._uuid = data.id || data._uuid
traverseFields({ traverseFields({
adapter, adapter,
arrays, arrays,
@@ -218,9 +231,14 @@ export const traverseFields = ({
row, row,
selects, selects,
texts, texts,
withinArrayOrBlockLocale: localeKey,
}) })
}) })
} else { } else {
// preserve array ID if there is
const groupData = data[field.name] as Record<string, unknown>
groupData._uuid = data.id || data._uuid
traverseFields({ traverseFields({
adapter, adapter,
arrays, arrays,
@@ -228,7 +246,7 @@ export const traverseFields = ({
blocks, blocks,
blocksToDelete, blocksToDelete,
columnPrefix: `${columnName}_`, columnPrefix: `${columnName}_`,
data: data[field.name] as Record<string, unknown>, data: groupData,
existingLocales, existingLocales,
fieldPrefix: `${fieldName}_`, fieldPrefix: `${fieldName}_`,
fields: field.fields, fields: field.fields,
@@ -241,6 +259,7 @@ export const traverseFields = ({
row, row,
selects, selects,
texts, texts,
withinArrayOrBlockLocale,
}) })
} }
} }
@@ -254,6 +273,9 @@ export const traverseFields = ({
if (typeof data[tab.name] === 'object' && data[tab.name] !== null) { if (typeof data[tab.name] === 'object' && data[tab.name] !== null) {
if (tab.localized) { if (tab.localized) {
Object.entries(data[tab.name]).forEach(([localeKey, localeData]) => { Object.entries(data[tab.name]).forEach(([localeKey, localeData]) => {
// preserve array ID if there is
localeData._uuid = data.id || data._uuid
traverseFields({ traverseFields({
adapter, adapter,
arrays, arrays,
@@ -275,9 +297,14 @@ export const traverseFields = ({
row, row,
selects, selects,
texts, texts,
withinArrayOrBlockLocale: localeKey,
}) })
}) })
} else { } else {
const tabData = data[tab.name] as Record<string, unknown>
// preserve array ID if there is
tabData._uuid = data.id || data._uuid
traverseFields({ traverseFields({
adapter, adapter,
arrays, arrays,
@@ -285,7 +312,7 @@ export const traverseFields = ({
blocks, blocks,
blocksToDelete, blocksToDelete,
columnPrefix: `${columnPrefix || ''}${toSnakeCase(tab.name)}_`, columnPrefix: `${columnPrefix || ''}${toSnakeCase(tab.name)}_`,
data: data[tab.name] as Record<string, unknown>, data: tabData,
existingLocales, existingLocales,
fieldPrefix: `${fieldPrefix || ''}${tab.name}_`, fieldPrefix: `${fieldPrefix || ''}${tab.name}_`,
fields: tab.fields, fields: tab.fields,
@@ -298,6 +325,7 @@ export const traverseFields = ({
row, row,
selects, selects,
texts, texts,
withinArrayOrBlockLocale,
}) })
} }
} }
@@ -322,6 +350,7 @@ export const traverseFields = ({
row, row,
selects, selects,
texts, texts,
withinArrayOrBlockLocale,
}) })
} }
}) })
@@ -339,6 +368,7 @@ export const traverseFields = ({
existingLocales, existingLocales,
fieldPrefix, fieldPrefix,
fields: field.fields, fields: field.fields,
forcedLocale,
locales, locales,
numbers, numbers,
parentTableName, parentTableName,
@@ -348,6 +378,7 @@ export const traverseFields = ({
row, row,
selects, selects,
texts, texts,
withinArrayOrBlockLocale,
}) })
} }
@@ -384,6 +415,7 @@ export const traverseFields = ({
transformRelationship({ transformRelationship({
baseRow: { baseRow: {
locale: withinArrayOrBlockLocale,
path: relationshipPath, path: relationshipPath,
}, },
data: fieldData, data: fieldData,
@@ -416,6 +448,7 @@ export const traverseFields = ({
} else if (Array.isArray(fieldData)) { } else if (Array.isArray(fieldData)) {
transformTexts({ transformTexts({
baseRow: { baseRow: {
locale: withinArrayOrBlockLocale,
path: textPath, path: textPath,
}, },
data: fieldData, data: fieldData,
@@ -447,6 +480,7 @@ export const traverseFields = ({
} else if (Array.isArray(fieldData)) { } else if (Array.isArray(fieldData)) {
transformNumbers({ transformNumbers({
baseRow: { baseRow: {
locale: withinArrayOrBlockLocale,
path: numberPath, path: numberPath,
}, },
data: fieldData, data: fieldData,
@@ -479,6 +513,7 @@ export const traverseFields = ({
const newRows = transformSelects({ const newRows = transformSelects({
id: data._uuid || data.id, id: data._uuid || data.id,
data: data[field.name], data: data[field.name],
locale: withinArrayOrBlockLocale,
}) })
selects[selectTableName] = selects[selectTableName].concat(newRows) selects[selectTableName] = selects[selectTableName].concat(newRows)

View File

@@ -14,6 +14,7 @@ import type {
PgTableWithColumns,
PgTransaction,
} from 'drizzle-orm/pg-core'
+ import type { pgEnum } from 'drizzle-orm/pg-core'
import type { PgTableFn } from 'drizzle-orm/pg-core/table'
import type { Payload } from 'payload'
import type { BaseDatabaseAdapter } from 'payload/database'
@@ -60,6 +61,13 @@ export type DrizzleTransaction = PgTransaction<
ExtractTablesWithRelations<Record<string, unknown>>
>
+ type Schema =
+   | {
+       enum: typeof pgEnum
+       table: PgTableFn
+     }
+   | PgSchema
export type PostgresAdapter = BaseDatabaseAdapter & {
drizzle: DrizzleDB
enums: Record<string, GenericEnum>
@@ -71,13 +79,13 @@ export type PostgresAdapter = BaseDatabaseAdapter & {
idType: Args['idType']
localesSuffix?: string
logger: DrizzleConfig['logger']
- pgSchema?: { table: PgTableFn } | PgSchema
+ pgSchema?: Schema
pool: Pool
poolOptions: Args['pool']
push: boolean
relations: Record<string, GenericRelation>
relationshipsSuffix?: string
- schema: Record<string, GenericEnum | GenericRelation | GenericTable>
+ schema: Record<string, unknown>
schemaName?: Args['schemaName']
sessions: {
[id: string]: {
@@ -116,7 +124,7 @@ declare module 'payload' {
push: boolean
relations: Record<string, GenericRelation>
relationshipsSuffix?: string
- schema: Record<string, GenericEnum | GenericRelation | GenericTable>
+ schema: Record<string, unknown>
sessions: {
[id: string]: {
db: DrizzleTransaction

View File

@@ -1,11 +1,18 @@
import { sql } from 'drizzle-orm'
- import type { DrizzleDB } from '../types'
+ import type { PostgresAdapter } from '../types.js'
- export const migrationTableExists = async (db: DrizzleDB): Promise<boolean> => {
-   const queryRes = await db.execute(sql`SELECT to_regclass('public.payload_migrations');`)
-   // Returns table name 'payload_migrations' or null
-   const exists = queryRes.rows?.[0]?.to_regclass === 'payload_migrations'
-   return exists
+ export const migrationTableExists = async (adapter: PostgresAdapter): Promise<boolean> => {
+   let statement
+   if (adapter.name === 'postgres') {
+     const prependSchema = adapter.schemaName ? `"${adapter.schemaName}".` : ''
+     statement = `SELECT to_regclass('${prependSchema}"payload_migrations"') AS exists;`
+   }
+   const result = await adapter.drizzle.execute(sql.raw(statement))
+   const [row] = result.rows
+   return row && typeof row === 'object' && 'exists' in row && !!row.exists
}
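
As a quick illustration of the statement built above (the schema name is hypothetical): with `schemaName` set, the check is schema-qualified, otherwise it targets the default search path, and `to_regclass` yields a non-null identifier only when the table exists.

```ts
// Hypothetical examples of the generated statement — not part of the diff itself.
const withSchema = `SELECT to_regclass('"my_schema"."payload_migrations"') AS exists;`
const withoutSchema = `SELECT to_regclass('"payload_migrations"') AS exists;`

// Either way, the adapter only checks truthiness of the aliased column:
// `row.exists` is a non-null relation identifier when the table exists, null otherwise.
```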

View File

@@ -0,0 +1,76 @@
import { sql } from 'drizzle-orm'
import prompts from 'prompts'
import type { PostgresAdapter } from '../types.js'
import { requireDrizzleKit } from './requireDrizzleKit'
/**
* Pushes the development schema to the database using Drizzle.
*
* @param {PostgresAdapter} adapter - The PostgresAdapter instance connected to the database.
* @returns {Promise<void>} - A promise that resolves once the schema push is complete.
*/
export const pushDevSchema = async (adapter: PostgresAdapter) => {
const { pushSchema } = requireDrizzleKit()
// This will prompt if clarifications are needed for Drizzle to push new schema
const { apply, hasDataLoss, warnings } = await pushSchema(
adapter.schema,
adapter.drizzle,
adapter.schemaName ? [adapter.schemaName] : undefined,
)
if (warnings.length) {
let message = `Warnings detected during schema push: \n\n${warnings.join('\n')}\n\n`
if (hasDataLoss) {
message += `DATA LOSS WARNING: Possible data loss detected if schema is pushed.\n\n`
}
message += `Accept warnings and push schema to database?`
const { confirm: acceptWarnings } = await prompts(
{
name: 'confirm',
type: 'confirm',
initial: false,
message,
},
{
onCancel: () => {
process.exit(0)
},
},
)
// Exit if user does not accept warnings.
// Q: Is this the right type of exit for this interaction?
if (!acceptWarnings) {
process.exit(0)
}
}
await apply()
const migrationsTable = adapter.schemaName
? `"${adapter.schemaName}"."payload_migrations"`
: '"payload_migrations"'
const { drizzle } = adapter
const result = await drizzle.execute(
sql.raw(`SELECT * FROM ${migrationsTable} WHERE batch = '-1'`),
)
const devPush = result.rows
if (!devPush.length) {
await drizzle.execute(
sql.raw(`INSERT INTO ${migrationsTable} (name, batch) VALUES ('dev', '-1')`),
)
} else {
await drizzle.execute(
sql.raw(`UPDATE ${migrationsTable} SET updated_at = CURRENT_TIMESTAMP WHERE batch = '-1'`),
)
}
}
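
A minimal sketch of how this helper might be wired up, assuming a dev-only call site (the wrapper below is hypothetical and not part of this diff); `push` and `drizzle` come from the `PostgresAdapter` type shown earlier.

```ts
import type { PostgresAdapter } from '../types'

import { pushDevSchema } from './pushDevSchema' // illustrative relative path

// Hypothetical dev-only call site: skip migrations and push the in-memory schema instead.
export const syncDevSchema = async (adapter: PostgresAdapter): Promise<void> => {
  const isDev = process.env.NODE_ENV !== 'production'

  if (isDev && adapter.push) {
    await pushDevSchema(adapter)
  }
}
```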

View File

@@ -0,0 +1,12 @@
import type { PostgresAdapter } from '../types'
type RequireDrizzleKit = () => {
generateDrizzleJson: (args: { schema: Record<string, unknown> }) => unknown
pushSchema: (
schema: Record<string, unknown>,
drizzle: PostgresAdapter['drizzle'],
filterSchema?: string[],
) => Promise<{ apply; hasDataLoss; warnings }>
}
export const requireDrizzleKit: RequireDrizzleKit = () => require('drizzle-kit/api')

View File

@@ -32,7 +32,7 @@
"eslint-plugin-perfectionist": "2.0.0", "eslint-plugin-perfectionist": "2.0.0",
"eslint-plugin-playwright": "0.16.0", "eslint-plugin-playwright": "0.16.0",
"eslint-plugin-react": "7.33.2", "eslint-plugin-react": "7.33.2",
"eslint-plugin-react-hooks": "4.6.0", "eslint-plugin-react-hooks": "4.6.2",
"eslint-plugin-regexp": "1.15.0" "eslint-plugin-regexp": "1.15.0"
}, },
"keywords": [] "keywords": []

View File

@@ -1,6 +1,6 @@
{
"name": "payload",
- "version": "2.29.0",
+ "version": "2.30.3",
"description": "Node, React and MongoDB Headless CMS and Application Framework",
"license": "MIT",
"main": "./dist/index.js",
@@ -55,13 +55,13 @@
"@date-io/date-fns": "2.16.0",
"@dnd-kit/core": "6.0.8",
"@dnd-kit/sortable": "7.0.2",
- "@faceless-ui/modal": "2.0.1",
+ "@faceless-ui/modal": "2.0.2",
"@faceless-ui/scroll-info": "1.3.0",
- "@faceless-ui/window-info": "2.1.1",
+ "@faceless-ui/window-info": "2.1.2",
"@monaco-editor/react": "4.5.1",
"@swc/core": "1.6.1",
"@swc/register": "0.1.10",
- "body-parser": "1.20.2",
+ "body-parser": "1.20.3",
"body-scroll-lock": "4.0.0-beta.0",
"bson-objectid": "2.0.4",
"compression": "1.7.4",
@@ -70,7 +70,7 @@
"console-table-printer": "2.11.2",
"dataloader": "2.2.2",
"date-fns": "2.30.0",
- "deep-equal": "2.2.2",
+ "deep-equal": "2.2.3",
"deepmerge": "4.3.1",
"dotenv": "8.6.0",
"express": "4.21.0",
@@ -97,14 +97,14 @@
"isomorphic-fetch": "3.0.0",
"joi": "17.9.2",
"json-schema-to-typescript": "14.0.5",
- "jsonwebtoken": "9.0.1",
+ "jsonwebtoken": "9.0.2",
"jwt-decode": "3.1.2",
"md5": "2.3.0",
"method-override": "3.0.0",
"minimist": "1.2.8",
"mkdirp": "1.0.4",
"monaco-editor": "0.38.0",
- "nodemailer": "6.9.8",
+ "nodemailer": "6.9.15",
"object-to-formdata": "4.5.1",
"passport": "0.6.0",
"passport-anonymous": "1.0.1",
@@ -132,11 +132,11 @@
"react-toastify": "10.0.5",
"sanitize-filename": "1.6.3",
"sass": "1.69.4",
- "scheduler": "0.23.0",
+ "scheduler": "0.23.2",
"scmp": "2.1.0",
"sharp": "0.32.6",
- "swc-loader": "0.2.3",
+ "swc-loader": "0.2.6",
- "terser-webpack-plugin": "5.3.9",
+ "terser-webpack-plugin": "5.3.10",
"ts-essentials": "7.0.3",
"use-context-selector": "1.4.1",
"uuid": "9.0.1"
@@ -145,7 +145,7 @@
"@payloadcms/eslint-config": "workspace:*",
"@release-it/conventional-changelog": "7.0.0",
"@types/asap": "2.0.0",
- "@types/body-parser": "1.19.2",
+ "@types/body-parser": "1.19.5",
"@types/body-scroll-lock": "^3.1.0",
"@types/compression": "1.7.2",
"@types/express": "4.17.17",
@@ -158,14 +158,14 @@
"@types/isomorphic-fetch": "0.0.36",
"@types/joi": "14.3.4",
"@types/json-schema": "7.0.12",
- "@types/jsonwebtoken": "8.5.9",
+ "@types/jsonwebtoken": "9.0.7",
"@types/method-override": "0.0.32",
"@types/mime": "2.0.3",
"@types/mini-css-extract-plugin": "^1.4.3",
"@types/minimist": "1.2.2",
"@types/mkdirp": "1.0.2",
"@types/node-fetch": "2.6.4",
- "@types/nodemailer": "6.4.14",
+ "@types/nodemailer": "6.4.16",
"@types/passport": "1.0.12",
"@types/passport-anonymous": "1.0.3",
"@types/passport-jwt": "3.0.9",
@@ -202,9 +202,9 @@
"rimraf": "4.4.1",
"sass-loader": "12.6.0",
"serve-static": "1.15.0",
- "swc-loader": "^0.2.3",
+ "swc-loader": "^0.2.6",
"terser": "5.19.2",
- "terser-webpack-plugin": "^5.3.6",
+ "terser-webpack-plugin": "^5.3.10",
"url-loader": "4.1.1",
"vite": "^4.4.9",
"webpack": "^5.78.0"

View File

@@ -0,0 +1,70 @@
import ObjectID from 'bson-objectid'
import { type BeforeDuplicate, type Field, tabHasName } from '../../../../exports/types'
/**
* Creates new IDs for blocks / arrays items to avoid errors with relational databases.
*/
export const baseBeforeDuplicate = (args: Parameters<BeforeDuplicate>[0]) => {
const {
collection: { fields },
data,
} = args
traverseFields(fields, data)
return data
}
function traverseFields(fields: Field[], data: unknown) {
if (typeof data === 'undefined' || data === null) return
fields.forEach((field) => {
switch (field.type) {
case 'array':
if (Array.isArray(data?.[field.name])) {
data[field.name].forEach((row) => {
if (!row) return
row.id = new ObjectID().toHexString()
traverseFields(field.fields, row)
})
}
break
case 'blocks': {
if (Array.isArray(data?.[field.name])) {
data[field.name].forEach((row) => {
if (!row) return
const configBlock = field.blocks.find((block) => block.slug === row.blockType)
if (!configBlock) return
row.id = new ObjectID().toHexString()
traverseFields(configBlock.fields, row)
})
}
break
}
case 'row':
case 'collapsible':
traverseFields(field.fields, data)
break
case 'tabs':
field.tabs.forEach((tab) => {
if (!tabHasName(tab)) {
traverseFields(tab.fields, data)
return
}
if (data && data[tab.name]) {
traverseFields(tab.fields, data[tab.name])
}
})
break
case 'group':
if (data && data[field.name]) {
traverseFields(field.fields, data[field.name])
}
break
default:
break
}
})
}
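
A small usage sketch of the helper above; the collection config and document are simplified placeholders (hence the cast), but they show the effect: every array and block row gets a freshly generated ObjectID before the duplicate is posted, so relational adapters never see a clashing row id.

```ts
import { baseBeforeDuplicate } from './baseBeforeDuplicate'

const duplicated = baseBeforeDuplicate({
  collection: {
    fields: [{ name: 'items', type: 'array', fields: [{ name: 'text', type: 'text' }] }],
  },
  data: {
    items: [{ id: '507f1f77bcf86cd799439011', text: 'hello' }],
  },
  locale: 'en',
} as any) // simplified args for illustration; the real call passes the sanitized collection config

// duplicated.items[0].id is now a newly generated 24-character hex string,
// not '507f1f77bcf86cd799439011'.
```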

View File

@@ -13,6 +13,7 @@ import MinimalTemplate from '../../templates/Minimal'
import { useConfig } from '../../utilities/Config'
import Button from '../Button'
import * as PopupList from '../Popup/PopupButtonList'
+ import { baseBeforeDuplicate } from './baseBeforeDuplicate'
import './index.scss'
const baseClass = 'duplicate'
@@ -65,6 +66,8 @@ const Duplicate: React.FC<Props> = ({ id, slug, collection }) => {
})
let data = await response.json()
+ data = baseBeforeDuplicate({ collection, data, locale })
if (typeof collection.admin.hooks?.beforeDuplicate === 'function') {
data = await collection.admin.hooks.beforeDuplicate({
collection,
@@ -73,6 +76,8 @@ const Duplicate: React.FC<Props> = ({ id, slug, collection }) => {
})
}
+ delete data['id']
if (!duplicateID) {
if ('createdAt' in data) delete data.createdAt
if ('updatedAt' in data) delete data.updatedAt

View File

@@ -23,11 +23,13 @@ export {
useListDrawer,
} from '../../admin/components/elements/ListDrawer'
+ export { useNav } from '../../admin/components/elements/Nav/context'
+ export { default as NavGroup } from '../../admin/components/elements/NavGroup'
export {
Description,
DescriptionComponent,
DescriptionFunction,
} from '../../admin/components/forms/FieldDescription/types'
- export { useNav } from '../../admin/components/elements/Nav/context'
+ export { toast } from 'react-toastify'
- export { default as NavGroup } from '../../admin/components/elements/NavGroup'

View File

@@ -228,23 +228,58 @@ export const generateFileData = async <T>({
withMetadata,
})
- filesToSave.push({
-   buffer: croppedImage,
-   path: `${staticPath}/${fsSafeName}`,
- })
+ // Apply resize after cropping to ensure it conforms to resizeOptions
+ if (resizeOptions) {
+   const resizedAfterCrop = await sharp(croppedImage)
+     .resize({
+       fit: resizeOptions?.fit || 'cover',
+       height: resizeOptions?.height,
+       position: resizeOptions?.position || 'center',
+       width: resizeOptions?.width,
+     })
+     .toBuffer({ resolveWithObject: true })
- fileForResize = {
-   ...file,
-   data: croppedImage,
-   size: info.size,
+   filesToSave.push({
+     buffer: resizedAfterCrop.data,
+     path: `${staticPath}/${fsSafeName}`,
+   })
+   fileForResize = {
+     ...fileForResize,
+     data: resizedAfterCrop.data,
+     size: resizedAfterCrop.info.size,
+   }
+   fileData.width = resizedAfterCrop.info.width
+   fileData.height = resizedAfterCrop.info.height
+   if (fileIsAnimatedType) {
+     const metadata = await sharpFile.metadata()
+     fileData.height = metadata.pages
+       ? resizedAfterCrop.info.height / metadata.pages
+       : resizedAfterCrop.info.height
+   }
+   fileData.filesize = resizedAfterCrop.info.size
+ } else {
+   // If resizeOptions is not present, just save the cropped image
+   filesToSave.push({
+     buffer: croppedImage,
+     path: `${staticPath}/${fsSafeName}`,
+   })
+   fileForResize = {
+     ...file,
+     data: croppedImage,
+     size: info.size,
+   }
+   fileData.width = info.width
+   fileData.height = info.height
+   if (fileIsAnimatedType) {
+     const metadata = await sharpFile.metadata()
+     fileData.height = metadata.pages ? info.height / metadata.pages : info.height
+   }
+   fileData.filesize = info.size
}
- fileData.width = info.width
- fileData.height = info.height
- if (fileIsAnimatedType) {
-   const metadata = await sharpFile.metadata()
-   fileData.height = metadata.pages ? info.height / metadata.pages : info.height
- }
- fileData.filesize = info.size
if (file.tempFilePath) {
await fs.promises.writeFile(file.tempFilePath, croppedImage) // write fileBuffer to the temp path
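
Read in isolation, the `resizeOptions` branch is just the sharp pipeline below. This is a minimal sketch with simplified types; `croppedImage` and `resizeOptions` stand in for the values used above, and the `'cover'`/`'center'` defaults mirror the new code.

```ts
import sharp from 'sharp'

type ResizeOpts = {
  width?: number
  height?: number
  fit?: 'contain' | 'cover' | 'fill' | 'inside' | 'outside'
  position?: number | string
}

// Resize an already-cropped buffer so the stored file conforms to resizeOptions.
const resizeAfterCrop = async (croppedImage: Buffer, resizeOptions: ResizeOpts) => {
  const { data, info } = await sharp(croppedImage)
    .resize({
      width: resizeOptions.width,
      height: resizeOptions.height,
      fit: resizeOptions.fit || 'cover',
      position: resizeOptions.position || 'center',
    })
    .toBuffer({ resolveWithObject: true }) // info carries the final width, height and byte size

  return { buffer: data, width: info.width, height: info.height, filesize: info.size }
}
```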

View File

@@ -191,31 +191,6 @@ const getImageResizeAction = ({
return hasFocalPoint ? 'resizeWithFocalPoint' : 'resize'
}
- /**
-  * Check if the image should be passed directly to sharp without payload adjusting properties.
-  *
-  * @param resizeConfig - object containing the requested dimensions and resize options
-  * @param original - the original image size
-  * @returns true if the image should passed directly to sharp
-  */
- const applyPayloadAdjustments = (
-   { fit, height, width, withoutEnlargement, withoutReduction }: ImageSize,
-   original: ProbedImageSize,
- ) => {
-   if (fit === 'contain' || fit === 'inside') return false
-   if (!isNumber(height) && !isNumber(width)) return false
-   const targetAspectRatio = width / height
-   const originalAspectRatio = original.width / original.height
-   if (originalAspectRatio === targetAspectRatio) return false
-   const skipEnlargement = withoutEnlargement && (original.height < height || original.width < width)
-   const skipReduction = withoutReduction && (original.height > height || original.width > width)
-   if (skipEnlargement || skipReduction) return false
-   return true
- }
/**
* Sanitize the resize config. If the resize config has the `withoutReduction`
* property set to true, the `fit` and `position` properties will be set to `contain`
@@ -302,6 +277,18 @@ export default async function resizeAndTransformImageSizes({
const sharpBase: Sharp | undefined = sharp(file.tempFilePath || file.data, sharpOptions).rotate() // pass rotate() to auto-rotate based on EXIF data. https://github.com/payloadcms/payload/pull/3081
const originalImageMeta = await sharpBase.metadata()
+ let adjustedDimensions = { ...dimensions }
+ // Images with an exif orientation of 5, 6, 7, or 8 are auto-rotated by sharp
+ // Need to adjust the dimensions to match the original image
+ if ([5, 6, 7, 8].includes(originalImageMeta.orientation)) {
+   adjustedDimensions = {
+     ...dimensions,
+     height: dimensions.width,
+     width: dimensions.height,
+   }
+ }
const resizeImageMeta = {
height: extractHeightFromImage(originalImageMeta),
width: originalImageMeta.width,
@@ -324,7 +311,7 @@ export default async function resizeAndTransformImageSizes({
if (resizeAction === 'resizeWithFocalPoint') {
let { height: resizeHeight, width: resizeWidth } = imageResizeConfig
- const originalAspectRatio = dimensions.width / dimensions.height
+ const originalAspectRatio = adjustedDimensions.width / adjustedDimensions.height
// Calculate resizeWidth based on original aspect ratio if it's undefined
if (resizeHeight && !resizeWidth) {
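
The orientation handling above can be summarised in isolation: EXIF orientations 5–8 encode a 90°/270° rotation, and since `.rotate()` makes sharp auto-orient the pixels, the externally probed width and height have to be swapped before the aspect ratio is computed. A minimal sketch, assuming `probed` holds the pre-rotation dimensions:

```ts
import sharp from 'sharp'

// Return dimensions that match what sharp will actually produce after auto-rotation.
const getUprightDimensions = async (
  file: Buffer,
  probed: { width: number; height: number },
): Promise<{ width: number; height: number }> => {
  const { orientation } = await sharp(file).metadata()

  // 5, 6, 7 and 8 are the EXIF orientations that rotate the image by 90 or 270 degrees.
  if (orientation && [5, 6, 7, 8].includes(orientation)) {
    return { width: probed.height, height: probed.width }
  }

  return probed
}
```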

View File

@@ -1,6 +1,6 @@
import type { Where } from '../../types'
- export const appendVersionToQueryKey = (query: Where): Where => {
+ export const appendVersionToQueryKey = (query: Where = {}): Where => {
return Object.entries(query).reduce((res, [key, val]) => {
if (['AND', 'OR', 'and', 'or'].includes(key) && Array.isArray(val)) {
return {

View File

@@ -13,5 +13,9 @@ export const getQueryDraftsSort = (sort: string): string => {
orderBy = sort.substring(1)
}
+ if (orderBy === 'id') {
+   return `${direction}parent`
+ }
return `${direction}version.${orderBy}`
}
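
Based on the code above (the import path is illustrative), the mapping now behaves roughly like this — sorting drafts by `id` is redirected to the versions table's `parent` column, which holds the original document's id, while every other field keeps the `version.` prefix:

```ts
import { getQueryDraftsSort } from './getQueryDraftsSort' // illustrative path

getQueryDraftsSort('-id') // => '-parent'
getQueryDraftsSort('title') // => 'version.title'
getQueryDraftsSort('-updatedAt') // => '-version.updatedAt'
```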

View File

@@ -4,7 +4,9 @@ import type { Payload } from '../payload'
import type { PayloadRequest } from '../types'
import type { TypeWithVersion } from './types'
+ import { combineQueries } from '../database/combineQueries'
import { docHasTimestamps } from '../types'
+ import { appendVersionToQueryKey } from './drafts/appendVersionToQueryKey'
type Args = {
config: SanitizedCollectionConfig
@@ -32,7 +34,7 @@ export const getLatestCollectionVersion = async <T extends TypeWithID = any>({
pagination: false,
req,
sort: '-updatedAt',
- where: { parent: { equals: id } },
+ where: combineQueries(appendVersionToQueryKey(query.where), { parent: { equals: id } }),
})
;[latestVersion] = docs
}
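
For orientation, a hedged sketch of the combined constraint. It assumes, as the helper's name and its AND/OR handling suggest, that `appendVersionToQueryKey` prefixes plain field keys with `version.`; the exact object `combineQueries` returns is not visible in this diff, so treat the shapes as indicative only.

```ts
import { combineQueries } from '../database/combineQueries'
import { appendVersionToQueryKey } from './drafts/appendVersionToQueryKey'

// Indicative only — key prefixing for non-AND/OR keys is inferred, not shown in the diff.
const versionWhere = appendVersionToQueryKey({ _status: { equals: 'published' } })
// ≈ { 'version._status': { equals: 'published' } }

// getLatestCollectionVersion then merges it with the parent constraint:
const where = combineQueries(versionWhere, { parent: { equals: '42' } })
```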

View File

@@ -96,11 +96,19 @@ From there, create the adapter, passing in all of its required properties:
```js
import { azureBlobStorageAdapter } from '@payloadcms/plugin-cloud-storage/azure'
+ // if you need to obtain credentials you may do so by following the instructions here: https://docs.microsoft.com/en-us/azure/storage/common/storage-auth-aad-app?tabs=javascript
+ // or you can use the connection string directly.
const adapter = azureBlobStorageAdapter({
  connectionString: process.env.AZURE_STORAGE_CONNECTION_STRING,
  containerName: process.env.AZURE_STORAGE_CONTAINER_NAME,
  allowContainerCreate: process.env.AZURE_STORAGE_ALLOW_CONTAINER_CREATE === 'true',
  baseURL: process.env.AZURE_STORAGE_ACCOUNT_BASEURL,
+ /**
+  * Optional: You may wish to obtain credentials that cannot be passed through in the connectionString connection option. In that case the connectionString will only be the URL to the storage account.
+  * Can be one of AnonymousCredential | StorageSharedKeyCredential | TokenCredential
+  **/
+ credential: new StorageSharedKeyCredential(process.env.AZURE_STORAGE_ACCOUNT_NAME, process.env.AZURE_STORAGE_ACCOUNT_KEY),
})
// Now you can pass this adapter to the plugin

View File

@@ -1,7 +1,7 @@
{
"name": "@payloadcms/plugin-cloud-storage",
"description": "The official cloud storage plugin for Payload CMS",
- "version": "1.1.3",
+ "version": "1.2.0",
"main": "dist/index.js",
"types": "dist/index.d.ts",
"license": "MIT",
@@ -53,6 +53,7 @@
"@aws-sdk/client-s3": "^3.142.0",
"@aws-sdk/lib-storage": "^3.267.0",
"@azure/storage-blob": "^12.11.0",
+ "@azure/core-http": "^3.0.0",
"@google-cloud/storage": "^6.4.1",
"@types/express": "^4.17.9",
"@types/find-node-modules": "^2.1.2",

View File

@@ -1,4 +1,9 @@
- import type { ContainerClient } from '@azure/storage-blob'
+ import type { TokenCredential } from '@azure/core-http'
+ import type {
+   AnonymousCredential,
+   ContainerClient,
+   StorageSharedKeyCredential,
+ } from '@azure/storage-blob'
import { BlobServiceClient } from '@azure/storage-blob'
@@ -15,6 +20,7 @@ export interface Args {
baseURL: string
connectionString: string
containerName: string
+ credential?: AnonymousCredential | StorageSharedKeyCredential | TokenCredential
}
export const azureBlobStorageAdapter = ({
@@ -22,11 +28,14 @@ export const azureBlobStorageAdapter = ({
baseURL,
connectionString,
containerName,
+ credential,
}: Args): Adapter => {
let storageClient: ContainerClient | null = null
const getStorageClient = () => {
if (storageClient) return storageClient
- const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString)
+ const blobServiceClient = credential
+   ? new BlobServiceClient(connectionString, credential)
+   : BlobServiceClient.fromConnectionString(connectionString)
return (storageClient = blobServiceClient.getContainerClient(containerName))
}

View File

@@ -28,12 +28,12 @@
"@aws-sdk/credential-providers": "^3.289.0", "@aws-sdk/credential-providers": "^3.289.0",
"@aws-sdk/lib-storage": "^3.267.0", "@aws-sdk/lib-storage": "^3.267.0",
"amazon-cognito-identity-js": "^6.1.2", "amazon-cognito-identity-js": "^6.1.2",
"nodemailer": "6.9.9" "nodemailer": "6.9.15"
}, },
"devDependencies": { "devDependencies": {
"@types/express": "^4.17.9", "@types/express": "^4.17.9",
"@types/jest": "^29.5.1", "@types/jest": "^29.5.1",
"@types/nodemailer": "6.4.14", "@types/nodemailer": "6.4.16",
"payload": "workspace:*", "payload": "workspace:*",
"ts-jest": "^29.1.0", "ts-jest": "^29.1.0",
"webpack": "^5.78.0" "webpack": "^5.78.0"

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/richtext-lexical",
- "version": "0.11.3",
+ "version": "0.11.4",
"description": "The officially supported Lexical richtext adapter for Payload",
"repository": {
"type": "git",
@@ -22,7 +22,7 @@
"prepublishOnly": "pnpm clean && pnpm build"
},
"dependencies": {
- "@faceless-ui/modal": "2.0.1",
+ "@faceless-ui/modal": "2.0.2",
"@lexical/headless": "0.13.1",
"@lexical/link": "0.13.1",
"@lexical/list": "0.13.1",
@@ -39,7 +39,7 @@
"json-schema": "^0.4.0",
"lexical": "0.13.1",
"lodash": "4.17.21",
- "react-error-boundary": "4.0.12",
+ "react-error-boundary": "4.0.13",
"react-i18next": "11.18.6",
"ts-essentials": "7.0.3"
},

View File

@@ -118,13 +118,14 @@ export const LinkFeature = (props: LinkFeatureProps): FeatureProvider => {
})
const rel: string = node.fields.newTab ? ' rel="noopener noreferrer"' : ''
+ const target: string = node.fields.newTab ? ' target="_blank"' : ''
const href: string =
  node.fields.linkType === 'custom'
    ? node.fields.url
    : (node.fields.doc?.value as string)
- return `<a href="${href}"${rel}>${childrenText}</a>`
+ return `<a href="${href}"${target}${rel}>${childrenText}</a>`
},
nodeTypes: [LinkNode.getType()],
} as HTMLConverter<SerializedLinkNode>,
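
Concretely, with `newTab` checked the converter now emits both attributes (the URL and text below are illustrative):

```ts
// Output of the template literal above for a custom link with newTab: true
const html = '<a href="https://example.com" target="_blank" rel="noopener noreferrer">Example</a>'
```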

View File

@@ -21,7 +21,7 @@
"prepublishOnly": "pnpm clean && pnpm build" "prepublishOnly": "pnpm clean && pnpm build"
}, },
"dependencies": { "dependencies": {
"@faceless-ui/modal": "2.0.1", "@faceless-ui/modal": "2.0.2",
"i18next": "22.5.1", "i18next": "22.5.1",
"is-hotkey": "0.2.0", "is-hotkey": "0.2.0",
"react-i18next": "11.18.6", "react-i18next": "11.18.6",

pnpm-lock.yaml (generated, 1437 changed lines) — file diff suppressed because it is too large.

View File

@@ -177,7 +177,7 @@ If you are migrating an existing site or moving content to a new URL, you can us
## Website
- This template includes a beautifully designed, production-ready front-end built with the [Next.js App Router](https://nextjs.org), served right alongside your Payload app in a single Express server. This makes is so that you can deploy both apps simultaneously and host them together. If you prefer a different front-end framework, this pattern works for any framework that supports a custom server. If you prefer to host your website separately from Payload, you can easily [Eject](#eject) the front-end out from this template to swap in your own, or to use it as a standalone CMS. For more details, see the official [Custom Server Example](https://github.com/payloadcms/payload/tree/main/examples/custom-server).
+ This template includes a beautifully designed, production-ready front-end built with the [Next.js App Router](https://nextjs.org), served right alongside your Payload app in a single Express server. This makes it so that you can deploy both apps simultaneously and host them together. If you prefer a different front-end framework, this pattern works for any framework that supports a custom server. If you prefer to host your website separately from Payload, you can easily [Eject](#eject) the front-end out from this template to swap in your own, or to use it as a standalone CMS. For more details, see the official [Custom Server Example](https://github.com/payloadcms/payload/tree/main/examples/custom-server).
Core features:

View File

@@ -877,6 +877,20 @@ describe('collections-rest', () => {
expect(result.totalDocs).toEqual(1)
})
+ it('like - id should not crash', async () => {
+   await createPost({ title: 'post' })
+   const response = await client.find({
+     query: {
+       id: {
+         like: 'words partial',
+       },
+     },
+   })
+   expect(response.status).toEqual(200)
+ })
it('exists - true', async () => {
const postWithDesc = await createPost({ description: 'exists' })
await createPost({ description: undefined })

View File

@@ -1 +1,4 @@
migrations
+ v5_migrations/*
+ !v5_migrations/20241018_162142_test_v5.ts
+ !v5_migrations/20241018_162142_test_v5.json

View File

@@ -59,7 +59,7 @@ describe('database', () => {
})
afterAll(() => {
- removeFiles(path.normalize(payload.db.migrationDir))
+ removeFiles(path.normalize(payload.db.migrationDir), (name) => !name.includes('test_v5'))
})
it('should run migrate:create', async () => {
@@ -74,6 +74,32 @@
expect(migrationFile).toContain('_test')
})
+ it('should run migrate:create with older drizzle version schema', async () => {
+   const db = payload.db as unknown as PostgresAdapter
+   // eslint-disable-next-line jest/no-if
+   if (db.name !== 'postgres') return
+   // eslint-disable-next-line jest/no-if
+   if (db.schemaName && db.schemaName !== 'public') {
+     return
+   }
+   const args = {
+     _: ['migrate:create', 'test'],
+     forceAcceptWarning: true,
+   }
+   const ogMigrationDir = payload.db.migrationDir
+   payload.db.migrationDir = path.resolve(__dirname, 'v5_migrations')
+   await migrate(args)
+   // read files names in migrationsDir
+   const migrationFile = path.normalize(fs.readdirSync(payload.db.migrationDir)[2])
+   expect(migrationFile).toContain('_test')
+   removeFiles(path.normalize(payload.db.migrationDir), (name) => !name.includes('test_v5'))
+   payload.db.migrationDir = ogMigrationDir
+ })
it('should run migrate', async () => {
const args = {
_: ['migrate'],

File diff suppressed because it is too large.

View File

@@ -0,0 +1,476 @@
import { MigrateUpArgs, MigrateDownArgs } from '@payloadcms/db-postgres'
import { sql } from 'drizzle-orm'
export async function up({ payload }: MigrateUpArgs): Promise<void> {
await payload.db.drizzle.execute(sql`
DO $$ BEGIN
CREATE TYPE "_locales" AS ENUM('en', 'es');
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
CREATE TYPE "selectEnum" AS ENUM('a', 'b', 'c');
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
CREATE TYPE "radioEnum" AS ENUM('a', 'b', 'c');
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
CREATE TYPE "enum_customs_status" AS ENUM('draft', 'published');
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
CREATE TYPE "enum__customs_v_version_status" AS ENUM('draft', 'published');
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
CREATE TABLE IF NOT EXISTS "posts" (
"id" serial PRIMARY KEY NOT NULL,
"title" varchar NOT NULL,
"throw_after_change" boolean,
"updated_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"created_at" timestamp(3) with time zone DEFAULT now() NOT NULL
);
CREATE TABLE IF NOT EXISTS "relation_a" (
"id" serial PRIMARY KEY NOT NULL,
"rich_text" jsonb,
"updated_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"created_at" timestamp(3) with time zone DEFAULT now() NOT NULL
);
CREATE TABLE IF NOT EXISTS "relation_a_rels" (
"id" serial PRIMARY KEY NOT NULL,
"order" integer,
"parent_id" integer NOT NULL,
"path" varchar NOT NULL,
"relation_b_id" integer
);
CREATE TABLE IF NOT EXISTS "relation_b" (
"id" serial PRIMARY KEY NOT NULL,
"rich_text" jsonb,
"updated_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"created_at" timestamp(3) with time zone DEFAULT now() NOT NULL
);
CREATE TABLE IF NOT EXISTS "relation_b_rels" (
"id" serial PRIMARY KEY NOT NULL,
"order" integer,
"parent_id" integer NOT NULL,
"path" varchar NOT NULL,
"relation_a_id" integer
);
CREATE TABLE IF NOT EXISTS "customs_customSelect" (
"order" integer NOT NULL,
"parent_id" integer NOT NULL,
"value" "selectEnum",
"id" serial PRIMARY KEY NOT NULL
);
CREATE TABLE IF NOT EXISTS "customArrays" (
"_order" integer NOT NULL,
"_parent_id" integer NOT NULL,
"id" varchar PRIMARY KEY NOT NULL,
"text" varchar
);
CREATE TABLE IF NOT EXISTS "customArrays_locales" (
"localized_text" varchar,
"id" serial PRIMARY KEY NOT NULL,
"_locale" "_locales" NOT NULL,
"_parent_id" varchar NOT NULL,
CONSTRAINT "customArrays_locales_locale_parent_id_unique" UNIQUE("_locale","_parent_id")
);
CREATE TABLE IF NOT EXISTS "customBlocks" (
"_order" integer NOT NULL,
"_parent_id" integer NOT NULL,
"_path" text NOT NULL,
"id" varchar PRIMARY KEY NOT NULL,
"text" varchar,
"block_name" varchar
);
CREATE TABLE IF NOT EXISTS "customBlocks_locales" (
"localized_text" varchar,
"id" serial PRIMARY KEY NOT NULL,
"_locale" "_locales" NOT NULL,
"_parent_id" varchar NOT NULL,
CONSTRAINT "customBlocks_locales_locale_parent_id_unique" UNIQUE("_locale","_parent_id")
);
CREATE TABLE IF NOT EXISTS "customs" (
"id" serial PRIMARY KEY NOT NULL,
"text" varchar,
"radio" "radioEnum",
"updated_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"created_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"_status" "enum_customs_status"
);
CREATE TABLE IF NOT EXISTS "customs_locales" (
"localized_text" varchar,
"id" serial PRIMARY KEY NOT NULL,
"_locale" "_locales" NOT NULL,
"_parent_id" integer NOT NULL,
CONSTRAINT "customs_locales_locale_parent_id_unique" UNIQUE("_locale","_parent_id")
);
CREATE TABLE IF NOT EXISTS "customs_rels" (
"id" serial PRIMARY KEY NOT NULL,
"order" integer,
"parent_id" integer NOT NULL,
"path" varchar NOT NULL,
"relation_a_id" integer
);
CREATE TABLE IF NOT EXISTS "_customs_v_version_customSelect" (
"order" integer NOT NULL,
"parent_id" integer NOT NULL,
"value" "selectEnum",
"id" serial PRIMARY KEY NOT NULL
);
CREATE TABLE IF NOT EXISTS "_customArrays_v" (
"_order" integer NOT NULL,
"_parent_id" integer NOT NULL,
"id" serial PRIMARY KEY NOT NULL,
"text" varchar,
"_uuid" varchar
);
CREATE TABLE IF NOT EXISTS "_customArrays_v_locales" (
"localized_text" varchar,
"id" serial PRIMARY KEY NOT NULL,
"_locale" "_locales" NOT NULL,
"_parent_id" integer NOT NULL,
CONSTRAINT "_customArrays_v_locales_locale_parent_id_unique" UNIQUE("_locale","_parent_id")
);
CREATE TABLE IF NOT EXISTS "_customBlocks_v" (
"_order" integer NOT NULL,
"_parent_id" integer NOT NULL,
"_path" text NOT NULL,
"id" serial PRIMARY KEY NOT NULL,
"text" varchar,
"_uuid" varchar,
"block_name" varchar
);
CREATE TABLE IF NOT EXISTS "_customBlocks_v_locales" (
"localized_text" varchar,
"id" serial PRIMARY KEY NOT NULL,
"_locale" "_locales" NOT NULL,
"_parent_id" integer NOT NULL,
CONSTRAINT "_customBlocks_v_locales_locale_parent_id_unique" UNIQUE("_locale","_parent_id")
);
CREATE TABLE IF NOT EXISTS "_customs_v" (
"id" serial PRIMARY KEY NOT NULL,
"version_text" varchar,
"version_radio" "radioEnum",
"version_updated_at" timestamp(3) with time zone,
"version_created_at" timestamp(3) with time zone,
"version__status" "enum__customs_v_version_status",
"created_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"latest" boolean
);
CREATE TABLE IF NOT EXISTS "_customs_v_locales" (
"version_localized_text" varchar,
"id" serial PRIMARY KEY NOT NULL,
"_locale" "_locales" NOT NULL,
"_parent_id" integer NOT NULL,
CONSTRAINT "_customs_v_locales_locale_parent_id_unique" UNIQUE("_locale","_parent_id")
);
CREATE TABLE IF NOT EXISTS "_customs_v_rels" (
"id" serial PRIMARY KEY NOT NULL,
"order" integer,
"parent_id" integer NOT NULL,
"path" varchar NOT NULL,
"customs_id" integer,
"relation_a_id" integer
);
CREATE TABLE IF NOT EXISTS "users" (
"id" serial PRIMARY KEY NOT NULL,
"updated_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"created_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"email" varchar NOT NULL,
"reset_password_token" varchar,
"reset_password_expiration" timestamp(3) with time zone,
"salt" varchar,
"hash" varchar,
"login_attempts" numeric,
"lock_until" timestamp(3) with time zone
);
CREATE TABLE IF NOT EXISTS "payload_preferences" (
"id" serial PRIMARY KEY NOT NULL,
"key" varchar,
"value" jsonb,
"updated_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"created_at" timestamp(3) with time zone DEFAULT now() NOT NULL
);
CREATE TABLE IF NOT EXISTS "payload_preferences_rels" (
"id" serial PRIMARY KEY NOT NULL,
"order" integer,
"parent_id" integer NOT NULL,
"path" varchar NOT NULL,
"users_id" integer
);
CREATE TABLE IF NOT EXISTS "payload_migrations" (
"id" serial PRIMARY KEY NOT NULL,
"name" varchar,
"batch" numeric,
"updated_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"created_at" timestamp(3) with time zone DEFAULT now() NOT NULL
);
CREATE TABLE IF NOT EXISTS "customGlobal" (
"id" serial PRIMARY KEY NOT NULL,
"text" varchar,
"updated_at" timestamp(3) with time zone,
"created_at" timestamp(3) with time zone
);
CREATE TABLE IF NOT EXISTS "_customGlobal_v" (
"id" serial PRIMARY KEY NOT NULL,
"version_text" varchar,
"version_updated_at" timestamp(3) with time zone,
"version_created_at" timestamp(3) with time zone,
"created_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp(3) with time zone DEFAULT now() NOT NULL
);
CREATE INDEX IF NOT EXISTS "posts_created_at_idx" ON "posts" ("created_at");
CREATE INDEX IF NOT EXISTS "relation_a_created_at_idx" ON "relation_a" ("created_at");
CREATE INDEX IF NOT EXISTS "relation_a_rels_order_idx" ON "relation_a_rels" ("order");
CREATE INDEX IF NOT EXISTS "relation_a_rels_parent_idx" ON "relation_a_rels" ("parent_id");
CREATE INDEX IF NOT EXISTS "relation_a_rels_path_idx" ON "relation_a_rels" ("path");
CREATE INDEX IF NOT EXISTS "relation_b_created_at_idx" ON "relation_b" ("created_at");
CREATE INDEX IF NOT EXISTS "relation_b_rels_order_idx" ON "relation_b_rels" ("order");
CREATE INDEX IF NOT EXISTS "relation_b_rels_parent_idx" ON "relation_b_rels" ("parent_id");
CREATE INDEX IF NOT EXISTS "relation_b_rels_path_idx" ON "relation_b_rels" ("path");
CREATE INDEX IF NOT EXISTS "customs_customSelect_order_idx" ON "customs_customSelect" ("order");
CREATE INDEX IF NOT EXISTS "customs_customSelect_parent_idx" ON "customs_customSelect" ("parent_id");
CREATE INDEX IF NOT EXISTS "customArrays_order_idx" ON "customArrays" ("_order");
CREATE INDEX IF NOT EXISTS "customArrays_parent_id_idx" ON "customArrays" ("_parent_id");
CREATE INDEX IF NOT EXISTS "customBlocks_order_idx" ON "customBlocks" ("_order");
CREATE INDEX IF NOT EXISTS "customBlocks_parent_id_idx" ON "customBlocks" ("_parent_id");
CREATE INDEX IF NOT EXISTS "customBlocks_path_idx" ON "customBlocks" ("_path");
CREATE INDEX IF NOT EXISTS "customs_created_at_idx" ON "customs" ("created_at");
CREATE INDEX IF NOT EXISTS "customs__status_idx" ON "customs" ("_status");
CREATE INDEX IF NOT EXISTS "customs_rels_order_idx" ON "customs_rels" ("order");
CREATE INDEX IF NOT EXISTS "customs_rels_parent_idx" ON "customs_rels" ("parent_id");
CREATE INDEX IF NOT EXISTS "customs_rels_path_idx" ON "customs_rels" ("path");
CREATE INDEX IF NOT EXISTS "_customs_v_version_customSelect_order_idx" ON "_customs_v_version_customSelect" ("order");
CREATE INDEX IF NOT EXISTS "_customs_v_version_customSelect_parent_idx" ON "_customs_v_version_customSelect" ("parent_id");
CREATE INDEX IF NOT EXISTS "_customArrays_v_order_idx" ON "_customArrays_v" ("_order");
CREATE INDEX IF NOT EXISTS "_customArrays_v_parent_id_idx" ON "_customArrays_v" ("_parent_id");
CREATE INDEX IF NOT EXISTS "_customBlocks_v_order_idx" ON "_customBlocks_v" ("_order");
CREATE INDEX IF NOT EXISTS "_customBlocks_v_parent_id_idx" ON "_customBlocks_v" ("_parent_id");
CREATE INDEX IF NOT EXISTS "_customBlocks_v_path_idx" ON "_customBlocks_v" ("_path");
CREATE INDEX IF NOT EXISTS "_customs_v_version_version_created_at_idx" ON "_customs_v" ("version_created_at");
CREATE INDEX IF NOT EXISTS "_customs_v_version_version__status_idx" ON "_customs_v" ("version__status");
CREATE INDEX IF NOT EXISTS "_customs_v_created_at_idx" ON "_customs_v" ("created_at");
CREATE INDEX IF NOT EXISTS "_customs_v_updated_at_idx" ON "_customs_v" ("updated_at");
CREATE INDEX IF NOT EXISTS "_customs_v_latest_idx" ON "_customs_v" ("latest");
CREATE INDEX IF NOT EXISTS "_customs_v_rels_order_idx" ON "_customs_v_rels" ("order");
CREATE INDEX IF NOT EXISTS "_customs_v_rels_parent_idx" ON "_customs_v_rels" ("parent_id");
CREATE INDEX IF NOT EXISTS "_customs_v_rels_path_idx" ON "_customs_v_rels" ("path");
CREATE INDEX IF NOT EXISTS "users_created_at_idx" ON "users" ("created_at");
CREATE UNIQUE INDEX IF NOT EXISTS "users_email_idx" ON "users" ("email");
CREATE INDEX IF NOT EXISTS "payload_preferences_key_idx" ON "payload_preferences" ("key");
CREATE INDEX IF NOT EXISTS "payload_preferences_created_at_idx" ON "payload_preferences" ("created_at");
CREATE INDEX IF NOT EXISTS "payload_preferences_rels_order_idx" ON "payload_preferences_rels" ("order");
CREATE INDEX IF NOT EXISTS "payload_preferences_rels_parent_idx" ON "payload_preferences_rels" ("parent_id");
CREATE INDEX IF NOT EXISTS "payload_preferences_rels_path_idx" ON "payload_preferences_rels" ("path");
CREATE INDEX IF NOT EXISTS "payload_migrations_created_at_idx" ON "payload_migrations" ("created_at");
DO $$ BEGIN
ALTER TABLE "relation_a_rels" ADD CONSTRAINT "relation_a_rels_parent_fk" FOREIGN KEY ("parent_id") REFERENCES "relation_a"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "relation_a_rels" ADD CONSTRAINT "relation_a_rels_relation_b_fk" FOREIGN KEY ("relation_b_id") REFERENCES "relation_b"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "relation_b_rels" ADD CONSTRAINT "relation_b_rels_parent_fk" FOREIGN KEY ("parent_id") REFERENCES "relation_b"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "relation_b_rels" ADD CONSTRAINT "relation_b_rels_relation_a_fk" FOREIGN KEY ("relation_a_id") REFERENCES "relation_a"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "customs_customSelect" ADD CONSTRAINT "customs_customSelect_parent_fk" FOREIGN KEY ("parent_id") REFERENCES "customs"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "customArrays" ADD CONSTRAINT "customArrays_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "customs"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "customArrays_locales" ADD CONSTRAINT "customArrays_locales_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "customArrays"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "customBlocks" ADD CONSTRAINT "customBlocks_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "customs"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "customBlocks_locales" ADD CONSTRAINT "customBlocks_locales_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "customBlocks"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "customs_locales" ADD CONSTRAINT "customs_locales_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "customs"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "customs_rels" ADD CONSTRAINT "customs_rels_parent_fk" FOREIGN KEY ("parent_id") REFERENCES "customs"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "customs_rels" ADD CONSTRAINT "customs_rels_relation_a_fk" FOREIGN KEY ("relation_a_id") REFERENCES "relation_a"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "_customs_v_version_customSelect" ADD CONSTRAINT "_customs_v_version_customSelect_parent_fk" FOREIGN KEY ("parent_id") REFERENCES "_customs_v"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "_customArrays_v" ADD CONSTRAINT "_customArrays_v_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "_customs_v"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "_customArrays_v_locales" ADD CONSTRAINT "_customArrays_v_locales_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "_customArrays_v"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "_customBlocks_v" ADD CONSTRAINT "_customBlocks_v_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "_customs_v"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "_customBlocks_v_locales" ADD CONSTRAINT "_customBlocks_v_locales_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "_customBlocks_v"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "_customs_v_locales" ADD CONSTRAINT "_customs_v_locales_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "_customs_v"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "_customs_v_rels" ADD CONSTRAINT "_customs_v_rels_parent_fk" FOREIGN KEY ("parent_id") REFERENCES "_customs_v"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "_customs_v_rels" ADD CONSTRAINT "_customs_v_rels_custom_schema_fk" FOREIGN KEY ("customs_id") REFERENCES "customs"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "_customs_v_rels" ADD CONSTRAINT "_customs_v_rels_relation_a_fk" FOREIGN KEY ("relation_a_id") REFERENCES "relation_a"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "payload_preferences_rels" ADD CONSTRAINT "payload_preferences_rels_parent_fk" FOREIGN KEY ("parent_id") REFERENCES "payload_preferences"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
ALTER TABLE "payload_preferences_rels" ADD CONSTRAINT "payload_preferences_rels_users_fk" FOREIGN KEY ("users_id") REFERENCES "users"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
`)
}
export async function down({ payload }: MigrateDownArgs): Promise<void> {
await payload.db.drizzle.execute(sql`
DROP TABLE "posts";
DROP TABLE "relation_a";
DROP TABLE "relation_a_rels";
DROP TABLE "relation_b";
DROP TABLE "relation_b_rels";
DROP TABLE "customs_customSelect";
DROP TABLE "customArrays";
DROP TABLE "customArrays_locales";
DROP TABLE "customBlocks";
DROP TABLE "customBlocks_locales";
DROP TABLE "customs";
DROP TABLE "customs_locales";
DROP TABLE "customs_rels";
DROP TABLE "_customs_v_version_customSelect";
DROP TABLE "_customArrays_v";
DROP TABLE "_customArrays_v_locales";
DROP TABLE "_customBlocks_v";
DROP TABLE "_customBlocks_v_locales";
DROP TABLE "_customs_v";
DROP TABLE "_customs_v_locales";
DROP TABLE "_customs_v_rels";
DROP TABLE "users";
DROP TABLE "payload_preferences";
DROP TABLE "payload_preferences_rels";
DROP TABLE "payload_migrations";
DROP TABLE "customGlobal";
DROP TABLE "_customGlobal_v";`)
}

View File

@@ -227,6 +227,86 @@ const GroupFields: CollectionConfig = {
},
],
},
{
name: 'localizedGroupArr',
type: 'group',
localized: true,
fields: [
{
name: 'array',
type: 'array',
fields: [
{
type: 'text',
name: 'text',
},
],
},
],
},
{
name: 'localizedGroupSelect',
type: 'group',
localized: true,
fields: [
{
type: 'select',
hasMany: true,
options: ['one', 'two'],
name: 'select',
},
],
},
{
name: 'localizedGroupRel',
type: 'group',
localized: true,
fields: [
{
type: 'relationship',
relationTo: 'text-fields',
name: 'rel',
},
],
},
{
name: 'localizedGroupManyRel',
type: 'group',
localized: true,
fields: [
{
type: 'relationship',
relationTo: 'text-fields',
name: 'email',
hasMany: true,
},
],
},
{
name: 'localizedGroupPolyRel',
type: 'group',
localized: true,
fields: [
{
type: 'relationship',
relationTo: ['text-fields'],
name: 'email',
},
],
},
{
name: 'localizedGroupPolyHasManyRel',
type: 'group',
localized: true,
fields: [
{
type: 'relationship',
relationTo: ['text-fields'],
name: 'email',
hasMany: true,
},
],
},
],
}

View File

@@ -82,6 +82,88 @@ const SelectFields: CollectionConfig = {
},
],
},
{
name: 'array',
type: 'array',
fields: [
{
name: 'selectHasMany',
hasMany: true,
type: 'select',
admin: {
isClearable: true,
isSortable: true,
},
options: [
{
label: 'Value One',
value: 'one',
},
{
label: 'Value Two',
value: 'two',
},
{
label: 'Value Three',
value: 'three',
},
{
label: 'Value Four',
value: 'four',
},
{
label: 'Value Five',
value: 'five',
},
{
label: 'Value Six',
value: 'six',
},
],
},
{
name: 'group',
type: 'group',
fields: [
{
name: 'selectHasMany',
hasMany: true,
type: 'select',
admin: {
isClearable: true,
isSortable: true,
},
options: [
{
label: 'Value One',
value: 'one',
},
{
label: 'Value Two',
value: 'two',
},
{
label: 'Value Three',
value: 'three',
},
{
label: 'Value Four',
value: 'four',
},
{
label: 'Value Five',
value: 'five',
},
{
label: 'Value Six',
value: 'six',
},
],
},
],
},
],
},
{
name: 'selectHasManyLocalized',
type: 'select',

View File

@@ -573,6 +573,54 @@ describe('Fields', () => {
expect(resInSecond.totalDocs).toBe(1)
})
it('should CRUD within array hasMany', async () => {
const doc = await payload.create({
collection: 'select-fields',
data: { array: [{ selectHasMany: ['one', 'two'] }] },
})
expect(doc.array[0].selectHasMany).toStrictEqual(['one', 'two'])
const upd = await payload.update({
collection: 'select-fields',
id: doc.id,
data: {
array: [
{
id: doc.array[0].id,
selectHasMany: ['six'],
},
],
},
})
expect(upd.array[0].selectHasMany).toStrictEqual(['six'])
})
it('should CRUD within array + group hasMany', async () => {
const doc = await payload.create({
collection: 'select-fields',
data: { array: [{ group: { selectHasMany: ['one', 'two'] } }] },
})
expect(doc.array[0].group.selectHasMany).toStrictEqual(['one', 'two'])
const upd = await payload.update({
collection: 'select-fields',
id: doc.id,
data: {
array: [
{
id: doc.array[0].id,
group: { selectHasMany: ['six'] },
},
],
},
})
expect(upd.array[0].group.selectHasMany).toStrictEqual(['six'])
})
})
describe('number', () => {
@@ -842,6 +890,37 @@ describe('Fields', () => {
expect(resInSecond.totalDocs).toBe(1)
})
it('should properly query numbers with exists operator', async () => {
await payload.create({
collection: 'number-fields',
data: {
number: null,
},
})
const numbersExist = await payload.find({
collection: 'number-fields',
where: {
number: {
exists: true,
},
},
})
expect(numbersExist.totalDocs).toBe(4)
const numbersNotExists = await payload.find({
collection: 'number-fields',
where: {
number: {
exists: false,
},
},
})
expect(numbersNotExists.docs).toHaveLength(1)
})
})
if (isMongoose(payload)) {
@@ -1299,6 +1378,324 @@ describe('Fields', () => {
expect(res.camelCaseGroup.array[0].array[0].text).toBe('nested')
expect(res.camelCaseGroup.nesGroup.arr[0].text).toBe('nestedCamel')
})
it('should insert/update/read localized group with array inside', async () => {
const doc = await payload.create({
collection: 'group-fields',
locale: 'en',
data: {
group: { text: 'req' },
localizedGroupArr: {
array: [{ text: 'text-en' }],
},
},
})
expect(doc.localizedGroupArr.array[0].text).toBe('text-en')
const esDoc = await payload.update({
collection: 'group-fields',
locale: 'es',
id: doc.id,
data: {
localizedGroupArr: {
array: [{ text: 'text-es' }],
},
},
})
expect(esDoc.localizedGroupArr.array[0].text).toBe('text-es')
const allDoc = await payload.findByID({
collection: 'group-fields',
id: doc.id,
locale: 'all',
})
expect(allDoc.localizedGroupArr.en.array[0].text).toBe('text-en')
expect(allDoc.localizedGroupArr.es.array[0].text).toBe('text-es')
})
it('should insert/update/read localized group with select hasMany inside', async () => {
const doc = await payload.create({
collection: 'group-fields',
locale: 'en',
data: {
group: { text: 'req' },
localizedGroupSelect: {
select: ['one', 'two'],
},
},
})
expect(doc.localizedGroupSelect.select).toStrictEqual(['one', 'two'])
const esDoc = await payload.update({
collection: 'group-fields',
locale: 'es',
id: doc.id,
data: {
localizedGroupSelect: {
select: ['one'],
},
},
})
expect(esDoc.localizedGroupSelect.select).toStrictEqual(['one'])
const allDoc = await payload.findByID({
collection: 'group-fields',
id: doc.id,
locale: 'all',
})
expect(allDoc.localizedGroupSelect.en.select).toStrictEqual(['one', 'two'])
expect(allDoc.localizedGroupSelect.es.select).toStrictEqual(['one'])
})
it('should insert/update/read localized group with relationship inside', async () => {
const rel_1 = await payload.create({
collection: 'text-fields',
data: { text: 'pro123@gmail.com' },
})
const rel_2 = await payload.create({
collection: 'text-fields',
data: { text: 'frank@gmail.com' },
})
const doc = await payload.create({
collection: 'group-fields',
depth: 0,
data: {
group: { text: 'requireddd' },
localizedGroupRel: {
rel: rel_1.id,
},
},
})
expect(doc.localizedGroupRel.rel).toBe(rel_1.id)
const upd = await payload.update({
collection: 'group-fields',
depth: 0,
id: doc.id,
locale: 'es',
data: {
localizedGroupRel: {
rel: rel_2.id,
},
},
})
expect(upd.localizedGroupRel.rel).toBe(rel_2.id)
const docAll = await payload.findByID({
collection: 'group-fields',
id: doc.id,
locale: 'all',
depth: 0,
})
expect(docAll.localizedGroupRel.en.rel).toBe(rel_1.id)
expect(docAll.localizedGroupRel.es.rel).toBe(rel_2.id)
})
it('should insert/update/read localized group with hasMany relationship inside', async () => {
const rel_1 = await payload.create({
collection: 'text-fields',
data: { text: 'pro123@gmail.com' },
})
const rel_2 = await payload.create({
collection: 'text-fields',
data: { text: 'frank@gmail.com' },
})
const doc = await payload.create({
collection: 'group-fields',
depth: 0,
data: {
group: { text: 'requireddd' },
localizedGroupManyRel: {
email: [rel_1.id],
},
},
})
expect(doc.localizedGroupManyRel.email).toStrictEqual([rel_1.id])
const upd = await payload.update({
collection: 'group-fields',
depth: 0,
id: doc.id,
locale: 'es',
data: {
localizedGroupManyRel: {
email: [rel_2.id],
},
},
})
expect(upd.localizedGroupManyRel.email).toStrictEqual([rel_2.id])
const docAll = await payload.findByID({
collection: 'group-fields',
id: doc.id,
locale: 'all',
depth: 0,
})
expect(docAll.localizedGroupManyRel.en.email).toStrictEqual([rel_1.id])
expect(docAll.localizedGroupManyRel.es.email).toStrictEqual([rel_2.id])
})
it('should insert/update/read localized group with poly relationship inside', async () => {
const rel_1 = await payload.create({
collection: 'text-fields',
data: { text: 'pro123@gmail.com' },
})
const rel_2 = await payload.create({
collection: 'text-fields',
data: { text: 'frank@gmail.com' },
})
const doc = await payload.create({
collection: 'group-fields',
depth: 0,
data: {
group: { text: 'requireddd' },
localizedGroupPolyRel: {
email: {
relationTo: 'text-fields',
value: rel_1.id,
},
},
},
})
expect(doc.localizedGroupPolyRel.email).toStrictEqual({
relationTo: 'text-fields',
value: rel_1.id,
})
const upd = await payload.update({
collection: 'group-fields',
depth: 0,
id: doc.id,
locale: 'es',
data: {
localizedGroupPolyRel: {
email: {
value: rel_2.id,
relationTo: 'text-fields',
},
},
},
})
expect(upd.localizedGroupPolyRel.email).toStrictEqual({
value: rel_2.id,
relationTo: 'text-fields',
})
const docAll = await payload.findByID({
collection: 'group-fields',
id: doc.id,
locale: 'all',
depth: 0,
})
expect(docAll.localizedGroupPolyRel.en.email).toStrictEqual({
value: rel_1.id,
relationTo: 'text-fields',
})
expect(docAll.localizedGroupPolyRel.es.email).toStrictEqual({
value: rel_2.id,
relationTo: 'text-fields',
})
})
it('should insert/update/read localized group with poly hasMany relationship inside', async () => {
const rel_1 = await payload.create({
collection: 'text-fields',
data: { text: 'pro123@gmail.com' },
})
const rel_2 = await payload.create({
collection: 'text-fields',
data: { text: 'frank@gmail.com' },
})
const doc = await payload.create({
collection: 'group-fields',
depth: 0,
data: {
group: { text: 'requireddd' },
localizedGroupPolyHasManyRel: {
email: [
{
relationTo: 'text-fields',
value: rel_1.id,
},
],
},
},
})
expect(doc.localizedGroupPolyHasManyRel.email).toStrictEqual([
{
relationTo: 'text-fields',
value: rel_1.id,
},
])
const upd = await payload.update({
collection: 'group-fields',
depth: 0,
id: doc.id,
locale: 'es',
data: {
localizedGroupPolyHasManyRel: {
email: [
{
value: rel_2.id,
relationTo: 'text-fields',
},
],
},
},
})
expect(upd.localizedGroupPolyHasManyRel.email).toStrictEqual([
{
value: rel_2.id,
relationTo: 'text-fields',
},
])
const docAll = await payload.findByID({
collection: 'group-fields',
id: doc.id,
locale: 'all',
depth: 0,
})
expect(docAll.localizedGroupPolyHasManyRel.en.email).toStrictEqual([
{
value: rel_1.id,
relationTo: 'text-fields',
},
])
expect(docAll.localizedGroupPolyHasManyRel.es.email).toStrictEqual([
{
value: rel_2.id,
relationTo: 'text-fields',
},
])
})
})
describe('tabs', () => {

View File

@@ -271,6 +271,17 @@ export interface ArrayField {
id?: string | null
}[]
| null
nestedArrayLocalized?:
| {
array?:
| {
text?: string | null
id?: string | null
}[]
| null
id?: string | null
}[]
| null
updatedAt: string
createdAt: string
}
@@ -841,6 +852,37 @@ export interface GroupField {
| null
}
}
localizedGroupArr?: {
array?:
| {
text?: string | null
id?: string | null
}[]
| null
}
localizedGroupSelect?: {
select?: ('one' | 'two')[] | null
}
localizedGroupRel?: {
rel?: (string | null) | TextField
}
localizedGroupManyRel?: {
email?: (string | TextField)[] | null
}
localizedGroupPolyRel?: {
email?: {
relationTo: 'text-fields'
value: string | TextField
} | null
}
localizedGroupPolyHasManyRel?: {
email?:
| {
relationTo: 'text-fields'
value: string | TextField
}[]
| null
}
updatedAt: string
createdAt: string
}
@@ -1116,6 +1158,15 @@ export interface SelectField {
select?: ('one' | 'two' | 'three') | null
selectReadOnly?: ('one' | 'two' | 'three') | null
selectHasMany?: ('one' | 'two' | 'three' | 'four' | 'five' | 'six')[] | null
array?:
| {
selectHasMany?: ('one' | 'two' | 'three' | 'four' | 'five' | 'six')[] | null
group?: {
selectHasMany?: ('one' | 'two' | 'three' | 'four' | 'five' | 'six')[] | null
}
id?: string | null
}[]
| null
selectHasManyLocalized?: ('one' | 'two')[] | null
selectI18n?: ('one' | 'two' | 'three') | null
simple?: ('One' | 'Two' | 'Three') | null

View File

@@ -1,9 +1,10 @@
import fs from 'fs'
const removeFiles = (dir, nameFilter?: (name: string) => boolean) => {
if (!fs.existsSync(dir)) return
fs.readdirSync(dir).forEach((f) => {
if (nameFilter && !nameFilter(f)) return
return fs.rmSync(`${dir}/${f}`, { recursive: true })
})
}
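For reference, a minimal usage sketch of the updated helper above (the directory path and placeholder filename are assumptions, not part of the diff): the optional nameFilter returns true for entries that should be deleted, so everything except the placeholder is removed.

// Hypothetical call site for the removeFiles helper shown above:
// delete every entry in an uploads directory except a '.gitkeep' placeholder.
removeFiles('./test/uploads/media', (name) => name !== '.gitkeep')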

View File

@@ -0,0 +1,37 @@
import type { CollectionConfig } from 'payload/types'
export const blocksCollectionSlug = 'blocks-fields'
export const BlocksCollection: CollectionConfig = {
slug: blocksCollectionSlug,
fields: [
{
name: 'content',
label: 'Content',
type: 'blocks',
localized: true,
blocks: [
{
slug: 'blockInsideBlock',
fields: [
{
name: 'content',
type: 'blocks',
blocks: [
{
slug: 'textBlock',
fields: [
{
name: 'text',
type: 'text',
},
],
},
],
},
],
},
],
},
],
}

View File

@@ -0,0 +1,81 @@
import type { CollectionConfig } from 'payload'
export const groupSlug = 'groups'
export const Group: CollectionConfig = {
slug: groupSlug,
fields: [
{
name: 'groupLocalizedRow',
type: 'group',
localized: true,
fields: [
{
type: 'row',
fields: [
{
name: 'text',
type: 'text',
},
],
},
],
},
{
name: 'groupLocalized',
type: 'group',
fields: [
{
name: 'title',
type: 'text',
},
],
localized: true,
},
{
name: 'group',
type: 'group',
fields: [
{
name: 'title',
type: 'text',
localized: true,
},
],
},
{
name: 'deep',
type: 'group',
fields: [
{
name: 'array',
type: 'array',
fields: [
{
name: 'title',
type: 'text',
localized: true,
},
],
},
{
name: 'blocks',
type: 'blocks',
blocks: [
{
slug: 'first',
fields: [
{
name: 'title',
type: 'text',
localized: true,
},
],
},
],
},
],
},
],
}

View File

@@ -0,0 +1,42 @@
import type { CollectionConfig } from 'payload/types'
export const NestedArray: CollectionConfig = {
slug: 'nested-arrays',
fields: [
{
name: 'arrayWithBlocks',
type: 'array',
localized: true,
fields: [
{
name: 'blocksWithinArray',
type: 'blocks',
blocks: [
{
slug: 'someBlock',
fields: [
{
name: 'relationWithinBlock',
type: 'relationship',
relationTo: 'localized-posts',
},
],
},
],
},
],
},
{
name: 'arrayWithLocalizedRelation',
type: 'array',
fields: [
{
name: 'localizedRelation',
type: 'relationship',
localized: true,
relationTo: 'localized-posts',
},
],
},
],
}

View File

@@ -0,0 +1,86 @@
import type { CollectionConfig } from 'payload/types'
export const NestedFields: CollectionConfig = {
slug: 'nested-field-tables',
fields: [
{
name: 'array',
type: 'array',
localized: true,
fields: [
{
name: 'relation',
type: 'relationship',
relationTo: ['localized-posts'],
},
{
name: 'hasManyRelation',
type: 'relationship',
hasMany: true,
relationTo: 'localized-posts',
},
{
name: 'hasManyPolyRelation',
type: 'relationship',
hasMany: true,
relationTo: ['localized-posts'],
},
{
name: 'select',
type: 'select',
hasMany: true,
options: ['one', 'two', 'three'],
},
{
name: 'number',
type: 'number',
hasMany: true,
},
{
name: 'text',
type: 'text',
hasMany: true,
},
],
},
{
name: 'blocks',
type: 'blocks',
localized: true,
blocks: [
{
slug: 'block',
fields: [
{
name: 'nestedBlocks',
type: 'blocks',
blocks: [
{
slug: 'content',
fields: [
{
name: 'relation',
type: 'relationship',
relationTo: ['localized-posts'],
},
],
},
],
},
{
name: 'array',
type: 'array',
fields: [
{
name: 'relation',
type: 'relationship',
relationTo: ['localized-posts'],
},
],
},
],
},
],
},
],
}

View File

@@ -0,0 +1,66 @@
import type { CollectionConfig } from 'payload/types'
export const tabSlug = 'tabs'
export const Tab: CollectionConfig = {
slug: tabSlug,
fields: [
{
type: 'tabs',
tabs: [
{
name: 'tabLocalized',
localized: true,
fields: [
{
name: 'title',
type: 'text',
},
],
},
{
name: 'tab',
fields: [
{
localized: true,
name: 'title',
type: 'text',
},
],
},
{
name: 'deep',
fields: [
{
name: 'array',
type: 'array',
fields: [
{
localized: true,
type: 'text',
name: 'title',
},
],
},
{
name: 'blocks',
type: 'blocks',
blocks: [
{
slug: 'first',
fields: [
{
localized: true,
type: 'text',
name: 'title',
},
],
},
],
},
],
},
],
},
],
}

View File

@@ -4,8 +4,13 @@ import { buildConfigWithDefaults } from '../buildConfigWithDefaults'
import { devUser } from '../credentials'
import { englishLocale } from '../globals/config'
import { ArrayCollection } from './collections/Array'
import { BlocksCollection } from './collections/Blocks'
import { Group } from './collections/Group'
import { NestedArray } from './collections/NestedArray'
import { NestedFields } from './collections/NestedFields'
import { NestedToArrayAndBlock } from './collections/NestedToArrayAndBlock'
import { RestrictedByLocaleCollection } from './collections/RestrictedByLocale'
import { Tab } from './collections/Tab'
import {
blocksWithLocalizedSameName,
defaultLocale,
@@ -41,6 +46,9 @@ const openAccess = {
export default buildConfigWithDefaults({
collections: [
BlocksCollection,
NestedArray,
NestedFields,
{
auth: true,
fields: [
@@ -127,6 +135,16 @@ export default buildConfigWithDefaults({
name: 'text',
type: 'text',
},
{
name: 'nestedArray',
type: 'array',
fields: [
{
name: 'text',
type: 'text',
},
],
},
],
slug: 'text',
},
@@ -144,6 +162,41 @@ export default buildConfigWithDefaults({
required: true,
type: 'blocks',
},
{
type: 'tabs',
tabs: [
{
name: 'myTab',
fields: [
{
name: 'text',
type: 'text',
},
{
name: 'group',
type: 'group',
localized: true,
fields: [
{
name: 'nestedArray2',
type: 'array',
fields: [
{
name: 'nestedText',
type: 'text',
},
],
},
{
name: 'nestedText',
type: 'text',
},
],
},
],
},
],
},
],
slug: withRequiredLocalizedFields,
},
@@ -235,6 +288,8 @@ export default buildConfigWithDefaults({
slug: 'dummy',
},
NestedToArrayAndBlock,
Group,
Tab,
{
slug: localizedSortSlug,
access: openAccess,

View File

@@ -11,7 +11,9 @@ import { initPayloadTest } from '../helpers/configHelpers'
import { idToString } from '../helpers/idToString'
import { RESTClient } from '../helpers/rest'
import { arrayCollectionSlug } from './collections/Array'
import { groupSlug } from './collections/Group'
import { nestedToArrayAndBlockCollectionSlug } from './collections/NestedToArrayAndBlock'
import { tabSlug } from './collections/Tab'
import configPromise from './config'
import { defaultLocale, hungarianLocale, localizedSortSlug } from './shared'
import {
@@ -52,7 +54,6 @@ describe('Localization', () => {
config = await configPromise
// @ts-expect-error Force typing
post1 = await payload.create({
collection,
data: {
@@ -60,7 +61,6 @@ describe('Localization', () => {
},
})
// @ts-expect-error Force typing
postWithLocalizedData = await payload.create({
collection,
data: {
@@ -1060,6 +1060,747 @@ describe('Localization', () => {
expect(rowSpanish.textNotLocalized).toEqual('test')
})
})
describe('Localized group and tabs', () => {
it('should properly create/update/read localized group field', async () => {
const result = await payload.create({
collection: groupSlug,
data: {
groupLocalized: {
title: 'hello en',
},
},
locale: englishLocale,
})
expect(result.groupLocalized?.title).toBe('hello en')
await payload.update({
collection: groupSlug,
locale: spanishLocale,
id: result.id,
data: {
groupLocalized: {
title: 'hello es',
},
},
})
const docEn = await payload.findByID({
collection: groupSlug,
locale: englishLocale,
id: result.id,
})
const docEs = await payload.findByID({
collection: groupSlug,
locale: spanishLocale,
id: result.id,
})
expect(docEn.groupLocalized.title).toBe('hello en')
expect(docEs.groupLocalized.title).toBe('hello es')
})
it('should properly create/update/read localized field inside of group', async () => {
const result = await payload.create({
collection: groupSlug,
locale: englishLocale,
data: {
group: {
title: 'hello en',
},
},
})
expect(result.group.title).toBe('hello en')
await payload.update({
collection: groupSlug,
locale: spanishLocale,
id: result.id,
data: {
group: {
title: 'hello es',
},
},
})
const docEn = await payload.findByID({
collection: groupSlug,
locale: englishLocale,
id: result.id,
})
const docEs = await payload.findByID({
collection: groupSlug,
locale: spanishLocale,
id: result.id,
})
expect(docEn.group.title).toBe('hello en')
expect(docEs.group.title).toBe('hello es')
})
it('should properly create/update/read deep localized field inside of group', async () => {
const result = await payload.create({
collection: groupSlug,
locale: englishLocale,
data: {
deep: {
blocks: [
{
blockType: 'first',
title: 'hello en',
},
],
array: [{ title: 'hello en' }],
},
},
})
expect(result.deep.array[0].title).toBe('hello en')
await payload.update({
collection: groupSlug,
locale: spanishLocale,
id: result.id,
data: {
deep: {
blocks: [
{
blockType: 'first',
title: 'hello es',
id: result.deep.blocks[0].id,
},
],
array: [
{
id: result.deep.array[0].id,
title: 'hello es',
},
],
},
},
})
const docEn = await payload.findByID({
collection: groupSlug,
locale: englishLocale,
id: result.id,
})
const docEs = await payload.findByID({
collection: groupSlug,
locale: spanishLocale,
id: result.id,
})
expect(docEn.deep.array[0].title).toBe('hello en')
expect(docEn.deep.blocks[0].title).toBe('hello en')
expect(docEs.deep.array[0].title).toBe('hello es')
expect(docEs.deep.blocks[0].title).toBe('hello es')
})
it('should create/update/read localized group with row field', async () => {
const doc = await payload.create({
collection: 'groups',
data: {
groupLocalizedRow: {
text: 'hello world',
},
},
locale: 'en',
})
expect(doc.groupLocalizedRow.text).toBe('hello world')
const docES = await payload.update({
collection: 'groups',
data: {
groupLocalizedRow: {
text: 'hola world or something',
},
},
locale: 'es',
id: doc.id,
})
expect(docES.groupLocalizedRow.text).toBe('hola world or something')
// check that the ES update didn't break EN
const docEN = await payload.findByID({ collection: 'groups', id: doc.id, locale: 'en' })
expect(docEN.groupLocalizedRow.text).toBe('hello world')
const all = await payload.findByID({ collection: 'groups', id: doc.id, locale: 'all' })
expect(all.groupLocalizedRow.en.text).toBe('hello world')
expect(all.groupLocalizedRow.es.text).toBe('hola world or something')
})
it('should properly create/update/read localized tab field', async () => {
const result = await payload.create({
collection: tabSlug,
locale: englishLocale,
data: {
tabLocalized: {
title: 'hello en',
},
},
})
expect(result.tabLocalized?.title).toBe('hello en')
await payload.update({
collection: tabSlug,
locale: spanishLocale,
id: result.id,
data: {
tabLocalized: {
title: 'hello es',
},
},
})
const docEn = await payload.findByID({
collection: tabSlug,
locale: englishLocale,
id: result.id,
})
const docEs = await payload.findByID({
collection: tabSlug,
locale: spanishLocale,
id: result.id,
})
expect(docEn.tabLocalized.title).toBe('hello en')
expect(docEs.tabLocalized.title).toBe('hello es')
})
it('should properly create/update/read localized field inside of tab', async () => {
const result = await payload.create({
collection: tabSlug,
locale: englishLocale,
data: {
tab: {
title: 'hello en',
},
},
})
expect(result.tab.title).toBe('hello en')
await payload.update({
collection: tabSlug,
locale: spanishLocale,
id: result.id,
data: {
tab: {
title: 'hello es',
},
},
})
const docEn = await payload.findByID({
collection: tabSlug,
locale: englishLocale,
id: result.id,
})
const docEs = await payload.findByID({
collection: tabSlug,
locale: spanishLocale,
id: result.id,
})
expect(docEn.tab.title).toBe('hello en')
expect(docEs.tab.title).toBe('hello es')
})
it('should properly create/update/read deep localized field inside of tab', async () => {
const result = await payload.create({
collection: tabSlug,
locale: englishLocale,
data: {
deep: {
blocks: [
{
blockType: 'first',
title: 'hello en',
},
],
array: [{ title: 'hello en' }],
},
},
})
expect(result.deep.array[0].title).toBe('hello en')
await payload.update({
collection: tabSlug,
locale: spanishLocale,
id: result.id,
data: {
deep: {
blocks: [
{
blockType: 'first',
title: 'hello es',
id: result.deep.blocks[0].id,
},
],
array: [
{
id: result.deep.array[0].id,
title: 'hello es',
},
],
},
},
})
const docEn = await payload.findByID({
collection: tabSlug,
locale: englishLocale,
id: result.id,
})
const docEs = await payload.findByID({
collection: tabSlug,
locale: spanishLocale,
id: result.id,
})
expect(docEn.deep.array[0].title).toBe('hello en')
expect(docEn.deep.blocks[0].title).toBe('hello en')
expect(docEs.deep.array[0].title).toBe('hello es')
expect(docEs.deep.blocks[0].title).toBe('hello es')
})
})
describe('nested blocks', () => {
let id
it('should allow creating nested blocks per locale', async () => {
const doc = await payload.create({
collection: 'blocks-fields',
data: {
content: [
{
blockType: 'blockInsideBlock',
content: [
{
blockType: 'textBlock',
text: 'hello',
},
],
},
],
},
})
id = doc.id
await payload.update({
collection: 'blocks-fields',
id,
locale: 'es',
data: {
content: [
{
blockType: 'blockInsideBlock',
content: [
{
blockType: 'textBlock',
text: 'hola',
},
],
},
],
},
})
const retrieved = await payload.findByID({
collection: 'blocks-fields',
id,
locale: 'all',
})
expect(retrieved.content.en[0].content).toHaveLength(1)
expect(retrieved.content.es[0].content).toHaveLength(1)
})
})
describe('nested arrays', () => {
it('should not duplicate block rows for blocks within localized array fields', async () => {
const randomDoc = (
await payload.find({
collection: 'localized-posts',
depth: 0,
})
).docs[0]
const randomDoc2 = (
await payload.find({
collection: 'localized-posts',
depth: 0,
})
).docs[1]
const blocksWithinArrayEN = [
{
blockName: '1',
blockType: 'someBlock',
relationWithinBlock: randomDoc.id,
},
{
blockName: '2',
blockType: 'someBlock',
relationWithinBlock: randomDoc.id,
},
{
blockName: '3',
blockType: 'someBlock',
relationWithinBlock: randomDoc.id,
},
]
const blocksWithinArrayES = [
{
blockName: '1',
blockType: 'someBlock',
relationWithinBlock: randomDoc2.id,
},
{
blockName: '2',
blockType: 'someBlock',
relationWithinBlock: randomDoc2.id,
},
{
blockName: '3',
blockType: 'someBlock',
relationWithinBlock: randomDoc2.id,
},
]
const createdEnDoc = await payload.create({
collection: 'nested-arrays',
locale: 'en',
depth: 0,
data: {
arrayWithBlocks: [
{
blocksWithinArray: blocksWithinArrayEN as any,
},
],
},
})
const updatedEsDoc = await payload.update({
collection: 'nested-arrays',
id: createdEnDoc.id,
depth: 0,
locale: 'es',
data: {
arrayWithBlocks: [
{
blocksWithinArray: blocksWithinArrayES as any,
},
],
},
})
const esArrayBlocks = updatedEsDoc.arrayWithBlocks[0].blocksWithinArray
// recursively strip generated id fields before comparing
const removeId = (obj) => {
if (obj instanceof Object) {
delete obj.id
Object.values(obj).forEach(removeId)
}
}
removeId(esArrayBlocks)
removeId(createdEnDoc.arrayWithBlocks[0].blocksWithinArray)
expect(esArrayBlocks).toEqual(blocksWithinArrayES)
expect(createdEnDoc.arrayWithBlocks[0].blocksWithinArray).toEqual(blocksWithinArrayEN)
// pull enDoc again and make sure the update of esDoc did not mess with the data of enDoc
const enDoc2 = await payload.findByID({
id: createdEnDoc.id,
collection: 'nested-arrays',
locale: 'en',
depth: 0,
})
removeId(enDoc2.arrayWithBlocks[0].blocksWithinArray)
expect(enDoc2.arrayWithBlocks[0].blocksWithinArray).toEqual(blocksWithinArrayEN)
})
it('should update localized relation within unLocalized array', async () => {
const randomTextDoc = (
await payload.find({
collection: 'localized-posts',
depth: 0,
})
).docs[0]
const randomTextDoc2 = (
await payload.find({
collection: 'localized-posts',
depth: 0,
})
).docs[1]
const createdEnDoc = await payload.create({
collection: 'nested-arrays',
locale: 'en',
depth: 0,
data: {
arrayWithLocalizedRelation: [
{
localizedRelation: randomTextDoc.id,
},
],
},
})
const updatedEsDoc = await payload.update({
collection: 'nested-arrays',
id: createdEnDoc.id,
depth: 0,
locale: 'es',
data: {
arrayWithLocalizedRelation: [
{
id: createdEnDoc.arrayWithLocalizedRelation[0].id,
localizedRelation: randomTextDoc2.id,
},
],
},
})
expect(updatedEsDoc.arrayWithLocalizedRelation).toHaveLength(1)
expect(updatedEsDoc.arrayWithLocalizedRelation[0].localizedRelation).toBe(randomTextDoc2.id)
expect(createdEnDoc.arrayWithLocalizedRelation).toHaveLength(1)
expect(createdEnDoc.arrayWithLocalizedRelation[0].localizedRelation).toBe(randomTextDoc.id)
// pull enDoc again and make sure the update of esDoc did not mess with the data of enDoc
const enDoc2 = await payload.findByID({
id: createdEnDoc.id,
collection: 'nested-arrays',
locale: 'en',
depth: 0,
})
expect(enDoc2.arrayWithLocalizedRelation).toHaveLength(1)
expect(enDoc2.arrayWithLocalizedRelation[0].localizedRelation).toBe(randomTextDoc.id)
})
})
describe('nested fields', () => {
it('should allow for fields which could contain new tables within localized arrays to be stored', async () => {
const randomDoc = (
await payload.find({
collection: 'localized-posts',
depth: 0,
})
).docs[0]
const randomDoc2 = (
await payload.find({
collection: 'localized-posts',
depth: 0,
})
).docs[1]
const newDoc = await payload.create({
collection: 'nested-field-tables',
data: {
array: [
{
relation: {
value: randomDoc.id,
relationTo: 'localized-posts',
},
hasManyRelation: [randomDoc.id, randomDoc2.id],
hasManyPolyRelation: [
{
relationTo: 'localized-posts',
value: randomDoc.id,
},
{
relationTo: 'localized-posts',
value: randomDoc2.id,
},
],
number: [1, 2],
text: ['hello', 'goodbye'],
select: ['one'],
},
],
},
})
await payload.update({
collection: 'nested-field-tables',
id: newDoc.id,
locale: 'es',
data: {
array: [
{
relation: {
value: randomDoc2.id,
relationTo: 'localized-posts',
},
hasManyRelation: [randomDoc2.id, randomDoc.id],
hasManyPolyRelation: [
{
relationTo: 'localized-posts',
value: randomDoc2.id,
},
{
relationTo: 'localized-posts',
value: randomDoc.id,
},
],
select: ['two', 'three'],
text: ['hola', 'adios'],
number: [3, 4],
},
],
},
})
const retrieved = await payload.findByID({
collection: 'nested-field-tables',
id: newDoc.id,
depth: 0,
locale: 'all',
})
expect(retrieved.array.en[0].relation.value).toStrictEqual(randomDoc.id)
expect(retrieved.array.es[0].relation.value).toStrictEqual(randomDoc2.id)
expect(retrieved.array.en[0].hasManyRelation).toEqual([randomDoc.id, randomDoc2.id])
expect(retrieved.array.es[0].hasManyRelation).toEqual([randomDoc2.id, randomDoc.id])
expect(retrieved.array.en[0].hasManyPolyRelation).toEqual([
{ value: randomDoc.id, relationTo: 'localized-posts' },
{ value: randomDoc2.id, relationTo: 'localized-posts' },
])
expect(retrieved.array.es[0].hasManyPolyRelation).toEqual([
{ value: randomDoc2.id, relationTo: 'localized-posts' },
{ value: randomDoc.id, relationTo: 'localized-posts' },
])
expect(retrieved.array.en[0].number).toEqual([1, 2])
expect(retrieved.array.es[0].number).toEqual([3, 4])
expect(retrieved.array.en[0].select).toEqual(['one'])
expect(retrieved.array.es[0].select).toEqual(['two', 'three'])
expect(retrieved.array.en[0].text).toEqual(['hello', 'goodbye'])
expect(retrieved.array.es[0].text).toEqual(['hola', 'adios'])
})
it('should duplicate with localized blocks', async () => {
// This test covers a few things:
// - make sure localized arrays / blocks work inside of localized groups / tabs
// - this is covered with myTab.group.nestedArray2
const englishText = 'english'
const spanishText = 'spanish'
const doc = await payload.create({
collection: withRequiredLocalizedFields,
data: {
layout: [
{
blockType: 'text',
text: englishText,
nestedArray: [
{
text: 'hello',
},
{
text: 'goodbye',
},
],
},
],
myTab: {
text: 'hello',
group: {
nestedText: 'hello',
nestedArray2: [
{
nestedText: 'hello',
},
{
nestedText: 'goodbye',
},
],
},
},
title: 'hello',
},
locale: defaultLocale,
})
await payload.update({
id: doc.id,
collection: withRequiredLocalizedFields,
data: {
layout: [
{
blockType: 'text',
text: spanishText,
nestedArray: [
{
text: 'hola',
},
{
text: 'adios',
},
],
},
],
title: 'hello',
myTab: {
text: 'hola',
group: {
nestedText: 'hola',
nestedArray2: [
{
nestedText: 'hola',
},
{
nestedText: 'adios',
},
],
},
},
},
locale: spanishLocale,
})
const result = await payload.findByID({
id: doc.id,
collection: withRequiredLocalizedFields,
locale: defaultLocale,
})
const allLocales = await payload.findByID({
id: result.id,
collection: withRequiredLocalizedFields,
locale: 'all',
})
// check fields
expect(result.layout[0].text).toStrictEqual(englishText)
expect(allLocales.layout.en[0].text).toStrictEqual(englishText)
expect(allLocales.layout.es[0].text).toStrictEqual(spanishText)
expect(allLocales.myTab.group.en.nestedText).toStrictEqual('hello')
expect(allLocales.myTab.group.en.nestedArray2[0].nestedText).toStrictEqual('hello')
expect(allLocales.myTab.group.en.nestedArray2[1].nestedText).toStrictEqual('goodbye')
expect(allLocales.myTab.group.es.nestedText).toStrictEqual('hola')
expect(allLocales.myTab.group.es.nestedArray2[0].nestedText).toStrictEqual('hola')
expect(allLocales.myTab.group.es.nestedArray2[1].nestedText).toStrictEqual('adios')
})
})
})
async function createLocalizedPost(data: {

View File

@@ -8,6 +8,9 @@
export interface Config {
collections: {
'blocks-fields': BlocksField
'nested-arrays': NestedArray
'nested-field-tables': NestedFieldTable
users: User
'localized-posts': LocalizedPost
'array-fields': ArrayField
@@ -15,6 +18,12 @@ export interface Config {
'with-localized-relationship': WithLocalizedRelationship
'relationship-localized': RelationshipLocalized
dummy: Dummy
'nested-to-array-and-block': NestedToArrayAndBlock
groups: Group
tabs: Tab
'localized-sort': LocalizedSort
'blocks-same-name': BlocksSameName
'restricted-by-locale': RestrictedByLocale
'payload-preferences': PayloadPreference
'payload-migrations': PayloadMigration
}
@@ -22,9 +31,139 @@ export interface Config {
'global-array': GlobalArray
}
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "blocks-fields".
*/
export interface BlocksField {
id: string
content?:
| {
content?:
| {
text?: string | null
id?: string | null
blockName?: string | null
blockType: 'textBlock'
}[]
| null
id?: string | null
blockName?: string | null
blockType: 'blockInsideBlock'
}[]
| null
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "nested-arrays".
*/
export interface NestedArray {
id: string
arrayWithBlocks?:
| {
blocksWithinArray?:
| {
relationWithinBlock?: (string | null) | LocalizedPost
id?: string | null
blockName?: string | null
blockType: 'someBlock'
}[]
| null
id?: string | null
}[]
| null
arrayWithLocalizedRelation?:
| {
localizedRelation?: (string | null) | LocalizedPost
id?: string | null
}[]
| null
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "localized-posts".
*/
export interface LocalizedPost {
id: string
title?: string | null
description?: string | null
localizedDescription?: string | null
localizedCheckbox?: boolean | null
children?: (string | LocalizedPost)[] | null
group?: {
children?: string | null
}
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "nested-field-tables".
*/
export interface NestedFieldTable {
id: string
array?:
| {
relation?: {
relationTo: 'localized-posts'
value: string | LocalizedPost
} | null
hasManyRelation?: (string | LocalizedPost)[] | null
hasManyPolyRelation?:
| {
relationTo: 'localized-posts'
value: string | LocalizedPost
}[]
| null
select?: ('one' | 'two' | 'three')[] | null
number?: number[] | null
text?: string[] | null
id?: string | null
}[]
| null
blocks?:
| {
nestedBlocks?:
| {
relation?: {
relationTo: 'localized-posts'
value: string | LocalizedPost
} | null
id?: string | null
blockName?: string | null
blockType: 'content'
}[]
| null
array?:
| {
relation?: {
relationTo: 'localized-posts'
value: string | LocalizedPost
} | null
id?: string | null
}[]
| null
id?: string | null
blockName?: string | null
blockType: 'block'
}[]
| null
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "users".
*/
export interface User {
id: string
relation?: (string | null) | LocalizedPost
assignedLocales?: ('en' | 'es' | 'pt' | 'ar')[] | null
roles?: ('admin' | 'editor') | null
updatedAt: string
createdAt: string
email: string
@@ -36,14 +175,10 @@ export interface User {
lockUntil?: string | null
password: string | null
}
export interface LocalizedPost {
id: string
title?: string | null
description?: string | null
localizedCheckbox?: boolean | null
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "array-fields".
*/
export interface ArrayField {
id: string
items?:
@@ -55,12 +190,22 @@ export interface ArrayField {
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "localized-required".
*/
export interface LocalizedRequired {
id: string
title: string
layout: (
| {
text?: string | null
nestedArray?:
| {
text?: string | null
id?: string | null
}[]
| null
id?: string | null
blockName?: string | null
blockType: 'text'
@@ -72,9 +217,25 @@ export interface LocalizedRequired {
blockType: 'number'
}
)[]
myTab: {
text?: string | null
group?: {
nestedArray2?:
| {
nestedText?: string | null
id?: string | null
}[]
| null
nestedText?: string | null
}
}
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "with-localized-relationship".
*/
export interface WithLocalizedRelationship {
id: string
localizedRelationship?: (string | null) | LocalizedPost
@@ -103,12 +264,20 @@ export interface WithLocalizedRelationship {
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "dummy".
*/
export interface Dummy {
id: string
name?: string | null
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "relationship-localized".
*/
export interface RelationshipLocalized {
id: string
relationship?: (string | null) | LocalizedPost
@@ -143,6 +312,144 @@ export interface RelationshipLocalized {
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "nested-to-array-and-block".
*/
export interface NestedToArrayAndBlock {
id: string
blocks?:
| {
array?:
| {
text?: string | null
textNotLocalized?: string | null
id?: string | null
}[]
| null
id?: string | null
blockName?: string | null
blockType: 'block'
}[]
| null
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "groups".
*/
export interface Group {
id: string
groupLocalizedRow?: {
text?: string | null
}
groupLocalized?: {
title?: string | null
}
group?: {
title?: string | null
}
deep?: {
array?:
| {
title?: string | null
id?: string | null
}[]
| null
blocks?:
| {
title?: string | null
id?: string | null
blockName?: string | null
blockType: 'first'
}[]
| null
}
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "tabs".
*/
export interface Tab {
id: string
tabLocalized: {
title?: string | null
}
tab: {
title?: string | null
}
deep: {
array?:
| {
title?: string | null
id?: string | null
}[]
| null
blocks?:
| {
title?: string | null
id?: string | null
blockName?: string | null
blockType: 'first'
}[]
| null
}
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "localized-sort".
*/
export interface LocalizedSort {
id: string
title?: string | null
date?: string | null
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "blocks-same-name".
*/
export interface BlocksSameName {
id: string
blocks?:
| (
| {
title?: string | null
id?: string | null
blockName?: string | null
blockType: 'block_first'
}
| {
title?: string | null
id?: string | null
blockName?: string | null
blockType: 'block_second'
}
)[]
| null
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "restricted-by-locale".
*/
export interface RestrictedByLocale {
id: string
title?: string | null
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "payload-preferences".
*/
export interface PayloadPreference {
id: string
user: {
@@ -162,6 +469,10 @@ export interface PayloadPreference {
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "payload-migrations".
*/
export interface PayloadMigration {
id: string
name?: string | null
@@ -169,6 +480,10 @@ export interface PayloadMigration {
updatedAt: string
createdAt: string
}
/**
* This interface was referenced by `Config`'s JSON-Schema
* via the `definition` "global-array".
*/
export interface GlobalArray {
id: string
array?:

View File

@@ -123,6 +123,18 @@ describe('uploads', () => {
await saveDocAndAssert(page)
})
test('should properly create iOS file upload', async () => {
await page.goto(mediaURL.create)
await page.setInputFiles('input[type="file"]', path.resolve(__dirname, './ios-image.jpeg'))
const filename = page.locator('.file-field__filename')
await expect(filename).toHaveValue('ios-image.jpeg')
await saveDocAndAssert(page)
})
test('should create animated file upload', async () => {
await page.goto(animatedTypeMediaURL.create)
@@ -559,6 +571,34 @@ describe('uploads', () => {
// without focal point update this generated size was equal to 1736
expect(redDoc.sizes.focalTest.filesize).toEqual(1598)
})
test('should resize image after crop if resizeOptions defined', async () => {
await page.goto(animatedTypeMediaURL.create)
await page.waitForURL(animatedTypeMediaURL.create)
const fileChooserPromise = page.waitForEvent('filechooser')
await page.getByText('Select a file').click()
const fileChooser = await fileChooserPromise
await wait(1000)
await fileChooser.setFiles(path.join(__dirname, 'horizontal-squares.jpg'))
await page.locator('.file-field__edit').click()
// set crop
await page.locator('.edit-upload__input input[name="Width (px)"]').fill('400')
await page.locator('.edit-upload__input input[name="Height (px)"]').fill('800')
// set focal point
await page.locator('.edit-upload__input input[name="X %"]').fill('75') // init left focal point
await page.locator('.edit-upload__input input[name="Y %"]').fill('50') // init top focal point
await page.locator('button:has-text("Apply Changes")').click()
await page.waitForSelector('button#action-save')
await page.locator('button#action-save').click()
await wait(1000) // Wait for the save
const resizeOptionMedia = page.locator('.file-meta .file-meta__size-type')
await expect(resizeOptionMedia).toContainText('200x200')
})
})

test('should see upload previews in relation list if allowed in config', async () => {
BIN
test/uploads/ios-image.jpeg Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 1.1 MiB
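A closing note on the resize-after-crop e2e test above: the expected 200x200 output follows from the media collection having resizeOptions defined, so the cropped 400x800 file is resized again on save. A hypothetical config of that shape (the slug and empty field list are assumptions, not taken from this diff) might look like:

import type { CollectionConfig } from 'payload/types'

// Hypothetical upload-enabled collection (assumed slug) where resizeOptions
// forces a final resize after any crop applied in the admin UI.
export const AnimatedTypeMedia: CollectionConfig = {
  slug: 'animated-type-media',
  fields: [],
  upload: {
    resizeOptions: {
      width: 200,
      height: 200,
    },
  },
}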