Compare commits

51 Commits (revert-121...fix/form-i)

| SHA1 |
| --- |
| 6ef6f2e55c |
| 18693775e4 |
| b3cac753d6 |
| 05ae957cd5 |
| 800c424777 |
| 9a6bb44e50 |
| 38186346f7 |
| a6d76d6058 |
| 0d10f436cc |
| dcd4e37ccc |
| 446938b9cb |
| 292b462f34 |
| 2628b43639 |
| 3fb81ef43b |
| 3c9ee5d3b4 |
| 11018ebfe0 |
| b480f81387 |
| d7d37447aa |
| ddf40d59ac |
| 1ef1c5564d |
| 055a263af3 |
| a62cdc89d8 |
| b6b02ac97c |
| 5365d4f1c2 |
| e5683913b4 |
| 78d3af7dc9 |
| c08c7071ee |
| b9868c4a3b |
| e5b28c98dc |
| 35c0404817 |
| cfe8c97ab7 |
| 6133a1d183 |
| 710fe0949b |
| 4a56597b92 |
| 27d644f2f9 |
| 564fdb0e17 |
| 47a1eee765 |
| 8fee0163b5 |
| 1b17df9e0b |
| 3df1329e19 |
| 5492542c1a |
| 9948040ad2 |
| b7ae4ee60a |
| 34ead72c85 |
| caae5986f5 |
| 2f21d46de6 |
| 6b83086c6c |
| 5bd852c9b5 |
| c85fb808b9 |
| ab03f4f305 |
| 2157450805 |
@@ -298,3 +298,15 @@ Passing your migrations as shown above will tell Payload, in production only, to
may slow down serverless cold starts on platforms such as Vercel. Generally,
this option should only be used for long-running servers / containers.
</Banner>
+
+## Environment-Specific Configurations and Migrations
+
+Your configuration may include environment-specific settings (e.g., enabling a plugin only in production). If you generate migrations without considering the environment, it can lead to discrepancies and issues. When running migrations locally, Payload uses the development environment, which might miss production-specific configurations. Similarly, running migrations in production could miss development-specific entities.
+
+This is an easy oversight, so be mindful of any environment-specific logic in your config when handling migrations.
+
+**Ways to address this:**
+
+- Manually update your migration file after it is generated to include any environment-specific configurations.
+- Temporarily enable any required production environment variables in your local setup when generating the migration to capture the necessary updates.
+- Use separate migration files for each environment to ensure the correct migration is executed in the corresponding environment.
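To make the second bullet above concrete, here is a minimal sketch of an environment-gated config of the kind this section warns about. The plugin, database adapter, and environment variable names are illustrative assumptions rather than anything taken from this diff; generating the migration with the production variables set is what captures the plugin's schema changes.

```ts
// payload.config.ts - hypothetical example of environment-specific config.
import { mongooseAdapter } from '@payloadcms/db-mongodb'
// `seoPlugin` stands in for any plugin you only enable in production.
import { seoPlugin } from '@payloadcms/plugin-seo'
import { lexicalEditor } from '@payloadcms/richtext-lexical'
import { buildConfig } from 'payload'

export default buildConfig({
  secret: process.env.PAYLOAD_SECRET || '',
  db: mongooseAdapter({ url: process.env.DATABASE_URI || '' }),
  editor: lexicalEditor(),
  collections: [],
  plugins: [
    // Only active in production. A migration generated locally without
    // NODE_ENV=production would miss any collections or fields this plugin adds,
    // which is exactly the discrepancy described above.
    ...(process.env.NODE_ENV === 'production' ? [seoPlugin({})] : []),
  ],
})
```
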
@@ -94,6 +94,7 @@ The Relationship Field inherits all of the default options from the base [Field
| **`allowCreate`** | Set to `false` if you'd like to disable the ability to create new documents from within the relationship field. |
| **`allowEdit`** | Set to `false` if you'd like to disable the ability to edit documents from within the relationship field. |
| **`sortOptions`** | Define a default sorting order for the options within a Relationship field's dropdown. [More](#sort-options) |
+| **`placeholder`** | Define a custom text or function to replace the generic default placeholder |
| **`appearance`** | Set to `drawer` or `select` to change the behavior of the field. Defaults to `select`. |

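As a quick illustration of the admin options above, including the newly documented `placeholder`, here is a hedged sketch of a relationship field config. The collection and field names are invented for the example.

```ts
import type { CollectionConfig } from 'payload'

export const Posts: CollectionConfig = {
  slug: 'posts',
  fields: [
    {
      name: 'author',
      type: 'relationship',
      relationTo: 'users',
      admin: {
        allowCreate: false, // hide the "create new" option inside the field
        appearance: 'drawer', // pick documents from a drawer instead of a select menu
        placeholder: 'Select an author', // custom placeholder text
        sortOptions: 'email', // default sort order for the dropdown options
      },
    },
  ],
}
```
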
### Sort Options
@@ -149,7 +150,7 @@ The `filterOptions` property can either be a `Where` query, or a function return
| `id` | The `id` of the current document being edited. Will be `undefined` during the `create` operation or when called on a `Filter` component within the list view. |
| `relationTo` | The collection `slug` to filter against, limited to this field's `relationTo` property. |
| `req` | The Payload Request, which contains references to `payload`, `user`, `locale`, and more. |
-| `siblingData` | An object containing document data that is scoped to only fields within the same parent of this field. Will be an emprt object when called on a `Filter` component within the list view. |
+| `siblingData` | An object containing document data that is scoped to only fields within the same parent of this field. Will be an empty object when called on a `Filter` component within the list view. |
| `user` | An object containing the currently authenticated user. |

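For reference, a minimal sketch of a `filterOptions` function that uses a few of the arguments listed above. The `team` field on both collections is an assumption made for illustration only.

```ts
import type { Field } from 'payload'

export const assignedEditor: Field = {
  name: 'assignedEditor',
  type: 'relationship',
  relationTo: 'users',
  filterOptions: ({ relationTo, siblingData }) => {
    // Only offer users on the same hypothetical `team` as the document being edited.
    // `siblingData` is an empty object in the list view's Filter component,
    // so this falls back to an unfiltered list there.
    const team = (siblingData as { team?: string })?.team
    if (relationTo === 'users' && team) {
      return { team: { equals: team } }
    }
    return true // `true` leaves the options unfiltered
  },
}
```
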
## Example

@@ -89,6 +89,7 @@ The Select Field inherits all of the default options from the base [Field Admin
| ----------------- | ------------------------------------------------------------------------------------------------------------------------------------------- |
| **`isClearable`** | Set to `true` if you'd like this field to be clearable within the Admin UI. |
| **`isSortable`** | Set to `true` if you'd like this field to be sortable within the Admin UI using drag and drop. (Only works when `hasMany` is set to `true`) |
+| **`placeholder`** | Define a custom text or function to replace the generic default placeholder |

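A brief sketch of a Select Field that uses the admin options above, including the newly documented `placeholder`. The field name and option values are invented for the example.

```ts
import type { Field } from 'payload'

export const status: Field = {
  name: 'status',
  type: 'select',
  hasMany: true, // required for isSortable to have any effect
  options: [
    { label: 'Draft', value: 'draft' },
    { label: 'Published', value: 'published' },
    { label: 'Archived', value: 'archived' },
  ],
  admin: {
    isClearable: true, // allow clearing the selection in the Admin UI
    isSortable: true, // drag-and-drop ordering of selected values
    placeholder: 'Choose one or more statuses',
  },
}
```
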
## Example

@@ -81,7 +81,7 @@ To install a Database Adapter, you can run **one** of the following commands:

#### 2. Copy Payload files into your Next.js app folder

-Payload installs directly in your Next.js `/app` folder, and you'll need to place some files into that folder for Payload to run. You can copy these files from the [Blank Template](<https://github.com/payloadcms/payload/tree/main/templates/blank/src/app/(payload)>) on GitHub. Once you have the required Payload files in place in your `/app` folder, you should have something like this:
+Payload installs directly in your Next.js `/app` folder, and you'll need to place some files into that folder for Payload to run. You can copy these files from the [Blank Template](https://github.com/payloadcms/payload/tree/main/templates/blank/src/app/(payload)) on GitHub. Once you have the required Payload files in place in your `/app` folder, you should have something like this:

```plaintext
app/
@@ -55,10 +55,11 @@ All collection `find` queries are paginated automatically. Responses are returne

All Payload APIs support the pagination controls below. With them, you can create paginated lists of documents within your application:

-| Control | Description |
-| ------- | --------------------------------------- |
-| `limit` | Limits the number of documents returned |
-| `page`  | Get a specific page number              |
+| Control      | Default | Description |
+| ------------ | ------- | ----------- |
+| `limit`      | `10`    | Limits the number of documents returned per page. Set to `0` to show all documents; pagination is automatically disabled when `limit` is `0` as an optimization. |
+| `pagination` | `true`  | Set to `false` to disable pagination and return all documents |
+| `page`       | `1`     | Get a specific page number |

### Disabling pagination within Local API
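To ground these controls, here is a minimal Local API sketch. The `posts` collection and the `@payload-config` import alias are assumptions based on a typical Payload + Next.js setup, not something defined in this diff.

```ts
import config from '@payload-config' // path alias from a typical Payload + Next.js setup
import { getPayload } from 'payload'

const payload = await getPayload({ config })

// Paginated query: 10 docs per page (the default), fetching page 2.
const paged = await payload.find({
  collection: 'posts',
  limit: 10,
  page: 2,
})

// Return every matching document; setting `limit: 0` behaves the same way,
// since pagination is disabled automatically when the limit is 0.
const all = await payload.find({
  collection: 'posts',
  pagination: false,
})
```
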
@@ -84,6 +84,7 @@ pnpm add @payloadcms/storage-s3
- The `config` object can be any [`S3ClientConfig`](https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/client/s3) object (from [`@aws-sdk/client-s3`](https://github.com/aws/aws-sdk-js-v3)). _This is highly dependent on your AWS setup_. Check the AWS documentation for more information.
- When enabled, this package will automatically set `disableLocalStorage` to `true` for each collection.
- When deploying to Vercel, server uploads are limited to 4.5MB. Set `clientUploads` to `true` to do uploads directly on the client. You must allow the CORS PUT method on the bucket for your website.
- Configure `signedDownloads` (either globally or per-collection in `collections`) to use [presigned URLs](https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-presigned-url.html) for file downloads. This can improve performance for large files (like videos) while still respecting your access control.

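Ahead of the package's own example below, a hedged sketch showing the `clientUploads` and per-collection `signedDownloads` options called out above. The bucket, collection slug, and environment variable names are assumptions for illustration.

```ts
import { s3Storage } from '@payloadcms/storage-s3'

// Add this to the `plugins` array of your Payload config.
export const s3Plugin = s3Storage({
  bucket: process.env.S3_BUCKET ?? '',
  clientUploads: true, // upload from the browser to stay under Vercel's 4.5MB server limit
  collections: {
    media: {
      signedDownloads: true, // serve downloads via presigned URLs for this collection
    },
  },
  config: {
    credentials: {
      accessKeyId: process.env.S3_ACCESS_KEY_ID ?? '',
      secretAccessKey: process.env.S3_SECRET_ACCESS_KEY ?? '',
    },
    region: process.env.S3_REGION,
  },
})
```
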
```ts
import { s3Storage } from '@payloadcms/storage-s3'

@@ -58,7 +58,7 @@ See the [Collections](https://payloadcms.com/docs/configuration/collections) doc
}
```

-For more details on how to extend this functionality, see the [Live Preview](https://payloadcms.com/docs/live-preview) docs.
+For more details on how to extend this functionality, see the [Live Preview](https://payloadcms.com/docs/live-preview/overview) docs.

## Front-end

@@ -36,7 +36,7 @@ export const home: Partial<Page> = {
type: 'link',
children: [{ text: 'Live Preview' }],
newTab: true,
-url: 'https://payloadcms.com/docs/live-preview',
+url: 'https://payloadcms.com/docs/live-preview/overview',
},
{
text: ' you can edit this page in the admin panel and see the changes reflected here in real time.',

@@ -1,6 +1,6 @@
{
  "name": "payload-monorepo",
- "version": "3.35.1",
+ "version": "3.37.0",
  "private": true,
  "type": "module",
  "scripts": {

@@ -1,6 +1,6 @@
{
  "name": "@payloadcms/admin-bar",
- "version": "3.35.1",
+ "version": "3.37.0",
  "description": "An admin bar for React apps using Payload",
  "homepage": "https://payloadcms.com",
  "repository": {

@@ -1,6 +1,6 @@
{
  "name": "create-payload-app",
- "version": "3.35.1",
+ "version": "3.37.0",
  "homepage": "https://payloadcms.com",
  "repository": {
    "type": "git",

@@ -10,6 +10,7 @@ import type { CliArgs, DbType, ProjectExample, ProjectTemplate } from '../types.
|
||||
import { createProject } from './create-project.js'
|
||||
import { dbReplacements } from './replacements.js'
|
||||
import { getValidTemplates } from './templates.js'
|
||||
import { manageEnvFiles } from './manage-env-files.js'
|
||||
|
||||
describe('createProject', () => {
|
||||
let projectDir: string
|
||||
@@ -154,5 +155,75 @@ describe('createProject', () => {
|
||||
expect(content).toContain(dbReplacement.configReplacement().join('\n'))
|
||||
})
|
||||
})
|
||||
describe('managing env files', () => {
|
||||
it('updates .env files without overwriting existing data', async () => {
|
||||
const envFilePath = path.join(projectDir, '.env')
|
||||
const envExampleFilePath = path.join(projectDir, '.env.example')
|
||||
|
||||
fse.ensureDirSync(projectDir)
|
||||
fse.ensureFileSync(envFilePath)
|
||||
fse.ensureFileSync(envExampleFilePath)
|
||||
|
||||
const initialEnvContent = `CUSTOM_VAR=custom-value\nDATABASE_URI=old-connection\n`
|
||||
const initialEnvExampleContent = `CUSTOM_VAR=custom-value\nDATABASE_URI=old-connection\nPAYLOAD_SECRET=YOUR_SECRET_HERE\n`
|
||||
|
||||
fse.writeFileSync(envFilePath, initialEnvContent)
|
||||
fse.writeFileSync(envExampleFilePath, initialEnvExampleContent)
|
||||
|
||||
await manageEnvFiles({
|
||||
cliArgs: {
|
||||
'--debug': true,
|
||||
} as CliArgs,
|
||||
databaseType: 'mongodb',
|
||||
databaseUri: 'mongodb://localhost:27017/test',
|
||||
payloadSecret: 'test-secret',
|
||||
projectDir,
|
||||
template: undefined,
|
||||
})
|
||||
|
||||
const updatedEnvContent = fse.readFileSync(envFilePath, 'utf-8')
|
||||
|
||||
expect(updatedEnvContent).toContain('CUSTOM_VAR=custom-value')
|
||||
expect(updatedEnvContent).toContain('DATABASE_URI=mongodb://localhost:27017/test')
|
||||
expect(updatedEnvContent).toContain('PAYLOAD_SECRET=test-secret')
|
||||
|
||||
const updatedEnvExampleContent = fse.readFileSync(envExampleFilePath, 'utf-8')
|
||||
|
||||
expect(updatedEnvExampleContent).toContain('CUSTOM_VAR=custom-value')
|
||||
expect(updatedEnvExampleContent).toContain('DATABASE_URI=mongodb://localhost:27017/test')
|
||||
expect(updatedEnvExampleContent).toContain('PAYLOAD_SECRET=test-secret')
|
||||
})
|
||||
|
||||
it('creates .env and .env.example if they do not exist', async () => {
|
||||
const envFilePath = path.join(projectDir, '.env')
|
||||
const envExampleFilePath = path.join(projectDir, '.env.example')
|
||||
|
||||
fse.ensureDirSync(projectDir)
|
||||
|
||||
if (fse.existsSync(envFilePath)) fse.removeSync(envFilePath)
|
||||
if (fse.existsSync(envExampleFilePath)) fse.removeSync(envExampleFilePath)
|
||||
|
||||
await manageEnvFiles({
|
||||
cliArgs: {
|
||||
'--debug': true,
|
||||
} as CliArgs,
|
||||
databaseUri: '',
|
||||
payloadSecret: '',
|
||||
projectDir,
|
||||
template: undefined,
|
||||
})
|
||||
|
||||
expect(fse.existsSync(envFilePath)).toBe(true)
|
||||
expect(fse.existsSync(envExampleFilePath)).toBe(true)
|
||||
|
||||
const updatedEnvContent = fse.readFileSync(envFilePath, 'utf-8')
|
||||
expect(updatedEnvContent).toContain('DATABASE_URI=your-connection-string-here')
|
||||
expect(updatedEnvContent).toContain('PAYLOAD_SECRET=YOUR_SECRET_HERE')
|
||||
|
||||
const updatedEnvExampleContent = fse.readFileSync(envExampleFilePath, 'utf-8')
|
||||
expect(updatedEnvExampleContent).toContain('DATABASE_URI=your-connection-string-here')
|
||||
expect(updatedEnvExampleContent).toContain('PAYLOAD_SECRET=YOUR_SECRET_HERE')
|
||||
})
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
@@ -6,66 +6,55 @@ import type { CliArgs, DbType, ProjectTemplate } from '../types.js'
|
||||
import { debug, error } from '../utils/log.js'
|
||||
import { dbChoiceRecord } from './select-db.js'
|
||||
|
||||
const updateEnvExampleVariables = (contents: string, databaseType: DbType | undefined): string => {
|
||||
return contents
|
||||
const updateEnvExampleVariables = (
|
||||
contents: string,
|
||||
databaseType: DbType | undefined,
|
||||
payloadSecret?: string,
|
||||
databaseUri?: string,
|
||||
): string => {
|
||||
const seenKeys = new Set<string>()
|
||||
const updatedEnv = contents
|
||||
.split('\n')
|
||||
.map((line) => {
|
||||
if (line.startsWith('#') || !line.includes('=')) {
|
||||
return line // Preserve comments and unrelated lines
|
||||
return line
|
||||
}
|
||||
|
||||
const [key] = line.split('=')
|
||||
|
||||
if (!key) {return}
|
||||
|
||||
if (key === 'DATABASE_URI' || key === 'POSTGRES_URL' || key === 'MONGODB_URI') {
|
||||
const dbChoice = databaseType ? dbChoiceRecord[databaseType] : null
|
||||
|
||||
if (dbChoice) {
|
||||
const placeholderUri = `${dbChoice.dbConnectionPrefix}your-database-name${
|
||||
dbChoice.dbConnectionSuffix || ''
|
||||
}`
|
||||
return databaseType === 'vercel-postgres'
|
||||
? `POSTGRES_URL=${placeholderUri}`
|
||||
: `DATABASE_URI=${placeholderUri}`
|
||||
const placeholderUri = databaseUri
|
||||
? databaseUri
|
||||
: `${dbChoice.dbConnectionPrefix}your-database-name${dbChoice.dbConnectionSuffix || ''}`
|
||||
line =
|
||||
databaseType === 'vercel-postgres'
|
||||
? `POSTGRES_URL=${placeholderUri}`
|
||||
: `DATABASE_URI=${placeholderUri}`
|
||||
}
|
||||
|
||||
return `DATABASE_URI=your-database-connection-here` // Fallback
|
||||
}
|
||||
|
||||
if (key === 'PAYLOAD_SECRET' || key === 'PAYLOAD_SECRET_KEY') {
|
||||
return `PAYLOAD_SECRET=YOUR_SECRET_HERE`
|
||||
line = `PAYLOAD_SECRET=${payloadSecret || 'YOUR_SECRET_HERE'}`
|
||||
}
|
||||
|
||||
// handles dupes
|
||||
if (seenKeys.has(key)) {
|
||||
return null
|
||||
}
|
||||
|
||||
seenKeys.add(key)
|
||||
|
||||
return line
|
||||
})
|
||||
.filter(Boolean)
|
||||
.reverse()
|
||||
.join('\n')
|
||||
}
|
||||
|
||||
const generateEnvContent = (
|
||||
existingEnv: string,
|
||||
databaseType: DbType | undefined,
|
||||
databaseUri: string,
|
||||
payloadSecret: string,
|
||||
): string => {
|
||||
const dbKey = databaseType === 'vercel-postgres' ? 'POSTGRES_URL' : 'DATABASE_URI'
|
||||
|
||||
const envVars: Record<string, string> = {}
|
||||
existingEnv
|
||||
.split('\n')
|
||||
.filter((line) => line.includes('=') && !line.startsWith('#'))
|
||||
.forEach((line) => {
|
||||
const [key, value] = line.split('=')
|
||||
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
|
||||
envVars[key] = value
|
||||
})
|
||||
|
||||
// Override specific keys
|
||||
envVars[dbKey] = databaseUri
|
||||
envVars['PAYLOAD_SECRET'] = payloadSecret
|
||||
|
||||
// Rebuild content
|
||||
return Object.entries(envVars)
|
||||
.map(([key, value]) => `${key}=${value}`)
|
||||
.join('\n')
|
||||
return updatedEnv
|
||||
}
|
||||
|
||||
/** Parse and swap .env.example values and write .env */
|
||||
@@ -88,42 +77,71 @@ export async function manageEnvFiles(args: {
|
||||
|
||||
const envExamplePath = path.join(projectDir, '.env.example')
|
||||
const envPath = path.join(projectDir, '.env')
|
||||
|
||||
const emptyEnvContent = `# Added by Payload\nDATABASE_URI=your-connection-string-here\nPAYLOAD_SECRET=YOUR_SECRET_HERE\n`
|
||||
try {
|
||||
let updatedExampleContents: string
|
||||
|
||||
// Update .env.example
|
||||
if (template?.type === 'starter') {
|
||||
if (!fs.existsSync(envExamplePath)) {
|
||||
error(`.env.example file not found at ${envExamplePath}`)
|
||||
process.exit(1)
|
||||
if (template?.type === 'plugin') {
|
||||
if (debugFlag) {
|
||||
debug(`plugin template detected - no .env added .env.example added`)
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
if (!fs.existsSync(envExamplePath)) {
|
||||
updatedExampleContents = updateEnvExampleVariables(
|
||||
emptyEnvContent,
|
||||
databaseType,
|
||||
payloadSecret,
|
||||
databaseUri,
|
||||
)
|
||||
|
||||
await fs.writeFile(envExamplePath, updatedExampleContents)
|
||||
if (debugFlag) {
|
||||
debug(`.env.example file successfully created`)
|
||||
}
|
||||
} else {
|
||||
const envExampleContents = await fs.readFile(envExamplePath, 'utf8')
|
||||
updatedExampleContents = updateEnvExampleVariables(envExampleContents, databaseType)
|
||||
|
||||
await fs.writeFile(envExamplePath, updatedExampleContents.trimEnd() + '\n')
|
||||
const mergedEnvs = envExampleContents + '\n' + emptyEnvContent
|
||||
updatedExampleContents = updateEnvExampleVariables(
|
||||
mergedEnvs,
|
||||
databaseType,
|
||||
payloadSecret,
|
||||
databaseUri,
|
||||
)
|
||||
|
||||
await fs.writeFile(envExamplePath, updatedExampleContents)
|
||||
if (debugFlag) {
|
||||
debug(`.env.example file successfully updated`)
|
||||
}
|
||||
} else {
|
||||
updatedExampleContents = `# Added by Payload\nDATABASE_URI=your-connection-string-here\nPAYLOAD_SECRET=YOUR_SECRET_HERE\n`
|
||||
await fs.writeFile(envExamplePath, updatedExampleContents.trimEnd() + '\n')
|
||||
}
|
||||
|
||||
// Merge existing variables and create or update .env
|
||||
const envExampleContents = await fs.readFile(envExamplePath, 'utf8')
|
||||
const envContent = generateEnvContent(
|
||||
envExampleContents,
|
||||
databaseType,
|
||||
databaseUri,
|
||||
payloadSecret,
|
||||
)
|
||||
await fs.writeFile(envPath, `# Added by Payload\n${envContent.trimEnd()}\n`)
|
||||
if (!fs.existsSync(envPath)) {
|
||||
const envContent = updateEnvExampleVariables(
|
||||
emptyEnvContent,
|
||||
databaseType,
|
||||
payloadSecret,
|
||||
databaseUri,
|
||||
)
|
||||
await fs.writeFile(envPath, envContent)
|
||||
|
||||
if (debugFlag) {
|
||||
debug(`.env file successfully created or updated`)
|
||||
if (debugFlag) {
|
||||
debug(`.env file successfully created`)
|
||||
}
|
||||
} else {
|
||||
const envContents = await fs.readFile(envPath, 'utf8')
|
||||
const mergedEnvs = envContents + '\n' + emptyEnvContent
|
||||
const updatedEnvContents = updateEnvExampleVariables(
|
||||
mergedEnvs,
|
||||
databaseType,
|
||||
payloadSecret,
|
||||
databaseUri,
|
||||
)
|
||||
|
||||
await fs.writeFile(envPath, updatedEnvContents)
|
||||
if (debugFlag) {
|
||||
debug(`.env file successfully updated`)
|
||||
}
|
||||
}
|
||||
} catch (err: unknown) {
|
||||
error('Unable to manage environment files')
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/db-mongodb",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"description": "The officially supported MongoDB database adapter for Payload",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/db-postgres",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"description": "The officially supported Postgres database adapter for Payload",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/db-sqlite",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"description": "The officially supported SQLite database adapter for Payload",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
|
||||
@@ -16,7 +16,7 @@ export const countDistinct: CountDistinct = async function countDistinct(
|
||||
})
|
||||
.from(this.tables[tableName])
|
||||
.where(where)
|
||||
return Number(countResult[0]?.count)
|
||||
return Number(countResult?.[0]?.count ?? 0)
|
||||
}
|
||||
|
||||
let query: SQLiteSelect = db
|
||||
@@ -39,5 +39,5 @@ export const countDistinct: CountDistinct = async function countDistinct(
|
||||
// Instead, COUNT (GROUP BY id) can be used which is still slower than COUNT(*) but acceptable.
|
||||
const countResult = await query
|
||||
|
||||
return Number(countResult[0]?.count)
|
||||
return Number(countResult?.[0]?.count ?? 0)
|
||||
}
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/db-vercel-postgres",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"description": "Vercel Postgres adapter for Payload",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/drizzle",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"description": "A library of shared functions used by different payload database adapters",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
|
||||
@@ -46,6 +46,7 @@ export const findMany = async function find({
|
||||
const offset = skip || (page - 1) * limit
|
||||
|
||||
if (limit === 0) {
|
||||
pagination = false
|
||||
limit = undefined
|
||||
}
|
||||
|
||||
|
||||
@@ -42,33 +42,36 @@ export const migrate: DrizzleAdapter['migrate'] = async function migrate(
|
||||
limit: 0,
|
||||
sort: '-name',
|
||||
}))
|
||||
|
||||
if (migrationsInDB.find((m) => m.batch === -1)) {
|
||||
const { confirm: runMigrations } = await prompts(
|
||||
{
|
||||
name: 'confirm',
|
||||
type: 'confirm',
|
||||
initial: false,
|
||||
message:
|
||||
"It looks like you've run Payload in dev mode, meaning you've dynamically pushed changes to your database.\n\n" +
|
||||
"If you'd like to run migrations, data loss will occur. Would you like to proceed?",
|
||||
},
|
||||
{
|
||||
onCancel: () => {
|
||||
process.exit(0)
|
||||
},
|
||||
},
|
||||
)
|
||||
|
||||
if (!runMigrations) {
|
||||
process.exit(0)
|
||||
}
|
||||
// ignore the dev migration so that the latest batch number increments correctly
|
||||
migrationsInDB = migrationsInDB.filter((m) => m.batch !== -1)
|
||||
}
|
||||
|
||||
if (Number(migrationsInDB?.[0]?.batch) > 0) {
|
||||
latestBatch = Number(migrationsInDB[0]?.batch)
|
||||
}
|
||||
}
|
||||
|
||||
if (migrationsInDB.find((m) => m.batch === -1)) {
|
||||
const { confirm: runMigrations } = await prompts(
|
||||
{
|
||||
name: 'confirm',
|
||||
type: 'confirm',
|
||||
initial: false,
|
||||
message:
|
||||
"It looks like you've run Payload in dev mode, meaning you've dynamically pushed changes to your database.\n\n" +
|
||||
"If you'd like to run migrations, data loss will occur. Would you like to proceed?",
|
||||
},
|
||||
{
|
||||
onCancel: () => {
|
||||
process.exit(0)
|
||||
},
|
||||
},
|
||||
)
|
||||
|
||||
if (!runMigrations) {
|
||||
process.exit(0)
|
||||
}
|
||||
}
|
||||
|
||||
const newBatch = latestBatch + 1
|
||||
|
||||
// Execute 'up' function for each migration sequentially
|
||||
|
||||
@@ -16,7 +16,8 @@ export const countDistinct: CountDistinct = async function countDistinct(
|
||||
})
|
||||
.from(this.tables[tableName])
|
||||
.where(where)
|
||||
return Number(countResult[0].count)
|
||||
|
||||
return Number(countResult?.[0]?.count ?? 0)
|
||||
}
|
||||
|
||||
let query = db
|
||||
@@ -39,5 +40,5 @@ export const countDistinct: CountDistinct = async function countDistinct(
|
||||
// Instead, COUNT (GROUP BY id) can be used which is still slower than COUNT(*) but acceptable.
|
||||
const countResult = await query
|
||||
|
||||
return Number(countResult[0].count)
|
||||
return Number(countResult?.[0]?.count ?? 0)
|
||||
}
|
||||
|
||||
@@ -36,7 +36,6 @@ type Args = {
|
||||
*/
|
||||
export const migratePostgresV2toV3 = async ({ debug, payload, req }: Args) => {
|
||||
const adapter = payload.db as unknown as BasePostgresAdapter
|
||||
const db = await getTransaction(adapter, req)
|
||||
const dir = payload.db.migrationDir
|
||||
|
||||
// get the drizzle migrateUpSQL from drizzle using the last schema
|
||||
@@ -89,6 +88,8 @@ export const migratePostgresV2toV3 = async ({ debug, payload, req }: Args) => {
|
||||
payload.logger.info(addColumnsStatement)
|
||||
}
|
||||
|
||||
const db = await getTransaction(adapter, req)
|
||||
|
||||
await db.execute(sql.raw(addColumnsStatement))
|
||||
|
||||
for (const collection of payload.config.collections) {
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/email-nodemailer",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"description": "Payload Nodemailer Email Adapter",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/email-resend",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"description": "Payload Resend Email Adapter",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/graphql",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
|
||||
@@ -11,6 +11,7 @@ export type ObjectTypeConfig = {
|
||||
|
||||
type Args = {
|
||||
baseFields?: ObjectTypeConfig
|
||||
collectionSlug?: string
|
||||
config: SanitizedConfig
|
||||
fields: Field[]
|
||||
forceNullable?: boolean
|
||||
@@ -23,6 +24,7 @@ type Args = {
|
||||
export function buildObjectType({
|
||||
name,
|
||||
baseFields = {},
|
||||
collectionSlug,
|
||||
config,
|
||||
fields,
|
||||
forceNullable,
|
||||
@@ -43,6 +45,7 @@ export function buildObjectType({
|
||||
return {
|
||||
...objectTypeConfig,
|
||||
...fieldSchema({
|
||||
collectionSlug,
|
||||
config,
|
||||
field,
|
||||
forceNullable,
|
||||
|
||||
@@ -10,11 +10,11 @@ export const buildPaginatedListType = (name, docType) =>
|
||||
hasNextPage: { type: new GraphQLNonNull(GraphQLBoolean) },
|
||||
hasPrevPage: { type: new GraphQLNonNull(GraphQLBoolean) },
|
||||
limit: { type: new GraphQLNonNull(GraphQLInt) },
|
||||
nextPage: { type: new GraphQLNonNull(GraphQLInt) },
|
||||
nextPage: { type: GraphQLInt },
|
||||
offset: { type: GraphQLInt },
|
||||
page: { type: new GraphQLNonNull(GraphQLInt) },
|
||||
pagingCounter: { type: new GraphQLNonNull(GraphQLInt) },
|
||||
prevPage: { type: new GraphQLNonNull(GraphQLInt) },
|
||||
prevPage: { type: GraphQLInt },
|
||||
totalDocs: { type: new GraphQLNonNull(GraphQLInt) },
|
||||
totalPages: { type: new GraphQLNonNull(GraphQLInt) },
|
||||
},
|
||||
|
||||
@@ -8,6 +8,7 @@ import type {
|
||||
DateField,
|
||||
EmailField,
|
||||
Field,
|
||||
FlattenedJoinField,
|
||||
GraphQLInfo,
|
||||
GroupField,
|
||||
JoinField,
|
||||
@@ -68,6 +69,7 @@ function formattedNameResolver({
|
||||
}
|
||||
|
||||
type SharedArgs = {
|
||||
collectionSlug?: string
|
||||
config: SanitizedConfig
|
||||
forceNullable?: boolean
|
||||
graphqlResult: GraphQLInfo
|
||||
@@ -340,7 +342,7 @@ export const fieldToSchemaMap: FieldToSchemaMap = {
|
||||
},
|
||||
}
|
||||
},
|
||||
join: ({ field, graphqlResult, objectTypeConfig, parentName }) => {
|
||||
join: ({ collectionSlug, field, graphqlResult, objectTypeConfig, parentName }) => {
|
||||
const joinName = combineParentName(parentName, toWords(field.name, true))
|
||||
|
||||
const joinType = {
|
||||
@@ -385,27 +387,54 @@ export const fieldToSchemaMap: FieldToSchemaMap = {
|
||||
|
||||
const draft = Boolean(args.draft ?? context.req.query?.draft)
|
||||
|
||||
const fullWhere = combineQueries(where, {
|
||||
[field.on]: { equals: parent._id ?? parent.id },
|
||||
})
|
||||
const targetField = (field as FlattenedJoinField).targetField
|
||||
|
||||
const fullWhere = combineQueries(
|
||||
where,
|
||||
Array.isArray(targetField.relationTo)
|
||||
? {
|
||||
[field.on]: {
|
||||
equals: {
|
||||
relationTo: collectionSlug,
|
||||
value: parent._id ?? parent.id,
|
||||
},
|
||||
},
|
||||
}
|
||||
: {
|
||||
[field.on]: { equals: parent._id ?? parent.id },
|
||||
},
|
||||
)
|
||||
|
||||
if (Array.isArray(collection)) {
|
||||
throw new Error('GraphQL with array of join.field.collection is not implemented')
|
||||
}
|
||||
|
||||
return await req.payload.find({
|
||||
const { docs } = await req.payload.find({
|
||||
collection,
|
||||
depth: 0,
|
||||
draft,
|
||||
fallbackLocale: req.fallbackLocale,
|
||||
limit,
|
||||
// Fetch one extra document to determine if there are more documents beyond the requested limit (used for hasNextPage calculation).
|
||||
limit: typeof limit === 'number' && limit > 0 ? limit + 1 : 0,
|
||||
locale: req.locale,
|
||||
overrideAccess: false,
|
||||
page,
|
||||
pagination: false,
|
||||
req,
|
||||
sort,
|
||||
where: fullWhere,
|
||||
})
|
||||
|
||||
let shouldSlice = false
|
||||
|
||||
if (typeof limit === 'number' && limit !== 0 && limit < docs.length) {
|
||||
shouldSlice = true
|
||||
}
|
||||
|
||||
return {
|
||||
docs: shouldSlice ? docs.slice(0, -1) : docs,
|
||||
hasNextPage: limit === 0 ? false : limit < docs.length,
|
||||
}
|
||||
},
|
||||
}
|
||||
|
||||
|
||||
@@ -29,6 +29,7 @@ import { recursivelyBuildNestedPaths } from './recursivelyBuildNestedPaths.js'
|
||||
import { withOperators } from './withOperators.js'
|
||||
|
||||
type Args = {
|
||||
collectionSlug?: string
|
||||
nestedFieldName?: string
|
||||
parentName: string
|
||||
}
|
||||
|
||||
@@ -111,6 +111,7 @@ export function initCollections({ config, graphqlResult }: InitCollectionsGraphQ
|
||||
collection.graphQL.type = buildObjectType({
|
||||
name: singularName,
|
||||
baseFields,
|
||||
collectionSlug: collectionConfig.slug,
|
||||
config,
|
||||
fields,
|
||||
forceNullable: forceNullableObjectType,
|
||||
@@ -339,6 +340,7 @@ export function initCollections({ config, graphqlResult }: InitCollectionsGraphQ
|
||||
|
||||
collection.graphQL.versionType = buildObjectType({
|
||||
name: `${singularName}Version`,
|
||||
collectionSlug: collectionConfig.slug,
|
||||
config,
|
||||
fields: versionCollectionFields,
|
||||
forceNullable: forceNullableObjectType,
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/live-preview-react",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"description": "The official React SDK for Payload Live Preview",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
|
||||
@@ -7,7 +7,7 @@ import { useCallback, useEffect, useRef, useState } from 'react'
|
||||
// To prevent the flicker of stale data while the post message is being sent,
|
||||
// you can conditionally render loading UI based on the `isLoading` state
|
||||
|
||||
export const useLivePreview = <T extends any>(props: {
|
||||
export const useLivePreview = <T extends Record<string, unknown>>(props: {
|
||||
apiRoute?: string
|
||||
depth?: number
|
||||
initialData: T
|
||||
@@ -21,7 +21,7 @@ export const useLivePreview = <T extends any>(props: {
|
||||
const [isLoading, setIsLoading] = useState<boolean>(true)
|
||||
const hasSentReadyMessage = useRef<boolean>(false)
|
||||
|
||||
const onChange = useCallback((mergedData) => {
|
||||
const onChange = useCallback((mergedData: T) => {
|
||||
setData(mergedData)
|
||||
setIsLoading(false)
|
||||
}, [])
|
||||
|
||||
@@ -1,9 +1,4 @@
|
||||
{
|
||||
"extends": "../../tsconfig.base.json",
|
||||
"compilerOptions": {
|
||||
/* TODO: remove the following lines */
|
||||
"strict": false,
|
||||
"noUncheckedIndexedAccess": false,
|
||||
},
|
||||
"references": [{ "path": "../payload" }]
|
||||
}
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/live-preview-vue",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"description": "The official Vue SDK for Payload Live Preview",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
|
||||
@@ -8,7 +8,7 @@ import { onMounted, onUnmounted, ref } from 'vue'
|
||||
*
|
||||
* {@link https://payloadcms.com/docs/live-preview/frontend View the documentation}
|
||||
*/
|
||||
export const useLivePreview = <T>(props: {
|
||||
export const useLivePreview = <T extends Record<string, unknown>>(props: {
|
||||
apiRoute?: string
|
||||
depth?: number
|
||||
initialData: T
|
||||
@@ -27,7 +27,7 @@ export const useLivePreview = <T>(props: {
|
||||
isLoading.value = false
|
||||
}
|
||||
|
||||
let subscription: (event: MessageEvent) => void
|
||||
let subscription: (event: MessageEvent) => Promise<void> | void
|
||||
|
||||
onMounted(() => {
|
||||
subscription = subscribe({
|
||||
|
||||
@@ -1,9 +1,4 @@
|
||||
{
|
||||
"extends": "../../tsconfig.base.json",
|
||||
"compilerOptions": {
|
||||
/* TODO: remove the following lines */
|
||||
"strict": false,
|
||||
"noUncheckedIndexedAccess": false,
|
||||
},
|
||||
"references": [{ "path": "../payload" }] // db-mongodb depends on payload
|
||||
}
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/live-preview",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"description": "The official live preview JavaScript SDK for Payload",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
import type { FieldSchemaJSON } from 'payload'
|
||||
|
||||
import type { LivePreviewMessageEvent } from './types.js'
|
||||
import type { CollectionPopulationRequestHandler, LivePreviewMessageEvent } from './types.js'
|
||||
|
||||
import { isLivePreviewEvent } from './isLivePreviewEvent.js'
|
||||
import { mergeData } from './mergeData.js'
|
||||
@@ -29,9 +29,10 @@ export const handleMessage = async <T extends Record<string, any>>(args: {
|
||||
depth?: number
|
||||
event: LivePreviewMessageEvent<T>
|
||||
initialData: T
|
||||
requestHandler?: CollectionPopulationRequestHandler
|
||||
serverURL: string
|
||||
}): Promise<T> => {
|
||||
const { apiRoute, depth, event, initialData, serverURL } = args
|
||||
const { apiRoute, depth, event, initialData, requestHandler, serverURL } = args
|
||||
|
||||
if (isLivePreviewEvent(event, serverURL)) {
|
||||
const { data, externallyUpdatedRelationship, fieldSchemaJSON, locale } = event.data
|
||||
@@ -57,6 +58,7 @@ export const handleMessage = async <T extends Record<string, any>>(args: {
|
||||
incomingData: data,
|
||||
initialData: _payloadLivePreview?.previousData || initialData,
|
||||
locale,
|
||||
requestHandler,
|
||||
serverURL,
|
||||
})
|
||||
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
import type { DocumentEvent, FieldSchemaJSON, PaginatedDocs } from 'payload'
|
||||
|
||||
import type { PopulationsByCollection } from './types.js'
|
||||
import type { CollectionPopulationRequestHandler, PopulationsByCollection } from './types.js'
|
||||
|
||||
import { traverseFields } from './traverseFields.js'
|
||||
|
||||
@@ -29,21 +29,17 @@ let prevLocale: string | undefined
|
||||
|
||||
export const mergeData = async <T extends Record<string, any>>(args: {
|
||||
apiRoute?: string
|
||||
collectionPopulationRequestHandler?: ({
|
||||
apiPath,
|
||||
endpoint,
|
||||
serverURL,
|
||||
}: {
|
||||
apiPath: string
|
||||
endpoint: string
|
||||
serverURL: string
|
||||
}) => Promise<Response>
|
||||
/**
|
||||
* @deprecated Use `requestHandler` instead
|
||||
*/
|
||||
collectionPopulationRequestHandler?: CollectionPopulationRequestHandler
|
||||
depth?: number
|
||||
externallyUpdatedRelationship?: DocumentEvent
|
||||
fieldSchema: FieldSchemaJSON
|
||||
incomingData: Partial<T>
|
||||
initialData: T
|
||||
locale?: string
|
||||
requestHandler?: CollectionPopulationRequestHandler
|
||||
returnNumberOfRequests?: boolean
|
||||
serverURL: string
|
||||
}): Promise<
|
||||
@@ -81,7 +77,8 @@ export const mergeData = async <T extends Record<string, any>>(args: {
|
||||
let res: PaginatedDocs
|
||||
|
||||
const ids = new Set(populations.map(({ id }) => id))
|
||||
const requestHandler = args.collectionPopulationRequestHandler || defaultRequestHandler
|
||||
const requestHandler =
|
||||
args.collectionPopulationRequestHandler || args.requestHandler || defaultRequestHandler
|
||||
|
||||
try {
|
||||
res = await requestHandler({
|
||||
|
||||
@@ -1,3 +1,5 @@
|
||||
import type { CollectionPopulationRequestHandler } from './types.js'
|
||||
|
||||
import { handleMessage } from './handleMessage.js'
|
||||
|
||||
export const subscribe = <T extends Record<string, any>>(args: {
|
||||
@@ -5,9 +7,10 @@ export const subscribe = <T extends Record<string, any>>(args: {
|
||||
callback: (data: T) => void
|
||||
depth?: number
|
||||
initialData: T
|
||||
requestHandler?: CollectionPopulationRequestHandler
|
||||
serverURL: string
|
||||
}): ((event: MessageEvent) => Promise<void> | void) => {
|
||||
const { apiRoute, callback, depth, initialData, serverURL } = args
|
||||
const { apiRoute, callback, depth, initialData, requestHandler, serverURL } = args
|
||||
|
||||
const onMessage = async (event: MessageEvent) => {
|
||||
const mergedData = await handleMessage<T>({
|
||||
@@ -15,6 +18,7 @@ export const subscribe = <T extends Record<string, any>>(args: {
|
||||
depth,
|
||||
event,
|
||||
initialData,
|
||||
requestHandler,
|
||||
serverURL,
|
||||
})
|
||||
|
||||
|
||||
@@ -1,5 +1,15 @@
|
||||
import type { DocumentEvent, FieldSchemaJSON } from 'payload'
|
||||
|
||||
export type CollectionPopulationRequestHandler = ({
|
||||
apiPath,
|
||||
endpoint,
|
||||
serverURL,
|
||||
}: {
|
||||
apiPath: string
|
||||
endpoint: string
|
||||
serverURL: string
|
||||
}) => Promise<Response>
|
||||
|
||||
export type LivePreviewArgs = {}
|
||||
|
||||
export type LivePreview = void
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/next",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
|
||||
@@ -2,7 +2,14 @@
|
||||
|
||||
import type { PaginatedDocs, Where } from 'payload'
|
||||
|
||||
import { fieldBaseClass, Pill, ReactSelect, useConfig, useTranslation } from '@payloadcms/ui'
|
||||
import {
|
||||
fieldBaseClass,
|
||||
Pill,
|
||||
ReactSelect,
|
||||
useConfig,
|
||||
useDocumentInfo,
|
||||
useTranslation,
|
||||
} from '@payloadcms/ui'
|
||||
import { formatDate } from '@payloadcms/ui/shared'
|
||||
import { stringify } from 'qs-esm'
|
||||
import React, { useCallback, useEffect, useState } from 'react'
|
||||
@@ -37,6 +44,8 @@ export const SelectComparison: React.FC<Props> = (props) => {
|
||||
},
|
||||
} = useConfig()
|
||||
|
||||
const { hasPublishedDoc } = useDocumentInfo()
|
||||
|
||||
const [options, setOptions] = useState<
|
||||
{
|
||||
label: React.ReactNode | string
|
||||
@@ -109,7 +118,10 @@ export const SelectComparison: React.FC<Props> = (props) => {
|
||||
},
|
||||
published: {
|
||||
currentLabel: t('version:currentPublishedVersion'),
|
||||
latestVersion: latestPublishedVersion,
|
||||
// The latest published version does not necessarily equal the current published version,
|
||||
// because the latest published version might have been unpublished in the meantime.
|
||||
// Hence, we should only use the latest published version if there is a published document.
|
||||
latestVersion: hasPublishedDoc ? latestPublishedVersion : undefined,
|
||||
pillStyle: 'success',
|
||||
previousLabel: t('version:previouslyPublished'),
|
||||
},
|
||||
|
||||
@@ -85,13 +85,34 @@ export async function VersionsView(props: DocumentViewServerProps) {
|
||||
payload,
|
||||
status: 'draft',
|
||||
})
|
||||
latestPublishedVersion = await getLatestVersion({
|
||||
slug: collectionSlug,
|
||||
type: 'collection',
|
||||
parentID: id,
|
||||
payload,
|
||||
status: 'published',
|
||||
const publishedDoc = await payload.count({
|
||||
collection: collectionSlug,
|
||||
depth: 0,
|
||||
overrideAccess: true,
|
||||
req,
|
||||
where: {
|
||||
id: {
|
||||
equals: id,
|
||||
},
|
||||
_status: {
|
||||
equals: 'published',
|
||||
},
|
||||
},
|
||||
})
|
||||
|
||||
// If we pass a latestPublishedVersion to buildVersionColumns,
|
||||
// this will be used to display it as the "current published version".
|
||||
// However, the latest published version might have been unpublished in the meantime.
|
||||
// Hence, we should only pass the latest published version if there is a published document.
|
||||
latestPublishedVersion =
|
||||
publishedDoc.totalDocs > 0 &&
|
||||
(await getLatestVersion({
|
||||
slug: collectionSlug,
|
||||
type: 'collection',
|
||||
parentID: id,
|
||||
payload,
|
||||
status: 'published',
|
||||
}))
|
||||
}
|
||||
} catch (err) {
|
||||
logError({ err, payload })
|
||||
|
||||
@@ -140,6 +140,13 @@ export const withPayload = (nextConfig = {}, options = {}) => {
|
||||
{ module: /node_modules\/mongodb\/lib\/bson\.js/ },
|
||||
{ file: /node_modules\/mongodb\/lib\/bson\.js/ },
|
||||
],
|
||||
plugins: [
|
||||
...(incomingWebpackConfig?.plugins || []),
|
||||
// Fix cloudflare:sockets error: https://github.com/vercel/next.js/discussions/50177
|
||||
new webpackOptions.webpack.IgnorePlugin({
|
||||
resourceRegExp: /^pg-native$|^cloudflare:sockets$/,
|
||||
}),
|
||||
],
|
||||
resolve: {
|
||||
...(incomingWebpackConfig?.resolve || {}),
|
||||
alias: {
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@payloadcms/payload-cloud",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"description": "The official Payload Cloud plugin",
|
||||
"homepage": "https://payloadcms.com",
|
||||
"repository": {
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "payload",
|
||||
"version": "3.35.1",
|
||||
"version": "3.37.0",
|
||||
"description": "Node, React, Headless CMS and Application Framework built on Next.js",
|
||||
"keywords": [
|
||||
"admin panel",
|
||||
|
||||
@@ -1,7 +0,0 @@
|
||||
const isLocked = (date: number): boolean => {
|
||||
if (!date) {
|
||||
return false
|
||||
}
|
||||
return date > Date.now()
|
||||
}
|
||||
export default isLocked
|
||||
6
packages/payload/src/auth/isUserLocked.ts
Normal file
6
packages/payload/src/auth/isUserLocked.ts
Normal file
@@ -0,0 +1,6 @@
|
||||
export const isUserLocked = (date: number): boolean => {
|
||||
if (!date) {
|
||||
return false
|
||||
}
|
||||
return date > Date.now()
|
||||
}
|
||||
@@ -138,15 +138,17 @@ export const forgotPasswordOperation = async <TSlug extends CollectionSlug>(
|
||||
return null
|
||||
}
|
||||
|
||||
user.resetPasswordToken = token
|
||||
user.resetPasswordExpiration = new Date(
|
||||
const resetPasswordExpiration = new Date(
|
||||
Date.now() + (collectionConfig.auth?.forgotPassword?.expiration ?? expiration ?? 3600000),
|
||||
).toISOString()
|
||||
|
||||
user = await payload.update({
|
||||
id: user.id,
|
||||
collection: collectionConfig.slug,
|
||||
data: user,
|
||||
data: {
|
||||
resetPasswordExpiration,
|
||||
resetPasswordToken: token,
|
||||
},
|
||||
req,
|
||||
})
|
||||
|
||||
|
||||
@@ -3,6 +3,7 @@ import type {
|
||||
AuthOperationsFromCollectionSlug,
|
||||
Collection,
|
||||
DataFromCollectionSlug,
|
||||
SanitizedCollectionConfig,
|
||||
} from '../../collections/config/types.js'
|
||||
import type { CollectionSlug } from '../../index.js'
|
||||
import type { PayloadRequest, Where } from '../../types/index.js'
|
||||
@@ -21,7 +22,7 @@ import { killTransaction } from '../../utilities/killTransaction.js'
|
||||
import sanitizeInternalFields from '../../utilities/sanitizeInternalFields.js'
|
||||
import { getFieldsToSign } from '../getFieldsToSign.js'
|
||||
import { getLoginOptions } from '../getLoginOptions.js'
|
||||
import isLocked from '../isLocked.js'
|
||||
import { isUserLocked } from '../isUserLocked.js'
|
||||
import { jwtSign } from '../jwt.js'
|
||||
import { authenticateLocalStrategy } from '../strategies/local/authenticate.js'
|
||||
import { incrementLoginAttempts } from '../strategies/local/incrementLoginAttempts.js'
|
||||
@@ -42,6 +43,32 @@ export type Arguments<TSlug extends CollectionSlug> = {
|
||||
showHiddenFields?: boolean
|
||||
}
|
||||
|
||||
type CheckLoginPermissionArgs = {
|
||||
collection: SanitizedCollectionConfig
|
||||
loggingInWithUsername?: boolean
|
||||
req: PayloadRequest
|
||||
user: any
|
||||
}
|
||||
|
||||
export const checkLoginPermission = ({
|
||||
collection,
|
||||
loggingInWithUsername,
|
||||
req,
|
||||
user,
|
||||
}: CheckLoginPermissionArgs) => {
|
||||
if (!user) {
|
||||
throw new AuthenticationError(req.t, Boolean(loggingInWithUsername))
|
||||
}
|
||||
|
||||
if (collection.auth.verify && user._verified === false) {
|
||||
throw new UnverifiedEmail({ t: req.t })
|
||||
}
|
||||
|
||||
if (isUserLocked(new Date(user.lockUntil).getTime())) {
|
||||
throw new LockedAuth(req.t)
|
||||
}
|
||||
}
|
||||
|
||||
export const loginOperation = async <TSlug extends CollectionSlug>(
|
||||
incomingArgs: Arguments<TSlug>,
|
||||
): Promise<{ user: DataFromCollectionSlug<TSlug> } & Result> => {
|
||||
@@ -184,21 +211,16 @@ export const loginOperation = async <TSlug extends CollectionSlug>(
|
||||
where: whereConstraint,
|
||||
})
|
||||
|
||||
if (!user) {
|
||||
throw new AuthenticationError(req.t, Boolean(canLoginWithUsername && sanitizedUsername))
|
||||
}
|
||||
|
||||
if (args.collection.config.auth.verify && user._verified === false) {
|
||||
throw new UnverifiedEmail({ t: req.t })
|
||||
}
|
||||
checkLoginPermission({
|
||||
collection: collectionConfig,
|
||||
loggingInWithUsername: Boolean(canLoginWithUsername && sanitizedUsername),
|
||||
req,
|
||||
user,
|
||||
})
|
||||
|
||||
user.collection = collectionConfig.slug
|
||||
user._strategy = 'local-jwt'
|
||||
|
||||
if (isLocked(new Date(user.lockUntil).getTime())) {
|
||||
throw new LockedAuth(req.t)
|
||||
}
|
||||
|
||||
const authResult = await authenticateLocalStrategy({ doc: user, password })
|
||||
|
||||
user = sanitizeInternalFields(user)
|
||||
|
||||
@@ -247,6 +247,7 @@ export const createOperation = async <
|
||||
let doc
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: collectionConfig.flattenedFields,
|
||||
forceSelect: collectionConfig.forceSelect,
|
||||
select: incomingSelect,
|
||||
})
|
||||
|
||||
@@ -110,6 +110,7 @@ export const deleteOperation = async <
|
||||
const fullWhere = combineQueries(where, accessResult)
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: collectionConfig.flattenedFields,
|
||||
forceSelect: collectionConfig.forceSelect,
|
||||
select: incomingSelect,
|
||||
})
|
||||
|
||||
@@ -168,6 +168,7 @@ export const deleteByIDOperation = async <TSlug extends CollectionSlug, TSelect
|
||||
}
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: collectionConfig.flattenedFields,
|
||||
forceSelect: collectionConfig.forceSelect,
|
||||
select: incomingSelect,
|
||||
})
|
||||
|
||||
@@ -102,6 +102,7 @@ export const findOperation = async <
|
||||
} = args
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: collectionConfig.flattenedFields,
|
||||
forceSelect: collectionConfig.forceSelect,
|
||||
select: incomingSelect,
|
||||
})
|
||||
|
||||
@@ -87,6 +87,7 @@ export const findByIDOperation = async <
|
||||
} = args
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: collectionConfig.flattenedFields,
|
||||
forceSelect: collectionConfig.forceSelect,
|
||||
select: incomingSelect,
|
||||
})
|
||||
|
||||
@@ -11,6 +11,7 @@ import { APIError, Forbidden, NotFound } from '../../errors/index.js'
|
||||
import { afterRead } from '../../fields/hooks/afterRead/index.js'
|
||||
import { killTransaction } from '../../utilities/killTransaction.js'
|
||||
import { sanitizeSelect } from '../../utilities/sanitizeSelect.js'
|
||||
import { buildVersionCollectionFields } from '../../versions/buildCollectionFields.js'
|
||||
import { getQueryDraftsSelect } from '../../versions/drafts/getQueryDraftsSelect.js'
|
||||
|
||||
export type Arguments = {
|
||||
@@ -70,8 +71,10 @@ export const findVersionByIDOperation = async <TData extends TypeWithID = any>(
|
||||
// /////////////////////////////////////
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: buildVersionCollectionFields(payload.config, collectionConfig, true),
|
||||
forceSelect: getQueryDraftsSelect({ select: collectionConfig.forceSelect }),
|
||||
select: incomingSelect,
|
||||
versions: true,
|
||||
})
|
||||
|
||||
const versionsQuery = await payload.db.findVersions<TData>({
|
||||
|
||||
@@ -72,8 +72,10 @@ export const findVersionsOperation = async <TData extends TypeWithVersion<TData>
|
||||
const fullWhere = combineQueries(where, accessResults)
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: buildVersionCollectionFields(payload.config, collectionConfig, true),
|
||||
forceSelect: getQueryDraftsSelect({ select: collectionConfig.forceSelect }),
|
||||
select: incomingSelect,
|
||||
versions: true,
|
||||
})
|
||||
|
||||
// /////////////////////////////////////
|
||||
|
||||
@@ -117,6 +117,7 @@ export const restoreVersionOperation = async <TData extends TypeWithID = any>(
|
||||
// /////////////////////////////////////
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: collectionConfig.flattenedFields,
|
||||
forceSelect: collectionConfig.forceSelect,
|
||||
select: incomingSelect,
|
||||
})
|
||||
|
||||
@@ -201,6 +201,7 @@ export const updateOperation = async <
|
||||
|
||||
try {
|
||||
const select = sanitizeSelect({
|
||||
fields: collectionConfig.flattenedFields,
|
||||
forceSelect: collectionConfig.forceSelect,
|
||||
select: incomingSelect,
|
||||
})
|
||||
|
||||
@@ -161,6 +161,7 @@ export const updateByIDOperation = async <
|
||||
})
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: collectionConfig.flattenedFields,
|
||||
forceSelect: collectionConfig.forceSelect,
|
||||
select: incomingSelect,
|
||||
})
|
||||
|
||||
@@ -83,6 +83,13 @@ export const addOrderableFieldsAndHook = (
|
||||
hidden: true,
|
||||
readOnly: true,
|
||||
},
|
||||
hooks: {
|
||||
beforeDuplicate: [
|
||||
({ siblingData }) => {
|
||||
delete siblingData[orderableFieldName]
|
||||
},
|
||||
],
|
||||
},
|
||||
index: true,
|
||||
required: true,
|
||||
// override the schema to make order fields optional for payload.create()
|
||||
@@ -275,5 +282,6 @@ export const addOrderableEndpoint = (config: SanitizedConfig) => {
|
||||
if (!config.endpoints) {
|
||||
config.endpoints = []
|
||||
}
|
||||
|
||||
config.endpoints.push(reorderEndpoint)
|
||||
}
|
||||
|
||||
@@ -1061,6 +1061,7 @@ export type SelectField = {
|
||||
} & Admin['components']
|
||||
isClearable?: boolean
|
||||
isSortable?: boolean
|
||||
placeholder?: LabelFunction | string
|
||||
} & Admin
|
||||
/**
|
||||
* Customize the SQL table name
|
||||
@@ -1093,7 +1094,7 @@ export type SelectField = {
|
||||
Omit<FieldBase, 'validate'>
|
||||
|
||||
export type SelectFieldClient = {
|
||||
admin?: AdminClient & Pick<SelectField['admin'], 'isClearable' | 'isSortable'>
|
||||
admin?: AdminClient & Pick<SelectField['admin'], 'isClearable' | 'isSortable' | 'placeholder'>
|
||||
} & FieldBaseClient &
|
||||
Pick<SelectField, 'hasMany' | 'interfaceName' | 'options' | 'type'>
|
||||
|
||||
@@ -1160,10 +1161,11 @@ type RelationshipAdmin = {
|
||||
>
|
||||
} & Admin['components']
|
||||
isSortable?: boolean
|
||||
placeholder?: LabelFunction | string
|
||||
} & Admin
|
||||
|
||||
type RelationshipAdminClient = AdminClient &
|
||||
Pick<RelationshipAdmin, 'allowCreate' | 'allowEdit' | 'appearance' | 'isSortable'>
|
||||
Pick<RelationshipAdmin, 'allowCreate' | 'allowEdit' | 'appearance' | 'isSortable' | 'placeholder'>
|
||||
|
||||
export type PolymorphicRelationshipField = {
|
||||
admin?: {
|
||||
|
||||
@@ -200,7 +200,7 @@ export const email: EmailFieldValidation = (
|
||||
* Supports multiple subdomains (e.g., user@sub.domain.example.com)
|
||||
*/
|
||||
const emailRegex =
|
||||
/^(?!.*\.\.)[\w.%+-]+@[a-z0-9](?:[a-z0-9-]*[a-z0-9])?(?:\.[a-z0-9](?:[a-z0-9-]*[a-z0-9])?)*\.[a-z]{2,}$/i
|
||||
/^(?!.*\.\.)[\w!#$%&'*+/=?^`{|}~-](?:[\w!#$%&'*+/=?^`{|}~.-]*[\w!#$%&'*+/=?^`{|}~-])?@[a-z0-9](?:[a-z0-9-]*[a-z0-9])?(?:\.[a-z0-9](?:[a-z0-9-]*[a-z0-9])?)*\.[a-z]{2,}$/i
|
||||
|
||||
if ((value && !emailRegex.test(value)) || (!value && required)) {
|
||||
return t('validation:emailAddress')
|
||||
|
||||
@@ -53,6 +53,7 @@ export const findOneOperation = async <T extends Record<string, unknown>>(
|
||||
}
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: globalConfig.flattenedFields,
|
||||
forceSelect: globalConfig.forceSelect,
|
||||
select: incomingSelect,
|
||||
})
|
||||
|
||||
@@ -11,6 +11,8 @@ import { afterRead } from '../../fields/hooks/afterRead/index.js'
|
||||
import { deepCopyObjectSimple } from '../../utilities/deepCopyObject.js'
|
||||
import { killTransaction } from '../../utilities/killTransaction.js'
|
||||
import { sanitizeSelect } from '../../utilities/sanitizeSelect.js'
|
||||
import { buildVersionCollectionFields } from '../../versions/buildCollectionFields.js'
|
||||
import { buildVersionGlobalFields } from '../../versions/buildGlobalFields.js'
|
||||
import { getQueryDraftsSelect } from '../../versions/drafts/getQueryDraftsSelect.js'
|
||||
|
||||
export type Arguments = {
|
||||
@@ -60,8 +62,10 @@ export const findVersionByIDOperation = async <T extends TypeWithVersion<T> = an
|
||||
const hasWhereAccess = typeof accessResults === 'object'
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: buildVersionGlobalFields(payload.config, globalConfig, true),
|
||||
forceSelect: getQueryDraftsSelect({ select: globalConfig.forceSelect }),
|
||||
select: incomingSelect,
|
||||
versions: true,
|
||||
})
|
||||
|
||||
const findGlobalVersionsArgs: FindGlobalVersionsArgs = {
|
||||
|
||||
@@ -70,8 +70,10 @@ export const findVersionsOperation = async <T extends TypeWithVersion<T>>(
|
||||
const fullWhere = combineQueries(where, accessResults)
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: buildVersionGlobalFields(payload.config, globalConfig, true),
|
||||
forceSelect: getQueryDraftsSelect({ select: globalConfig.forceSelect }),
|
||||
select: incomingSelect,
|
||||
versions: true,
|
||||
})
|
||||
|
||||
// /////////////////////////////////////
|
||||
|
||||
@@ -246,6 +246,7 @@ export const updateOperation = async <
|
||||
// /////////////////////////////////////
|
||||
|
||||
const select = sanitizeSelect({
|
||||
fields: globalConfig.flattenedFields,
|
||||
forceSelect: globalConfig.forceSelect,
|
||||
select: incomingSelect,
|
||||
})
|
||||
|
||||
@@ -89,6 +89,10 @@ import { traverseFields } from './utilities/traverseFields.js'

export { default as executeAccess } from './auth/executeAccess.js'
export { executeAuthStrategies } from './auth/executeAuthStrategies.js'
export { extractAccessFromPermission } from './auth/extractAccessFromPermission.js'
export { getAccessResults } from './auth/getAccessResults.js'
export { getFieldsToSign } from './auth/getFieldsToSign.js'
export { getLoginOptions } from './auth/getLoginOptions.js'

export interface GeneratedTypes {
authUntyped: {
@@ -977,13 +981,12 @@ interface RequestContext {
// eslint-disable-next-line @typescript-eslint/no-empty-object-type
export interface DatabaseAdapter extends BaseDatabaseAdapter {}
export type { Payload, RequestContext }
export { extractAccessFromPermission } from './auth/extractAccessFromPermission.js'
export { getAccessResults } from './auth/getAccessResults.js'
export { getFieldsToSign } from './auth/getFieldsToSign.js'
export * from './auth/index.js'
export { jwtSign } from './auth/jwt.js'
export { accessOperation } from './auth/operations/access.js'
export { forgotPasswordOperation } from './auth/operations/forgotPassword.js'
export { initOperation } from './auth/operations/init.js'
export { checkLoginPermission } from './auth/operations/login.js'
export { loginOperation } from './auth/operations/login.js'
export { logoutOperation } from './auth/operations/logout.js'
export type { MeOperationResult } from './auth/operations/me.js'
@@ -994,6 +997,8 @@ export { resetPasswordOperation } from './auth/operations/resetPassword.js'
export { unlockOperation } from './auth/operations/unlock.js'
export { verifyEmailOperation } from './auth/operations/verifyEmail.js'
export { JWTAuthentication } from './auth/strategies/jwt.js'
export { incrementLoginAttempts } from './auth/strategies/local/incrementLoginAttempts.js'
export { resetLoginAttempts } from './auth/strategies/local/resetLoginAttempts.js'
export type {
AuthStrategyFunction,
AuthStrategyFunctionArgs,
@@ -1201,6 +1206,7 @@ export {
MissingFile,
NotFound,
QueryError,
UnverifiedEmail,
ValidationError,
ValidationErrorName,
} from './errors/index.js'

@@ -74,7 +74,7 @@ export const getConstraints = (config: Config): Field => ({
},
],
},
relationTo: 'users',
relationTo: config.admin?.user ?? 'users', // TODO: remove this fallback when the args are properly typed as `SanitizedConfig`
},
...(config?.queryPresets?.constraints?.[operation]?.reduce(
(acc: Field[], option: QueryPresetConstraint) => {

@@ -10,7 +10,7 @@ import type { SanitizedConfig } from '../config/types.js'
import type { PayloadRequest } from '../types/index.js'
import type { FileData, FileToSave, ProbedImageSize, UploadEdits } from './types.js'

import { FileRetrievalError, FileUploadError, MissingFile } from '../errors/index.js'
import { FileRetrievalError, FileUploadError, Forbidden, MissingFile } from '../errors/index.js'
import { canResizeImage } from './canResizeImage.js'
import { cropImage } from './cropImage.js'
import { getExternalFile } from './getExternalFile.js'
@@ -85,6 +85,10 @@ export const generateFileData = async <T>({
if (!file && uploadEdits && incomingFileData) {
const { filename, url } = incomingFileData as FileData

if (filename && (filename.includes('../') || filename.includes('..\\'))) {
throw new Forbidden(req.t)
}

try {
if (url && url.startsWith('/') && !disableLocalStorage) {
const filePath = `${staticPath}/${filename}`

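The hunk above adds a guard that rejects path-traversal sequences in an incoming filename before it is ever joined onto `staticPath`. A minimal standalone sketch of the same check (the `isSafeFilename` helper and the sample names are illustrative, not part of this change):

```ts
// Illustrative only: mirrors the traversal guard added to generateFileData.
const isSafeFilename = (filename: string): boolean =>
  !filename.includes('../') && !filename.includes('..\\')

console.log(isSafeFilename('photo.png')) // true  -> allowed
console.log(isSafeFilename('../../etc/passwd')) // false -> would be rejected with Forbidden
```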
@@ -5,28 +5,28 @@ import path from 'path'

import type { PayloadRequest } from '../types/index.js'

const mimeTypeEstimate = {
const mimeTypeEstimate: Record<string, string> = {
svg: 'image/svg+xml',
}

export const getFileByPath = async (filePath: string): Promise<PayloadRequest['file']> => {
if (typeof filePath === 'string') {
const data = await fs.readFile(filePath)
const mimetype = fileTypeFromFile(filePath)
const { size } = await fs.stat(filePath)

const name = path.basename(filePath)
const ext = path.extname(filePath).slice(1)

const mime = (await mimetype)?.mime || mimeTypeEstimate[ext]

return {
name,
data,
mimetype: mime,
size,
}
if (typeof filePath !== 'string') {
return undefined
}

return undefined
const name = path.basename(filePath)
const ext = path.extname(filePath).slice(1)

const [data, stat, type] = await Promise.all([
fs.readFile(filePath),
fs.stat(filePath),
fileTypeFromFile(filePath),
])

return {
name,
data,
mimetype: type?.mime || mimeTypeEstimate[ext],
size: stat.size,
}
}

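The rewritten `getFileByPath` flips the original nesting into an early return and then reads the file contents, stats, and detected type in parallel. A hedged standalone sketch of that read pattern using only `fs/promises` and `path` (the MIME detection via `fileTypeFromFile` and the extension fallback map are omitted here):

```ts
import fs from 'fs/promises'
import path from 'path'

// Sketch of the parallel-read shape used above; not the actual helper.
const readLocalFile = async (filePath: string) => {
  const [data, stat] = await Promise.all([fs.readFile(filePath), fs.stat(filePath)])

  return {
    name: path.basename(filePath),
    data,
    size: stat.size,
  }
}
```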
@@ -1,17 +1,129 @@
import { deepMergeSimple } from '@payloadcms/translations/utilities'

import type { SelectType } from '../types/index.js'
import type { FlattenedField } from '../fields/config/types.js'
import type { SelectIncludeType, SelectType } from '../types/index.js'

import { getSelectMode } from './getSelectMode.js'

// Transform post.title -> post, post.category.title -> post
const stripVirtualPathToCurrentCollection = ({
fields,
path,
versions,
}: {
fields: FlattenedField[]
path: string
versions: boolean
}) => {
const resultSegments: string[] = []

if (versions) {
resultSegments.push('version')
const versionField = fields.find((each) => each.name === 'version')

if (versionField && versionField.type === 'group') {
fields = versionField.flattenedFields
}
}

for (const segment of path.split('.')) {
const field = fields.find((each) => each.name === segment)

if (!field) {
continue
}

resultSegments.push(segment)

if (field.type === 'relationship' || field.type === 'upload') {
return resultSegments.join('.')
}
}

return resultSegments.join('.')
}

const getAllVirtualRelations = ({ fields }: { fields: FlattenedField[] }) => {
const result: string[] = []

for (const field of fields) {
if ('virtual' in field && typeof field.virtual === 'string') {
result.push(field.virtual)
} else if (field.type === 'group' || field.type === 'tab') {
const nestedResult = getAllVirtualRelations({ fields: field.flattenedFields })

for (const nestedItem of nestedResult) {
result.push(nestedItem)
}
}
}

return result
}

const resolveVirtualRelationsToSelect = ({
fields,
selectValue,
topLevelFields,
versions,
}: {
fields: FlattenedField[]
selectValue: SelectIncludeType | true
topLevelFields: FlattenedField[]
versions: boolean
}) => {
const result: string[] = []
if (selectValue === true) {
for (const item of getAllVirtualRelations({ fields })) {
result.push(
stripVirtualPathToCurrentCollection({ fields: topLevelFields, path: item, versions }),
)
}
} else {
for (const fieldName in selectValue) {
const field = fields.find((each) => each.name === fieldName)
if (!field) {
continue
}

if ('virtual' in field && typeof field.virtual === 'string') {
result.push(
stripVirtualPathToCurrentCollection({
fields: topLevelFields,
path: field.virtual,
versions,
}),
)
} else if (field.type === 'group' || field.type === 'tab') {
for (const item of resolveVirtualRelationsToSelect({
fields: field.flattenedFields,
selectValue: selectValue[fieldName],
topLevelFields,
versions,
})) {
result.push(
stripVirtualPathToCurrentCollection({ fields: topLevelFields, path: item, versions }),
)
}
}
}
}

return result
}

export const sanitizeSelect = ({
fields,
forceSelect,
select,
versions,
}: {
fields: FlattenedField[]
forceSelect?: SelectType
select?: SelectType
versions?: boolean
}): SelectType | undefined => {
if (!forceSelect || !select) {
if (!select) {
return select
}

@@ -21,5 +133,36 @@ export const sanitizeSelect = ({
return select
}

return deepMergeSimple(select, forceSelect)
if (forceSelect) {
select = deepMergeSimple(select, forceSelect)
}

if (select) {
const virtualRelations = resolveVirtualRelationsToSelect({
fields,
selectValue: select as SelectIncludeType,
topLevelFields: fields,
versions: versions ?? false,
})

for (const path of virtualRelations) {
let currentRef = select
const segments = path.split('.')
for (let i = 0; i < segments.length; i++) {
const isLast = segments.length - 1 === i
const segment = segments[i]

if (isLast) {
currentRef[segment] = true
} else {
if (!(segment in currentRef)) {
currentRef[segment] = {}
currentRef = currentRef[segment]
}
}
}
}
}

return select
}

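As I read the new `sanitizeSelect` logic, when a selected field is virtual and backed by a relationship path (for example `virtual: 'post.title'`), the path is stripped back to its nearest relationship segment and that segment is merged into the select object, so the backing relationship is still fetched. A hypothetical illustration (these field definitions are not from the diff):

```ts
// Hypothetical flattened fields for a `comments` collection, illustration only.
const exampleFields = [
  { name: 'post', type: 'relationship', relationTo: 'posts' },
  // Virtual field that resolves through the relationship path `post.title`.
  { name: 'postTitle', type: 'text', virtual: 'post.title' },
]

// With the change above, a select such as { postTitle: true } should end up
// expanded so the backing relationship is selected as well, roughly:
// { postTitle: true, post: true }
```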
@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-cloud-storage",
"version": "3.35.1",
"version": "3.37.0",
"description": "The official cloud storage plugin for Payload CMS",
"homepage": "https://payloadcms.com",
"repository": {

@@ -26,6 +26,7 @@ export async function getFilePrefix({
const files = await req.payload.find({
collection: collection.slug,
depth: 0,
draft: true,
limit: 1,
pagination: false,
where: {

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-form-builder",
"version": "3.35.1",
"version": "3.37.0",
"description": "Form builder plugin for Payload CMS",
"keywords": [
"payload",

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-import-export",
"version": "3.35.1",
"version": "3.37.0",
"description": "Import-Export plugin for Payload",
"keywords": [
"payload",

@@ -87,7 +87,7 @@ export const createExport = async (args: CreateExportArgs) => {
let isFirstBatch = true

while (result.docs.length > 0) {
const csvInput = result.docs.map((doc) => flattenObject(doc))
const csvInput = result.docs.map((doc) => flattenObject({ doc, fields }))
const csvString = stringify(csvInput, { header: isFirstBatch })
this.push(encoder.encode(csvString))
isFirstBatch = false
@@ -119,7 +119,7 @@ export const createExport = async (args: CreateExportArgs) => {
result = await payload.find(findArgs)

if (isCSV) {
const csvInput = result.docs.map((doc) => flattenObject(doc))
const csvInput = result.docs.map((doc) => flattenObject({ doc, fields }))
outputData.push(stringify(csvInput, { header: isFirstBatch }))
isFirstBatch = false
} else {

@@ -1,23 +1,61 @@
export const flattenObject = (obj: any, prefix: string = ''): Record<string, unknown> => {
import type { Document } from 'payload'

type Args = {
doc: Document
fields?: string[]
prefix?: string
}

export const flattenObject = ({ doc, fields, prefix }: Args): Record<string, unknown> => {
const result: Record<string, unknown> = {}

Object.entries(obj).forEach(([key, value]) => {
const newKey = prefix ? `${prefix}_${key}` : key
const flatten = (doc: Document, prefix?: string) => {
Object.entries(doc).forEach(([key, value]) => {
const newKey = prefix ? `${prefix}_${key}` : key

if (Array.isArray(value)) {
value.forEach((item, index) => {
if (typeof item === 'object' && item !== null) {
Object.assign(result, flattenObject(item, `${newKey}_${index}`))
} else {
result[`${newKey}_${index}`] = item
}
})
} else if (typeof value === 'object' && value !== null) {
Object.assign(result, flattenObject(value, newKey))
} else {
result[newKey] = value
if (Array.isArray(value)) {
value.forEach((item, index) => {
if (typeof item === 'object' && item !== null) {
flatten(item, `${newKey}_${index}`)
} else {
result[`${newKey}_${index}`] = item
}
})
} else if (typeof value === 'object' && value !== null) {
flatten(value, newKey)
} else {
result[newKey] = value
}
})
}

flatten(doc, prefix)

if (fields) {
const orderedResult: Record<string, unknown> = {}

const fieldToRegex = (field: string): RegExp => {
const parts = field.split('.').map((part) => `${part}(?:_\\d+)?`)
const pattern = `^${parts.join('_')}`
return new RegExp(pattern)
}
})

fields.forEach((field) => {
if (result[field.replace(/\./g, '_')]) {
const sanitizedField = field.replace(/\./g, '_')
orderedResult[sanitizedField] = result[sanitizedField]
} else {
const regex = fieldToRegex(field)
Object.keys(result).forEach((key) => {
if (regex.test(key)) {
orderedResult[key] = result[key]
}
})
}
})

return orderedResult
}

return result
}

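The reworked `flattenObject` keeps the same recursive flattening (array entries become `key_0_...`, nested objects `parent_child`), but when a `fields` list is supplied it returns only the matching keys, ordered to follow that list; array indices are matched with a `field(?:_\d+)?` prefix regex. An invented sample to show the expected shape:

```ts
// Invented sample document, for illustration only.
const doc = {
  title: 'Hello',
  tags: [{ label: 'a' }, { label: 'b' }],
  meta: { description: 'Hi' },
}

// flattenObject({ doc }) would yield keys like:
// { title: 'Hello', tags_0_label: 'a', tags_1_label: 'b', meta_description: 'Hi' }

// flattenObject({ doc, fields: ['meta.description', 'title'] }) would keep only
// the matching keys and order them after the fields list:
// { meta_description: 'Hi', title: 'Hello' }
```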
@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-multi-tenant",
"version": "3.35.1",
"version": "3.37.0",
"description": "Multi Tenant plugin for Payload",
"keywords": [
"payload",

@@ -14,6 +14,7 @@ export const findTenantOptions = async ({
useAsTitle,
user,
}: Args): Promise<PaginatedDocs> => {
const isOrderable = payload.collections[tenantsCollectionSlug]?.config?.orderable || false
return payload.find({
collection: tenantsCollectionSlug,
depth: 0,
@@ -21,8 +22,9 @@ export const findTenantOptions = async ({
overrideAccess: false,
select: {
[useAsTitle]: true,
...(isOrderable ? { _order: true } : {}),
},
sort: useAsTitle,
sort: isOrderable ? '_order' : useAsTitle,
user,
})
}

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-nested-docs",
"version": "3.35.1",
"version": "3.37.0",
"description": "The official Nested Docs plugin for Payload",
"homepage": "https://payloadcms.com",
"repository": {

@@ -22,7 +22,6 @@ type ResaveArgs = {

const resave = async ({ collection, doc, draft, pluginConfig, req }: ResaveArgs) => {
const parentSlug = pluginConfig?.parentFieldSlug || 'parent'
const breadcrumbSlug = pluginConfig.breadcrumbsFieldSlug || 'breadcrumbs'

if (draft) {
// If the parent is a draft, don't resave children

@@ -8,47 +8,39 @@ import type { Breadcrumb, NestedDocsPluginConfig } from '../types.js'
export const resaveSelfAfterCreate =
(pluginConfig: NestedDocsPluginConfig, collection: CollectionConfig): CollectionAfterChangeHook =>
async ({ doc, operation, req }) => {
if (operation !== 'create') {
return undefined
}

const { locale, payload } = req
const breadcrumbSlug = pluginConfig.breadcrumbsFieldSlug || 'breadcrumbs'
const breadcrumbs = doc[breadcrumbSlug] as unknown as Breadcrumb[]

if (operation === 'create') {
const originalDocWithDepth0 = await payload.findByID({
const updateAsDraft =
typeof collection.versions === 'object' &&
collection.versions.drafts &&
doc._status !== 'published'

try {
await payload.update({
id: doc.id,
collection: collection.slug,
data: {
[breadcrumbSlug]:
breadcrumbs?.map((crumb, i) => ({
...crumb,
doc: breadcrumbs.length === i + 1 ? doc.id : crumb.doc,
})) || [],
},
depth: 0,
draft: updateAsDraft,
locale,
req,
})

const updateAsDraft =
typeof collection.versions === 'object' &&
collection.versions.drafts &&
doc._status !== 'published'

try {
await payload.update({
id: doc.id,
collection: collection.slug,
data: {
...originalDocWithDepth0,
[breadcrumbSlug]:
breadcrumbs?.map((crumb, i) => ({
...crumb,
doc: breadcrumbs.length === i + 1 ? doc.id : crumb.doc,
})) || [],
},
depth: 0,
draft: updateAsDraft,
locale,
req,
})
} catch (err: unknown) {
payload.logger.error(
`Nested Docs plugin has had an error while adding breadcrumbs during document creation.`,
)
payload.logger.error(err)
}
} catch (err: unknown) {
payload.logger.error(
`Nested Docs plugin has had an error while adding breadcrumbs during document creation.`,
)
payload.logger.error(err)
}

return undefined
}

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-redirects",
"version": "3.35.1",
"version": "3.37.0",
"description": "Redirects plugin for Payload",
"keywords": [
"payload",

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-search",
"version": "3.35.1",
"version": "3.37.0",
"description": "Search plugin for Payload",
"keywords": [
"payload",

@@ -124,14 +124,15 @@ export const generateReindexHandler =
for (let i = 0; i < totalBatches; i++) {
const { docs } = await payload.find({
collection,
depth: 0,
limit: batchSize,
locale: localeToSync,
page: i + 1,
...defaultLocalApiProps,
})

const promises = docs.map((doc) =>
syncDocAsSearchIndex({
for (const doc of docs) {
await syncDocAsSearchIndex({
collection,
doc,
locale: localeToSync,
@@ -139,12 +140,7 @@ export const generateReindexHandler =
operation,
pluginConfig,
req,
}),
)

// Sequentially await promises to avoid transaction issues
for (const promise of promises) {
await promise
})
}
}
}

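Instead of mapping every document to a promise up front, the reindex handler now awaits each `syncDocAsSearchIndex` call inside the loop, so writes run one after another against the shared `req` (and therefore the same transaction). A generic sketch of that pattern, with `processDoc` as a stand-in for the sync call:

```ts
// Generic sequential-processing sketch; processDoc stands in for
// syncDocAsSearchIndex bound to the shared req/transaction.
const processSequentially = async <T>(
  docs: T[],
  processDoc: (doc: T) => Promise<void>,
): Promise<void> => {
  for (const doc of docs) {
    await processDoc(doc) // one at a time, never in parallel
  }
}
```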
@@ -64,18 +64,17 @@ export const syncDocAsSearchIndex = async ({
const doSync = syncDrafts || (!syncDrafts && status !== 'draft')

try {
if (operation === 'create') {
if (doSync) {
await payload.create({
collection: searchSlug,
data: {
...dataToSave,
priority: defaultPriority,
},
locale: syncLocale,
req,
})
}
if (operation === 'create' && doSync) {
await payload.create({
collection: searchSlug,
data: {
...dataToSave,
priority: defaultPriority,
},
depth: 0,
locale: syncLocale,
req,
})
}

if (operation === 'update') {
@@ -110,6 +109,7 @@ export const syncDocAsSearchIndex = async ({
const duplicativeDocIDs = duplicativeDocs.map(({ id }) => id)
await payload.delete({
collection: searchSlug,
depth: 0,
req,
where: { id: { in: duplicativeDocIDs } },
})
@@ -134,6 +134,7 @@ export const syncDocAsSearchIndex = async ({
...dataToSave,
priority: foundDoc.priority || defaultPriority,
},
depth: 0,
locale: syncLocale,
req,
})
@@ -148,6 +149,7 @@ export const syncDocAsSearchIndex = async ({
docs: [docWithPublish],
} = await payload.find({
collection,
depth: 0,
draft: false,
limit: 1,
locale: syncLocale,
@@ -175,6 +177,7 @@ export const syncDocAsSearchIndex = async ({
await payload.delete({
id: searchDocID,
collection: searchSlug,
depth: 0,
req,
})
} catch (err: unknown) {
@@ -190,6 +193,7 @@ export const syncDocAsSearchIndex = async ({
...dataToSave,
priority: defaultPriority,
},
depth: 0,
locale: syncLocale,
req,
})

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-sentry",
"version": "3.35.1",
"version": "3.37.0",
"description": "Sentry plugin for Payload",
"keywords": [
"payload",

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-seo",
"version": "3.35.1",
"version": "3.37.0",
"description": "SEO plugin for Payload",
"keywords": [
"payload",

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-stripe",
"version": "3.35.1",
"version": "3.37.0",
"description": "Stripe plugin for Payload",
"keywords": [
"payload",

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/richtext-lexical",
"version": "3.35.1",
"version": "3.37.0",
"description": "The officially supported Lexical richtext adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {

@@ -284,10 +284,22 @@ export const InlineBlockComponent: React.FC<Props> = (props) => {
)
// cleanup effect
useEffect(() => {
const isStateOutOfSync = (formData: InlineBlockFields, initialState: FormState) => {
return Object.keys(initialState).some(
(key) => initialState[key] && formData[key] !== initialState[key].value,
)
}

return () => {
// If the component is unmounted (either via removeInlineBlock or via lexical itself) and the form state got changed before,
// we need to reset the initial state to force a re-fetch of the initial state when it gets mounted again (e.g. via lexical history undo).
// Otherwise it would use an outdated initial state.
if (initialState && isStateOutOfSync(formData, initialState)) {
setInitialState(false)
}
abortAndIgnore(onChangeAbortControllerRef.current)
}
}, [])
}, [formData, initialState])

/**
* HANDLE FORM SUBMIT

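The cleanup effect now compares the latest form data against the cached initial state and clears the cache when they have diverged, so a block that is unmounted and remounted (for example via an undo in Lexical history) refetches fresh state instead of reusing stale data. A simplified standalone sketch of the comparison (the types here are stand-ins, not the editor's real ones):

```ts
// Simplified stand-ins for the editor's form types, illustration only.
type FormState = Record<string, { value: unknown } | undefined>
type FormData = Record<string, unknown>

// True when any cached field value no longer matches the current form data.
const isStateOutOfSync = (formData: FormData, initialState: FormState): boolean =>
  Object.keys(initialState).some(
    (key) => initialState[key] !== undefined && formData[key] !== initialState[key]!.value,
  )
```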
@@ -56,22 +56,15 @@ export const BlocksPlugin: PluginComponent = () => {

if ($isRangeSelection(selection)) {
const blockNode = $createBlockNode(payload)

// we need to get the focus node before inserting the block node, as $insertNodeToNearestRoot can change the focus node
const { focus } = selection
const focusNode = focus.getNode()
// Insert blocks node BEFORE potentially removing focusNode, as $insertNodeToNearestRoot errors if the focusNode doesn't exist
$insertNodeToNearestRoot(blockNode)

const { focus } = selection
const focusNode = focus.getNode()

// First, delete currently selected node if it's an empty paragraph and if there are sufficient
// paragraph nodes (more than 1) left in the parent node, so that we don't "trap" the user
if (
$isParagraphNode(focusNode) &&
focusNode.getTextContentSize() === 0 &&
focusNode
.getParentOrThrow()
.getChildren()
.filter((node) => $isParagraphNode(node)).length > 1
) {
// Delete the node if it's an empty paragraph
if ($isParagraphNode(focusNode) && !focusNode.__first) {
focusNode.remove()
}
}

@@ -5,6 +5,12 @@ import type {
SerializedParagraphNode,
SerializedTextNode,
SerializedLineBreakNode,
SerializedHeadingNode,
SerializedListItemNode,
SerializedListNode,
SerializedTableRowNode,
SerializedTableNode,
SerializedTableCellNode,
} from '../../../nodeTypes.js'
import { convertLexicalToPlaintext } from './sync/index.js'

@@ -51,7 +57,83 @@ function paragraphNode(children: DefaultNodeTypes[]): SerializedParagraphNode {
}
}

function rootNode(nodes: DefaultNodeTypes[]): DefaultTypedEditorState {
function headingNode(children: DefaultNodeTypes[]): SerializedHeadingNode {
return {
type: 'heading',
children,
direction: 'ltr',
format: '',
indent: 0,
textFormat: 0,
tag: 'h1',
version: 1,
}
}

function listItemNode(children: DefaultNodeTypes[]): SerializedListItemNode {
return {
type: 'listitem',
children,
checked: false,
direction: 'ltr',
format: '',
indent: 0,
value: 0,
version: 1,
}
}

function listNode(children: DefaultNodeTypes[]): SerializedListNode {
return {
type: 'list',
children,
direction: 'ltr',
format: '',
indent: 0,
listType: 'bullet',
start: 0,
tag: 'ul',
version: 1,
}
}

function tableNode(children: (DefaultNodeTypes | SerializedTableRowNode)[]): SerializedTableNode {
return {
type: 'table',
children,
direction: 'ltr',
format: '',
indent: 0,
version: 1,
}
}

function tableRowNode(
children: (DefaultNodeTypes | SerializedTableCellNode)[],
): SerializedTableRowNode {
return {
type: 'tablerow',
children,
direction: 'ltr',
format: '',
indent: 0,
version: 1,
}
}

function tableCellNode(children: DefaultNodeTypes[]): SerializedTableCellNode {
return {
type: 'tablecell',
children,
direction: 'ltr',
format: '',
indent: 0,
headerState: 0,
version: 1,
}
}

function rootNode(nodes: (DefaultNodeTypes | SerializedTableNode)[]): DefaultTypedEditorState {
return {
root: {
type: 'root',
@@ -72,7 +154,6 @@ describe('convertLexicalToPlaintext', () => {
data,
})

console.log('plaintext', plaintext)
expect(plaintext).toBe('Basic Text')
})

@@ -111,4 +192,67 @@ describe('convertLexicalToPlaintext', () => {

expect(plaintext).toBe('Basic Text\tNext Line')
})

it('ensure new lines are added between paragraphs', () => {
const data: DefaultTypedEditorState = rootNode([
paragraphNode([textNode('Basic text')]),
paragraphNode([textNode('Next block-node')]),
])

const plaintext = convertLexicalToPlaintext({
data,
})

expect(plaintext).toBe('Basic text\n\nNext block-node')
})

it('ensure new lines are added between heading nodes', () => {
const data: DefaultTypedEditorState = rootNode([
headingNode([textNode('Basic text')]),
headingNode([textNode('Next block-node')]),
])

const plaintext = convertLexicalToPlaintext({
data,
})

expect(plaintext).toBe('Basic text\n\nNext block-node')
})

it('ensure new lines are added between list items and lists', () => {
const data: DefaultTypedEditorState = rootNode([
listNode([listItemNode([textNode('First item')]), listItemNode([textNode('Second item')])]),
listNode([listItemNode([textNode('Next list')])]),
])

const plaintext = convertLexicalToPlaintext({
data,
})

expect(plaintext).toBe('First item\nSecond item\n\nNext list')
})

it('ensure new lines are added between tables, table rows, and table cells', () => {
const data: DefaultTypedEditorState = rootNode([
tableNode([
tableRowNode([
tableCellNode([textNode('Cell 1, Row 1')]),
tableCellNode([textNode('Cell 2, Row 1')]),
]),
tableRowNode([
tableCellNode([textNode('Cell 1, Row 2')]),
tableCellNode([textNode('Cell 2, Row 2')]),
]),
]),
tableNode([tableRowNode([tableCellNode([textNode('Cell in Table 2')])])]),
])

const plaintext = convertLexicalToPlaintext({
data,
})

expect(plaintext).toBe(
'Cell 1, Row 1 | Cell 2, Row 1\nCell 1, Row 2 | Cell 2, Row 2\n\nCell in Table 2',
)
})
})

@@ -86,11 +86,25 @@ export function convertLexicalNodesToPlaintext({
}
} else {
// Default plaintext converter heuristic
if (node.type === 'paragraph') {
if (
node.type === 'paragraph' ||
node.type === 'heading' ||
node.type === 'list' ||
node.type === 'table'
) {
if (plainTextArray?.length) {
// Only add a new line if there is already text in the array
plainTextArray.push('\n\n')
}
} else if (node.type === 'listitem' || node.type === 'tablerow') {
if (plainTextArray?.length) {
// Only add a new line if there is already text in the array
plainTextArray.push('\n')
}
} else if (node.type === 'tablecell') {
if (plainTextArray?.length) {
plainTextArray.push(' | ')
}
} else if (node.type === 'linebreak') {
plainTextArray.push('\n')
} else if (node.type === 'tab') {

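Taken together, the default converter now separates block-level nodes (paragraph, heading, list, table) with a blank line, list items and table rows with a single newline, and table cells with ' | ', which is what the new tests above assert. A rough standalone sketch of that separator mapping (not the converter itself):

```ts
// Rough lookup of the separator the default heuristic inserts before a node
// once some text has already been emitted; line breaks always emit '\n'.
const separatorFor = (nodeType: string): string => {
  if (['paragraph', 'heading', 'list', 'table'].includes(nodeType)) return '\n\n'
  if (['listitem', 'tablerow', 'linebreak'].includes(nodeType)) return '\n'
  if (nodeType === 'tablecell') return ' | '
  return ''
}
```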
@@ -53,22 +53,14 @@ export const RelationshipPlugin: PluginComponent<RelationshipFeatureProps> = ({

if ($isRangeSelection(selection)) {
const relationshipNode = $createRelationshipNode(payload)
// we need to get the focus node before inserting the block node, as $insertNodeToNearestRoot can change the focus node
const { focus } = selection
const focusNode = focus.getNode()
// Insert relationship node BEFORE potentially removing focusNode, as $insertNodeToNearestRoot errors if the focusNode doesn't exist
$insertNodeToNearestRoot(relationshipNode)

const { focus } = selection
const focusNode = focus.getNode()

// First, delete currently selected node if it's an empty paragraph and if there are sufficient
// paragraph nodes (more than 1) left in the parent node, so that we don't "trap" the user
if (
$isParagraphNode(focusNode) &&
focusNode.getTextContentSize() === 0 &&
focusNode
.getParentOrThrow()
.getChildren()
.filter((node) => $isParagraphNode(node)).length > 1
) {
// Delete the node if it's an empty paragraph
if ($isParagraphNode(focusNode) && !focusNode.__first) {
focusNode.remove()
}
}

@@ -53,18 +53,14 @@ export const UploadPlugin: PluginComponent<UploadFeaturePropsClient> = ({ client
value: payload.value,
},
})
// we need to get the focus node before inserting the block node, as $insertNodeToNearestRoot can change the focus node
const { focus } = selection
const focusNode = focus.getNode()
// Insert upload node BEFORE potentially removing focusNode, as $insertNodeToNearestRoot errors if the focusNode doesn't exist
$insertNodeToNearestRoot(uploadNode)

const { focus } = selection
const focusNode = focus.getNode()

// Delete the node if it's an empty paragraph and it has at least one sibling, so that we don't "trap" the user
if (
$isParagraphNode(focusNode) &&
!focusNode.__first &&
(focusNode.__prev || focusNode.__next)
) {
// Delete the node if it's an empty paragraph
if ($isParagraphNode(focusNode) && !focusNode.__first) {
focusNode.remove()
}
}

@@ -4,13 +4,7 @@ import { LexicalErrorBoundary } from '@lexical/react/LexicalErrorBoundary.js'
import { HistoryPlugin } from '@lexical/react/LexicalHistoryPlugin.js'
import { OnChangePlugin } from '@lexical/react/LexicalOnChangePlugin.js'
import { RichTextPlugin } from '@lexical/react/LexicalRichTextPlugin.js'
import {
$createParagraphNode,
$getRoot,
BLUR_COMMAND,
COMMAND_PRIORITY_LOW,
FOCUS_COMMAND,
} from 'lexical'
import { BLUR_COMMAND, COMMAND_PRIORITY_LOW, FOCUS_COMMAND } from 'lexical'
import * as React from 'react'
import { useEffect, useState } from 'react'

@@ -24,6 +18,7 @@ import { AddBlockHandlePlugin } from './plugins/handles/AddBlockHandlePlugin/ind
import { DraggableBlockPlugin } from './plugins/handles/DraggableBlockPlugin/index.js'
import { InsertParagraphAtEndPlugin } from './plugins/InsertParagraphAtEnd/index.js'
import { MarkdownShortcutPlugin } from './plugins/MarkdownShortcut/index.js'
import { NormalizeSelectionPlugin } from './plugins/NormalizeSelection/index.js'
import { SlashMenuPlugin } from './plugins/SlashMenu/index.js'
import { TextPlugin } from './plugins/TextPlugin/index.js'
import { LexicalContentEditable } from './ui/ContentEditable.js'
@@ -112,6 +107,7 @@ export const LexicalEditor: React.FC<
}
ErrorBoundary={LexicalErrorBoundary}
/>
<NormalizeSelectionPlugin />
<InsertParagraphAtEndPlugin />
<DecoratorPlugin />
<TextPlugin features={editorConfig.features} />

Some files were not shown because too many files have changed in this diff.