Compare commits

..

1 Commits

Author SHA1 Message Date
Germán Jabloñski
b65588f79d Revert "fix(db-postgres): sort by distance when the near operator is used (…"
This reverts commit 9955818503.
2025-04-24 16:19:31 -03:00
179 changed files with 497 additions and 3172 deletions

View File

@@ -298,15 +298,3 @@ Passing your migrations as shown above will tell Payload, in production only, to
may slow down serverless cold starts on platforms such as Vercel. Generally,
this option should only be used for long-running servers / containers.
</Banner>
## Environment-Specific Configurations and Migrations
Your configuration may include environment-specific settings (e.g., enabling a plugin only in production). If you generate migrations without considering the environment, the generated schema can drift from what a given environment actually needs. When generating migrations locally, Payload reads your development configuration, which might miss production-specific collections or fields. Similarly, generating migrations in production could miss development-specific entities.
This is an easy oversight, so be mindful of any environment-specific logic in your config when handling migrations; a sketch of an environment-gated config follows the list below.
**Ways to address this:**
- Manually update your migration file after it is generated to include any environment-specific configurations.
- Temporarily enable any required production environment variables in your local setup when generating the migration to capture the necessary updates.
- Use separate migration files for each environment to ensure the correct migration is executed in the corresponding environment.
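For illustration, here is a minimal sketch of an environment-gated config of the kind described above. The plugin import and environment variable names are assumptions, not part of Payload's docs:
```ts
// payload.config.ts — sketch of a production-only plugin (assumed names)
import { mongooseAdapter } from '@payloadcms/db-mongodb'
import { buildConfig } from 'payload'
// hypothetical plugin, for illustration only
import { analyticsPlugin } from './plugins/analytics'

export default buildConfig({
  secret: process.env.PAYLOAD_SECRET || '',
  db: mongooseAdapter({ url: process.env.DATABASE_URI || '' }),
  collections: [],
  plugins: [
    // Enabled only in production: a migration generated locally will not
    // include any collections or fields this plugin adds.
    ...(process.env.NODE_ENV === 'production' ? [analyticsPlugin()] : []),
  ],
})
```
Following the second suggestion above, you could temporarily run `payload migrate:create` locally with `NODE_ENV=production` (and any other production variables) set, so the generated migration captures the plugin's schema.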

View File

@@ -94,7 +94,6 @@ The Relationship Field inherits all of the default options from the base [Field
| **`allowCreate`** | Set to `false` if you'd like to disable the ability to create new documents from within the relationship field. |
| **`allowEdit`** | Set to `false` if you'd like to disable the ability to edit documents from within the relationship field. |
| **`sortOptions`** | Define a default sorting order for the options within a Relationship field's dropdown. [More](#sort-options) |
| **`placeholder`** | Define a custom text or function to replace the generic default placeholder |
| **`appearance`** | Set to `drawer` or `select` to change the behavior of the field. Defaults to `select`. |
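Taken together, a minimal sketch of a Relationship field using several of these admin options (the collection slugs and field names are assumptions):
```ts
import type { CollectionConfig } from 'payload'

// Sketch of a collection using the Relationship field admin options above
export const Posts: CollectionConfig = {
  slug: 'posts',
  fields: [
    {
      name: 'category',
      type: 'relationship',
      relationTo: 'categories',
      admin: {
        allowCreate: false, // hide the "Create new" button
        allowEdit: false, // disable editing related docs inline
        appearance: 'drawer', // use the drawer UI instead of the default select
        sortOptions: 'title', // sort dropdown options by the `title` field
      },
    },
  ],
}
```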
### Sort Options
@@ -150,7 +149,7 @@ The `filterOptions` property can either be a `Where` query, or a function return
| `id` | The `id` of the current document being edited. Will be `undefined` during the `create` operation or when called on a `Filter` component within the list view. |
| `relationTo` | The collection `slug` to filter against, limited to this field's `relationTo` property. |
| `req` | The Payload Request, which contains references to `payload`, `user`, `locale`, and more. |
| `siblingData` | An object containing document data that is scoped to only fields within the same parent of this field. Will be an empty object when called on a `Filter` component within the list view. |
| `siblingData` | An object containing document data that is scoped to only fields within the same parent of this field. Will be an empty object when called on a `Filter` component within the list view. |
| `user` | An object containing the currently authenticated user. |
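As an illustration, a hedged sketch of a `filterOptions` function that uses a few of the arguments described above (the `assignedTo` field and the `department` field on users are assumptions):
```ts
// Inside a collection's `fields` array (assumed field names)
{
  name: 'assignedTo',
  type: 'relationship',
  relationTo: 'users',
  filterOptions: ({ relationTo, user }) => {
    // May return a `Where` query or a boolean, per related collection.
    // Here: only offer users from the same department as the current editor.
    if (relationTo === 'users' && user) {
      return {
        department: { equals: user.department },
      }
    }
    return true // no extra filtering
  },
}
```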
## Example

View File

@@ -89,7 +89,6 @@ The Select Field inherits all of the default options from the base [Field Admin
| ----------------- | ------------------------------------------------------------------------------------------------------------------------------------------- |
| **`isClearable`** | Set to `true` if you'd like this field to be clearable within the Admin UI. |
| **`isSortable`** | Set to `true` if you'd like this field to be sortable within the Admin UI using drag and drop. (Only works when `hasMany` is set to `true`) |
| **`placeholder`** | Define a custom text or function to replace the generic default placeholder |
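For reference, a minimal sketch of a Select field using these admin options (the field name and option values are assumptions):
```ts
// Inside a collection's `fields` array
{
  name: 'tags',
  type: 'select',
  hasMany: true,
  options: ['news', 'tutorial', 'release'],
  admin: {
    isClearable: true, // allow clearing the value in the Admin UI
    isSortable: true, // drag-and-drop ordering; only applies when `hasMany` is true
  },
}
```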
## Example

View File

@@ -81,7 +81,7 @@ To install a Database Adapter, you can run **one** of the following commands:
#### 2. Copy Payload files into your Next.js app folder
Payload installs directly in your Next.js `/app` folder, and you'll need to place some files into that folder for Payload to run. You can copy these files from the [Blank Template](https://github.com/payloadcms/payload/tree/main/templates/blank/src/app/(payload)) on GitHub. Once you have the required Payload files in place in your `/app` folder, you should have something like this:
Payload installs directly in your Next.js `/app` folder, and you'll need to place some files into that folder for Payload to run. You can copy these files from the [Blank Template](<https://github.com/payloadcms/payload/tree/main/templates/blank/src/app/(payload)>) on GitHub. Once you have the required Payload files in place in your `/app` folder, you should have something like this:
```plaintext
app/

View File

@@ -55,11 +55,10 @@ All collection `find` queries are paginated automatically. Responses are returne
All Payload APIs support the pagination controls below. With them, you can create paginated lists of documents within your application:
| Control | Default | Description |
| ------------ | ------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `limit` | `10` | Limits the number of documents returned per page. Set to `0` to show all documents; pagination is automatically disabled for optimization when `limit` is `0`. |
| `pagination` | `true` | Set to `false` to disable pagination and return all documents |
| `page` | `1` | Get a specific page number |
| Control | Description |
| ------- | --------------------------------------- |
| `limit` | Limits the number of documents returned |
| `page` | Get a specific page number |
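To make these controls concrete, here is a short sketch using the Local API (the `posts` collection is an assumption); the same `limit` and `page` values can be passed as query parameters to the REST API:
```ts
// Fetch the second page of posts, 10 documents per page
const results = await payload.find({
  collection: 'posts',
  limit: 10,
  page: 2,
})

// results.docs        -> the documents on page 2
// results.totalDocs   -> total number of matching documents
// results.totalPages  -> total number of pages
// results.hasNextPage -> whether another page exists
// results.prevPage / results.nextPage -> adjacent page numbers (or null)
```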
### Disabling pagination within Local API

View File

@@ -84,7 +84,6 @@ pnpm add @payloadcms/storage-s3
- The `config` object can be any [`S3ClientConfig`](https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/client/s3) object (from [`@aws-sdk/client-s3`](https://github.com/aws/aws-sdk-js-v3)). _This is highly dependent on your AWS setup_. Check the AWS documentation for more information.
- When enabled, this package will automatically set `disableLocalStorage` to `true` for each collection.
- When deploying to Vercel, server uploads are limited to 4.5MB. Set `clientUploads` to `true` to upload files directly from the client. You must allow the CORS `PUT` method on the bucket for your website's origin.
- Configure `signedDownloads` (either globally or per-collection in `collections`) to use [presigned URLs](https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-presigned-url.html) for file downloads. This can improve performance for large files (like videos) while still respecting your access control.
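As a sketch of how these options fit together (the bucket name, environment variables, and the `media` collection slug are assumptions; option availability may vary by version):
```ts
import { s3Storage } from '@payloadcms/storage-s3'

// Add this to the `plugins` array of your Payload config
export const s3Plugin = s3Storage({
  collections: {
    media: true, // enable S3 storage for the `media` upload collection
  },
  bucket: process.env.S3_BUCKET || '',
  config: {
    region: process.env.S3_REGION || 'us-east-1',
    credentials: {
      accessKeyId: process.env.S3_ACCESS_KEY_ID || '',
      secretAccessKey: process.env.S3_SECRET_ACCESS_KEY || '',
    },
  },
  clientUploads: true, // upload from the browser to stay under Vercel's 4.5MB server limit
})
```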
```ts
import { s3Storage } from '@payloadcms/storage-s3'

View File

@@ -58,7 +58,7 @@ See the [Collections](https://payloadcms.com/docs/configuration/collections) doc
}
```
For more details on how to extend this functionality, see the [Live Preview](https://payloadcms.com/docs/live-preview/overview) docs.
For more details on how to extend this functionality, see the [Live Preview](https://payloadcms.com/docs/live-preview) docs.
## Front-end

View File

@@ -36,7 +36,7 @@ export const home: Partial<Page> = {
type: 'link',
children: [{ text: 'Live Preview' }],
newTab: true,
url: 'https://payloadcms.com/docs/live-preview/overview',
url: 'https://payloadcms.com/docs/live-preview',
},
{
text: ' you can edit this page in the admin panel and see the changes reflected here in real time.',

View File

@@ -1,6 +1,6 @@
{
"name": "payload-monorepo",
"version": "3.37.0",
"version": "3.35.1",
"private": true,
"type": "module",
"scripts": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/admin-bar",
"version": "3.37.0",
"version": "3.35.1",
"description": "An admin bar for React apps using Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "create-payload-app",
"version": "3.37.0",
"version": "3.35.1",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",

View File

@@ -10,7 +10,6 @@ import type { CliArgs, DbType, ProjectExample, ProjectTemplate } from '../types.
import { createProject } from './create-project.js'
import { dbReplacements } from './replacements.js'
import { getValidTemplates } from './templates.js'
import { manageEnvFiles } from './manage-env-files.js'
describe('createProject', () => {
let projectDir: string
@@ -155,75 +154,5 @@ describe('createProject', () => {
expect(content).toContain(dbReplacement.configReplacement().join('\n'))
})
})
describe('managing env files', () => {
it('updates .env files without overwriting existing data', async () => {
const envFilePath = path.join(projectDir, '.env')
const envExampleFilePath = path.join(projectDir, '.env.example')
fse.ensureDirSync(projectDir)
fse.ensureFileSync(envFilePath)
fse.ensureFileSync(envExampleFilePath)
const initialEnvContent = `CUSTOM_VAR=custom-value\nDATABASE_URI=old-connection\n`
const initialEnvExampleContent = `CUSTOM_VAR=custom-value\nDATABASE_URI=old-connection\nPAYLOAD_SECRET=YOUR_SECRET_HERE\n`
fse.writeFileSync(envFilePath, initialEnvContent)
fse.writeFileSync(envExampleFilePath, initialEnvExampleContent)
await manageEnvFiles({
cliArgs: {
'--debug': true,
} as CliArgs,
databaseType: 'mongodb',
databaseUri: 'mongodb://localhost:27017/test',
payloadSecret: 'test-secret',
projectDir,
template: undefined,
})
const updatedEnvContent = fse.readFileSync(envFilePath, 'utf-8')
expect(updatedEnvContent).toContain('CUSTOM_VAR=custom-value')
expect(updatedEnvContent).toContain('DATABASE_URI=mongodb://localhost:27017/test')
expect(updatedEnvContent).toContain('PAYLOAD_SECRET=test-secret')
const updatedEnvExampleContent = fse.readFileSync(envExampleFilePath, 'utf-8')
expect(updatedEnvExampleContent).toContain('CUSTOM_VAR=custom-value')
expect(updatedEnvContent).toContain('DATABASE_URI=mongodb://localhost:27017/test')
expect(updatedEnvContent).toContain('PAYLOAD_SECRET=test-secret')
})
it('creates .env and .env.example if they do not exist', async () => {
const envFilePath = path.join(projectDir, '.env')
const envExampleFilePath = path.join(projectDir, '.env.example')
fse.ensureDirSync(projectDir)
if (fse.existsSync(envFilePath)) fse.removeSync(envFilePath)
if (fse.existsSync(envExampleFilePath)) fse.removeSync(envExampleFilePath)
await manageEnvFiles({
cliArgs: {
'--debug': true,
} as CliArgs,
databaseUri: '',
payloadSecret: '',
projectDir,
template: undefined,
})
expect(fse.existsSync(envFilePath)).toBe(true)
expect(fse.existsSync(envExampleFilePath)).toBe(true)
const updatedEnvContent = fse.readFileSync(envFilePath, 'utf-8')
expect(updatedEnvContent).toContain('DATABASE_URI=your-connection-string-here')
expect(updatedEnvContent).toContain('PAYLOAD_SECRET=YOUR_SECRET_HERE')
const updatedEnvExampleContent = fse.readFileSync(envExampleFilePath, 'utf-8')
expect(updatedEnvExampleContent).toContain('DATABASE_URI=your-connection-string-here')
expect(updatedEnvExampleContent).toContain('PAYLOAD_SECRET=YOUR_SECRET_HERE')
})
})
})
})

View File

@@ -6,55 +6,66 @@ import type { CliArgs, DbType, ProjectTemplate } from '../types.js'
import { debug, error } from '../utils/log.js'
import { dbChoiceRecord } from './select-db.js'
const updateEnvExampleVariables = (
contents: string,
databaseType: DbType | undefined,
payloadSecret?: string,
databaseUri?: string,
): string => {
const seenKeys = new Set<string>()
const updatedEnv = contents
const updateEnvExampleVariables = (contents: string, databaseType: DbType | undefined): string => {
return contents
.split('\n')
.map((line) => {
if (line.startsWith('#') || !line.includes('=')) {
return line
return line // Preserve comments and unrelated lines
}
const [key] = line.split('=')
if (!key) {return}
if (key === 'DATABASE_URI' || key === 'POSTGRES_URL' || key === 'MONGODB_URI') {
const dbChoice = databaseType ? dbChoiceRecord[databaseType] : null
if (dbChoice) {
const placeholderUri = databaseUri
? databaseUri
: `${dbChoice.dbConnectionPrefix}your-database-name${dbChoice.dbConnectionSuffix || ''}`
line =
databaseType === 'vercel-postgres'
? `POSTGRES_URL=${placeholderUri}`
: `DATABASE_URI=${placeholderUri}`
const placeholderUri = `${dbChoice.dbConnectionPrefix}your-database-name${
dbChoice.dbConnectionSuffix || ''
}`
return databaseType === 'vercel-postgres'
? `POSTGRES_URL=${placeholderUri}`
: `DATABASE_URI=${placeholderUri}`
}
return `DATABASE_URI=your-database-connection-here` // Fallback
}
if (key === 'PAYLOAD_SECRET' || key === 'PAYLOAD_SECRET_KEY') {
line = `PAYLOAD_SECRET=${payloadSecret || 'YOUR_SECRET_HERE'}`
return `PAYLOAD_SECRET=YOUR_SECRET_HERE`
}
// handles dupes
if (seenKeys.has(key)) {
return null
}
seenKeys.add(key)
return line
})
.filter(Boolean)
.reverse()
.join('\n')
}
return updatedEnv
const generateEnvContent = (
existingEnv: string,
databaseType: DbType | undefined,
databaseUri: string,
payloadSecret: string,
): string => {
const dbKey = databaseType === 'vercel-postgres' ? 'POSTGRES_URL' : 'DATABASE_URI'
const envVars: Record<string, string> = {}
existingEnv
.split('\n')
.filter((line) => line.includes('=') && !line.startsWith('#'))
.forEach((line) => {
const [key, value] = line.split('=')
// @ts-expect-error - vestiges of when tsconfig was not strict. Feel free to improve
envVars[key] = value
})
// Override specific keys
envVars[dbKey] = databaseUri
envVars['PAYLOAD_SECRET'] = payloadSecret
// Rebuild content
return Object.entries(envVars)
.map(([key, value]) => `${key}=${value}`)
.join('\n')
}
/** Parse and swap .env.example values and write .env */
@@ -77,71 +88,42 @@ export async function manageEnvFiles(args: {
const envExamplePath = path.join(projectDir, '.env.example')
const envPath = path.join(projectDir, '.env')
const emptyEnvContent = `# Added by Payload\nDATABASE_URI=your-connection-string-here\nPAYLOAD_SECRET=YOUR_SECRET_HERE\n`
try {
let updatedExampleContents: string
if (template?.type === 'plugin') {
if (debugFlag) {
debug(`plugin template detected - no .env added .env.example added`)
// Update .env.example
if (template?.type === 'starter') {
if (!fs.existsSync(envExamplePath)) {
error(`.env.example file not found at ${envExamplePath}`)
process.exit(1)
}
return
}
if (!fs.existsSync(envExamplePath)) {
updatedExampleContents = updateEnvExampleVariables(
emptyEnvContent,
databaseType,
payloadSecret,
databaseUri,
)
await fs.writeFile(envExamplePath, updatedExampleContents)
if (debugFlag) {
debug(`.env.example file successfully created`)
}
} else {
const envExampleContents = await fs.readFile(envExamplePath, 'utf8')
const mergedEnvs = envExampleContents + '\n' + emptyEnvContent
updatedExampleContents = updateEnvExampleVariables(
mergedEnvs,
databaseType,
payloadSecret,
databaseUri,
)
updatedExampleContents = updateEnvExampleVariables(envExampleContents, databaseType)
await fs.writeFile(envExamplePath, updatedExampleContents.trimEnd() + '\n')
await fs.writeFile(envExamplePath, updatedExampleContents)
if (debugFlag) {
debug(`.env.example file successfully updated`)
}
} else {
updatedExampleContents = `# Added by Payload\nDATABASE_URI=your-connection-string-here\nPAYLOAD_SECRET=YOUR_SECRET_HERE\n`
await fs.writeFile(envExamplePath, updatedExampleContents.trimEnd() + '\n')
}
if (!fs.existsSync(envPath)) {
const envContent = updateEnvExampleVariables(
emptyEnvContent,
databaseType,
payloadSecret,
databaseUri,
)
await fs.writeFile(envPath, envContent)
// Merge existing variables and create or update .env
const envExampleContents = await fs.readFile(envExamplePath, 'utf8')
const envContent = generateEnvContent(
envExampleContents,
databaseType,
databaseUri,
payloadSecret,
)
await fs.writeFile(envPath, `# Added by Payload\n${envContent.trimEnd()}\n`)
if (debugFlag) {
debug(`.env file successfully created`)
}
} else {
const envContents = await fs.readFile(envPath, 'utf8')
const mergedEnvs = envContents + '\n' + emptyEnvContent
const updatedEnvContents = updateEnvExampleVariables(
mergedEnvs,
databaseType,
payloadSecret,
databaseUri,
)
await fs.writeFile(envPath, updatedEnvContents)
if (debugFlag) {
debug(`.env file successfully updated`)
}
if (debugFlag) {
debug(`.env file successfully created or updated`)
}
} catch (err: unknown) {
error('Unable to manage environment files')

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-mongodb",
"version": "3.37.0",
"version": "3.35.1",
"description": "The officially supported MongoDB database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-postgres",
"version": "3.37.0",
"version": "3.35.1",
"description": "The officially supported Postgres database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-sqlite",
"version": "3.37.0",
"version": "3.35.1",
"description": "The officially supported SQLite database adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -16,7 +16,7 @@ export const countDistinct: CountDistinct = async function countDistinct(
})
.from(this.tables[tableName])
.where(where)
return Number(countResult?.[0]?.count ?? 0)
return Number(countResult[0]?.count)
}
let query: SQLiteSelect = db
@@ -39,5 +39,5 @@ export const countDistinct: CountDistinct = async function countDistinct(
// Instead, COUNT (GROUP BY id) can be used which is still slower than COUNT(*) but acceptable.
const countResult = await query
return Number(countResult?.[0]?.count ?? 0)
return Number(countResult[0]?.count)
}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/db-vercel-postgres",
"version": "3.37.0",
"version": "3.35.1",
"description": "Vercel Postgres adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/drizzle",
"version": "3.37.0",
"version": "3.35.1",
"description": "A library of shared functions used by different payload database adapters",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -46,7 +46,6 @@ export const findMany = async function find({
const offset = skip || (page - 1) * limit
if (limit === 0) {
pagination = false
limit = undefined
}

View File

@@ -42,36 +42,33 @@ export const migrate: DrizzleAdapter['migrate'] = async function migrate(
limit: 0,
sort: '-name',
}))
if (migrationsInDB.find((m) => m.batch === -1)) {
const { confirm: runMigrations } = await prompts(
{
name: 'confirm',
type: 'confirm',
initial: false,
message:
"It looks like you've run Payload in dev mode, meaning you've dynamically pushed changes to your database.\n\n" +
"If you'd like to run migrations, data loss will occur. Would you like to proceed?",
},
{
onCancel: () => {
process.exit(0)
},
},
)
if (!runMigrations) {
process.exit(0)
}
// ignore the dev migration so that the latest batch number increments correctly
migrationsInDB = migrationsInDB.filter((m) => m.batch !== -1)
}
if (Number(migrationsInDB?.[0]?.batch) > 0) {
latestBatch = Number(migrationsInDB[0]?.batch)
}
}
if (migrationsInDB.find((m) => m.batch === -1)) {
const { confirm: runMigrations } = await prompts(
{
name: 'confirm',
type: 'confirm',
initial: false,
message:
"It looks like you've run Payload in dev mode, meaning you've dynamically pushed changes to your database.\n\n" +
"If you'd like to run migrations, data loss will occur. Would you like to proceed?",
},
{
onCancel: () => {
process.exit(0)
},
},
)
if (!runMigrations) {
process.exit(0)
}
}
const newBatch = latestBatch + 1
// Execute 'up' function for each migration sequentially

View File

@@ -16,8 +16,7 @@ export const countDistinct: CountDistinct = async function countDistinct(
})
.from(this.tables[tableName])
.where(where)
return Number(countResult?.[0]?.count ?? 0)
return Number(countResult[0].count)
}
let query = db
@@ -40,5 +39,5 @@ export const countDistinct: CountDistinct = async function countDistinct(
// Instead, COUNT (GROUP BY id) can be used which is still slower than COUNT(*) but acceptable.
const countResult = await query
return Number(countResult?.[0]?.count ?? 0)
return Number(countResult[0].count)
}

View File

@@ -36,6 +36,7 @@ type Args = {
*/
export const migratePostgresV2toV3 = async ({ debug, payload, req }: Args) => {
const adapter = payload.db as unknown as BasePostgresAdapter
const db = await getTransaction(adapter, req)
const dir = payload.db.migrationDir
// get the drizzle migrateUpSQL from drizzle using the last schema
@@ -88,8 +89,6 @@ export const migratePostgresV2toV3 = async ({ debug, payload, req }: Args) => {
payload.logger.info(addColumnsStatement)
}
const db = await getTransaction(adapter, req)
await db.execute(sql.raw(addColumnsStatement))
for (const collection of payload.config.collections) {

View File

@@ -3,14 +3,12 @@ import type { FlattenedField, Where } from 'payload'
import type { DrizzleAdapter, GenericColumn } from '../types.js'
import type { BuildQueryJoinAliases } from './buildQuery.js'
import type { QueryContext } from './parseParams.js'
import { parseParams } from './parseParams.js'
export function buildAndOrConditions({
adapter,
aliasTable,
context,
fields,
joins,
locale,
@@ -23,7 +21,6 @@ export function buildAndOrConditions({
adapter: DrizzleAdapter
aliasTable?: Table
collectionSlug?: string
context: QueryContext
fields: FlattenedField[]
globalSlug?: string
joins: BuildQueryJoinAliases
@@ -44,7 +41,6 @@ export function buildAndOrConditions({
const result = parseParams({
adapter,
aliasTable,
context,
fields,
joins,
locale,

View File

@@ -3,7 +3,6 @@ import type { PgTableWithColumns } from 'drizzle-orm/pg-core'
import type { FlattenedField, Sort, Where } from 'payload'
import type { DrizzleAdapter, GenericColumn, GenericTable } from '../types.js'
import type { QueryContext } from './parseParams.js'
import { buildOrderBy } from './buildOrderBy.js'
import { parseParams } from './parseParams.js'
@@ -53,14 +52,24 @@ const buildQuery = function buildQuery({
id: adapter.tables[tableName].id,
}
const orderBy = buildOrderBy({
adapter,
aliasTable,
fields,
joins,
locale,
parentIsLocalized,
selectFields,
sort,
tableName,
})
let where: SQL
const context: QueryContext = { sort }
if (incomingWhere && Object.keys(incomingWhere).length > 0) {
where = parseParams({
adapter,
aliasTable,
context,
fields,
joins,
locale,
@@ -72,18 +81,6 @@ const buildQuery = function buildQuery({
})
}
const orderBy = buildOrderBy({
adapter,
aliasTable,
fields,
joins,
locale,
parentIsLocalized,
selectFields,
sort: context.sort,
tableName,
})
return {
joins,
orderBy,

View File

@@ -1,5 +1,5 @@
import type { SQL, Table } from 'drizzle-orm'
import type { FlattenedField, Operator, Sort, Where } from 'payload'
import type { FlattenedField, Operator, Where } from 'payload'
import { and, isNotNull, isNull, ne, notInArray, or, sql } from 'drizzle-orm'
import { PgUUID } from 'drizzle-orm/pg-core'
@@ -14,12 +14,9 @@ import { buildAndOrConditions } from './buildAndOrConditions.js'
import { getTableColumnFromPath } from './getTableColumnFromPath.js'
import { sanitizeQueryValue } from './sanitizeQueryValue.js'
export type QueryContext = { sort: Sort }
type Args = {
adapter: DrizzleAdapter
aliasTable?: Table
context: QueryContext
fields: FlattenedField[]
joins: BuildQueryJoinAliases
locale?: string
@@ -33,7 +30,6 @@ type Args = {
export function parseParams({
adapter,
aliasTable,
context,
fields,
joins,
locale,
@@ -61,7 +57,6 @@ export function parseParams({
const builtConditions = buildAndOrConditions({
adapter,
aliasTable,
context,
fields,
joins,
locale,
@@ -347,7 +342,6 @@ export function parseParams({
)
}
if (geoConstraints.length) {
context.sort = relationOrPath
constraints.push(and(...geoConstraints))
}
break

View File

@@ -666,10 +666,9 @@ export const traverseFields = <T extends Record<string, unknown>>({
withinArrayOrBlockLocale: locale || withinArrayOrBlockLocale,
})
// TODO: we need to only clean this up for arrays, blocks, and hasMany fields
// if ('_order' in ref) {
// delete ref._order
// }
if ('_order' in ref) {
delete ref._order
}
return
}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/email-nodemailer",
"version": "3.37.0",
"version": "3.35.1",
"description": "Payload Nodemailer Email Adapter",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/email-resend",
"version": "3.37.0",
"version": "3.35.1",
"description": "Payload Resend Email Adapter",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/graphql",
"version": "3.37.0",
"version": "3.35.1",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",

View File

@@ -11,7 +11,6 @@ export type ObjectTypeConfig = {
type Args = {
baseFields?: ObjectTypeConfig
collectionSlug?: string
config: SanitizedConfig
fields: Field[]
forceNullable?: boolean
@@ -24,7 +23,6 @@ type Args = {
export function buildObjectType({
name,
baseFields = {},
collectionSlug,
config,
fields,
forceNullable,
@@ -45,7 +43,6 @@ export function buildObjectType({
return {
...objectTypeConfig,
...fieldSchema({
collectionSlug,
config,
field,
forceNullable,

View File

@@ -10,11 +10,11 @@ export const buildPaginatedListType = (name, docType) =>
hasNextPage: { type: new GraphQLNonNull(GraphQLBoolean) },
hasPrevPage: { type: new GraphQLNonNull(GraphQLBoolean) },
limit: { type: new GraphQLNonNull(GraphQLInt) },
nextPage: { type: GraphQLInt },
nextPage: { type: new GraphQLNonNull(GraphQLInt) },
offset: { type: GraphQLInt },
page: { type: new GraphQLNonNull(GraphQLInt) },
pagingCounter: { type: new GraphQLNonNull(GraphQLInt) },
prevPage: { type: GraphQLInt },
prevPage: { type: new GraphQLNonNull(GraphQLInt) },
totalDocs: { type: new GraphQLNonNull(GraphQLInt) },
totalPages: { type: new GraphQLNonNull(GraphQLInt) },
},

View File

@@ -8,7 +8,6 @@ import type {
DateField,
EmailField,
Field,
FlattenedJoinField,
GraphQLInfo,
GroupField,
JoinField,
@@ -69,7 +68,6 @@ function formattedNameResolver({
}
type SharedArgs = {
collectionSlug?: string
config: SanitizedConfig
forceNullable?: boolean
graphqlResult: GraphQLInfo
@@ -342,7 +340,7 @@ export const fieldToSchemaMap: FieldToSchemaMap = {
},
}
},
join: ({ collectionSlug, field, graphqlResult, objectTypeConfig, parentName }) => {
join: ({ field, graphqlResult, objectTypeConfig, parentName }) => {
const joinName = combineParentName(parentName, toWords(field.name, true))
const joinType = {
@@ -387,54 +385,27 @@ export const fieldToSchemaMap: FieldToSchemaMap = {
const draft = Boolean(args.draft ?? context.req.query?.draft)
const targetField = (field as FlattenedJoinField).targetField
const fullWhere = combineQueries(
where,
Array.isArray(targetField.relationTo)
? {
[field.on]: {
equals: {
relationTo: collectionSlug,
value: parent._id ?? parent.id,
},
},
}
: {
[field.on]: { equals: parent._id ?? parent.id },
},
)
const fullWhere = combineQueries(where, {
[field.on]: { equals: parent._id ?? parent.id },
})
if (Array.isArray(collection)) {
throw new Error('GraphQL with array of join.field.collection is not implemented')
}
const { docs } = await req.payload.find({
return await req.payload.find({
collection,
depth: 0,
draft,
fallbackLocale: req.fallbackLocale,
// Fetch one extra document to determine if there are more documents beyond the requested limit (used for hasNextPage calculation).
limit: typeof limit === 'number' && limit > 0 ? limit + 1 : 0,
limit,
locale: req.locale,
overrideAccess: false,
page,
pagination: false,
req,
sort,
where: fullWhere,
})
let shouldSlice = false
if (typeof limit === 'number' && limit !== 0 && limit < docs.length) {
shouldSlice = true
}
return {
docs: shouldSlice ? docs.slice(0, -1) : docs,
hasNextPage: limit === 0 ? false : limit < docs.length,
}
},
}

View File

@@ -29,7 +29,6 @@ import { recursivelyBuildNestedPaths } from './recursivelyBuildNestedPaths.js'
import { withOperators } from './withOperators.js'
type Args = {
collectionSlug?: string
nestedFieldName?: string
parentName: string
}

View File

@@ -111,7 +111,6 @@ export function initCollections({ config, graphqlResult }: InitCollectionsGraphQ
collection.graphQL.type = buildObjectType({
name: singularName,
baseFields,
collectionSlug: collectionConfig.slug,
config,
fields,
forceNullable: forceNullableObjectType,
@@ -340,7 +339,6 @@ export function initCollections({ config, graphqlResult }: InitCollectionsGraphQ
collection.graphQL.versionType = buildObjectType({
name: `${singularName}Version`,
collectionSlug: collectionConfig.slug,
config,
fields: versionCollectionFields,
forceNullable: forceNullableObjectType,

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview-react",
"version": "3.37.0",
"version": "3.35.1",
"description": "The official React SDK for Payload Live Preview",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -7,7 +7,7 @@ import { useCallback, useEffect, useRef, useState } from 'react'
// To prevent the flicker of stale data while the post message is being sent,
// you can conditionally render loading UI based on the `isLoading` state
export const useLivePreview = <T extends Record<string, unknown>>(props: {
export const useLivePreview = <T extends any>(props: {
apiRoute?: string
depth?: number
initialData: T
@@ -21,7 +21,7 @@ export const useLivePreview = <T extends Record<string, unknown>>(props: {
const [isLoading, setIsLoading] = useState<boolean>(true)
const hasSentReadyMessage = useRef<boolean>(false)
const onChange = useCallback((mergedData: T) => {
const onChange = useCallback((mergedData) => {
setData(mergedData)
setIsLoading(false)
}, [])

View File

@@ -1,4 +1,9 @@
{
"extends": "../../tsconfig.base.json",
"compilerOptions": {
/* TODO: remove the following lines */
"strict": false,
"noUncheckedIndexedAccess": false,
},
"references": [{ "path": "../payload" }]
}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview-vue",
"version": "3.37.0",
"version": "3.35.1",
"description": "The official Vue SDK for Payload Live Preview",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -8,7 +8,7 @@ import { onMounted, onUnmounted, ref } from 'vue'
*
* {@link https://payloadcms.com/docs/live-preview/frontend View the documentation}
*/
export const useLivePreview = <T extends Record<string, unknown>>(props: {
export const useLivePreview = <T>(props: {
apiRoute?: string
depth?: number
initialData: T
@@ -27,7 +27,7 @@ export const useLivePreview = <T extends Record<string, unknown>>(props: {
isLoading.value = false
}
let subscription: (event: MessageEvent) => Promise<void> | void
let subscription: (event: MessageEvent) => void
onMounted(() => {
subscription = subscribe({

View File

@@ -1,4 +1,9 @@
{
"extends": "../../tsconfig.base.json",
"compilerOptions": {
/* TODO: remove the following lines */
"strict": false,
"noUncheckedIndexedAccess": false,
},
"references": [{ "path": "../payload" }] // db-mongodb depends on payload
}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/live-preview",
"version": "3.37.0",
"version": "3.35.1",
"description": "The official live preview JavaScript SDK for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
import type { FieldSchemaJSON } from 'payload'
import type { CollectionPopulationRequestHandler, LivePreviewMessageEvent } from './types.js'
import type { LivePreviewMessageEvent } from './types.js'
import { isLivePreviewEvent } from './isLivePreviewEvent.js'
import { mergeData } from './mergeData.js'
@@ -29,10 +29,9 @@ export const handleMessage = async <T extends Record<string, any>>(args: {
depth?: number
event: LivePreviewMessageEvent<T>
initialData: T
requestHandler?: CollectionPopulationRequestHandler
serverURL: string
}): Promise<T> => {
const { apiRoute, depth, event, initialData, requestHandler, serverURL } = args
const { apiRoute, depth, event, initialData, serverURL } = args
if (isLivePreviewEvent(event, serverURL)) {
const { data, externallyUpdatedRelationship, fieldSchemaJSON, locale } = event.data
@@ -58,7 +57,6 @@ export const handleMessage = async <T extends Record<string, any>>(args: {
incomingData: data,
initialData: _payloadLivePreview?.previousData || initialData,
locale,
requestHandler,
serverURL,
})

View File

@@ -1,6 +1,6 @@
import type { DocumentEvent, FieldSchemaJSON, PaginatedDocs } from 'payload'
import type { CollectionPopulationRequestHandler, PopulationsByCollection } from './types.js'
import type { PopulationsByCollection } from './types.js'
import { traverseFields } from './traverseFields.js'
@@ -29,17 +29,21 @@ let prevLocale: string | undefined
export const mergeData = async <T extends Record<string, any>>(args: {
apiRoute?: string
/**
* @deprecated Use `requestHandler` instead
*/
collectionPopulationRequestHandler?: CollectionPopulationRequestHandler
collectionPopulationRequestHandler?: ({
apiPath,
endpoint,
serverURL,
}: {
apiPath: string
endpoint: string
serverURL: string
}) => Promise<Response>
depth?: number
externallyUpdatedRelationship?: DocumentEvent
fieldSchema: FieldSchemaJSON
incomingData: Partial<T>
initialData: T
locale?: string
requestHandler?: CollectionPopulationRequestHandler
returnNumberOfRequests?: boolean
serverURL: string
}): Promise<
@@ -77,8 +81,7 @@ export const mergeData = async <T extends Record<string, any>>(args: {
let res: PaginatedDocs
const ids = new Set(populations.map(({ id }) => id))
const requestHandler =
args.collectionPopulationRequestHandler || args.requestHandler || defaultRequestHandler
const requestHandler = args.collectionPopulationRequestHandler || defaultRequestHandler
try {
res = await requestHandler({

View File

@@ -1,5 +1,3 @@
import type { CollectionPopulationRequestHandler } from './types.js'
import { handleMessage } from './handleMessage.js'
export const subscribe = <T extends Record<string, any>>(args: {
@@ -7,10 +5,9 @@ export const subscribe = <T extends Record<string, any>>(args: {
callback: (data: T) => void
depth?: number
initialData: T
requestHandler?: CollectionPopulationRequestHandler
serverURL: string
}): ((event: MessageEvent) => Promise<void> | void) => {
const { apiRoute, callback, depth, initialData, requestHandler, serverURL } = args
const { apiRoute, callback, depth, initialData, serverURL } = args
const onMessage = async (event: MessageEvent) => {
const mergedData = await handleMessage<T>({
@@ -18,7 +15,6 @@ export const subscribe = <T extends Record<string, any>>(args: {
depth,
event,
initialData,
requestHandler,
serverURL,
})

View File

@@ -1,15 +1,5 @@
import type { DocumentEvent, FieldSchemaJSON } from 'payload'
export type CollectionPopulationRequestHandler = ({
apiPath,
endpoint,
serverURL,
}: {
apiPath: string
endpoint: string
serverURL: string
}) => Promise<Response>
export type LivePreviewArgs = {}
export type LivePreview = void

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/next",
"version": "3.37.0",
"version": "3.35.1",
"homepage": "https://payloadcms.com",
"repository": {
"type": "git",

View File

@@ -2,14 +2,7 @@
import type { PaginatedDocs, Where } from 'payload'
import {
fieldBaseClass,
Pill,
ReactSelect,
useConfig,
useDocumentInfo,
useTranslation,
} from '@payloadcms/ui'
import { fieldBaseClass, Pill, ReactSelect, useConfig, useTranslation } from '@payloadcms/ui'
import { formatDate } from '@payloadcms/ui/shared'
import { stringify } from 'qs-esm'
import React, { useCallback, useEffect, useState } from 'react'
@@ -44,8 +37,6 @@ export const SelectComparison: React.FC<Props> = (props) => {
},
} = useConfig()
const { hasPublishedDoc } = useDocumentInfo()
const [options, setOptions] = useState<
{
label: React.ReactNode | string
@@ -118,10 +109,7 @@ export const SelectComparison: React.FC<Props> = (props) => {
},
published: {
currentLabel: t('version:currentPublishedVersion'),
// The latest published version does not necessarily equal the current published version,
// because the latest published version might have been unpublished in the meantime.
// Hence, we should only use the latest published version if there is a published document.
latestVersion: hasPublishedDoc ? latestPublishedVersion : undefined,
latestVersion: latestPublishedVersion,
pillStyle: 'success',
previousLabel: t('version:previouslyPublished'),
},

View File

@@ -85,34 +85,13 @@ export async function VersionsView(props: DocumentViewServerProps) {
payload,
status: 'draft',
})
const publishedDoc = await payload.count({
collection: collectionSlug,
depth: 0,
overrideAccess: true,
req,
where: {
id: {
equals: id,
},
_status: {
equals: 'published',
},
},
latestPublishedVersion = await getLatestVersion({
slug: collectionSlug,
type: 'collection',
parentID: id,
payload,
status: 'published',
})
// If we pass a latestPublishedVersion to buildVersionColumns,
// this will be used to display it as the "current published version".
// However, the latest published version might have been unpublished in the meantime.
// Hence, we should only pass the latest published version if there is a published document.
latestPublishedVersion =
publishedDoc.totalDocs > 0 &&
(await getLatestVersion({
slug: collectionSlug,
type: 'collection',
parentID: id,
payload,
status: 'published',
}))
}
} catch (err) {
logError({ err, payload })

View File

@@ -140,13 +140,6 @@ export const withPayload = (nextConfig = {}, options = {}) => {
{ module: /node_modules\/mongodb\/lib\/bson\.js/ },
{ file: /node_modules\/mongodb\/lib\/bson\.js/ },
],
plugins: [
...(incomingWebpackConfig?.plugins || []),
// Fix cloudflare:sockets error: https://github.com/vercel/next.js/discussions/50177
new webpackOptions.webpack.IgnorePlugin({
resourceRegExp: /^pg-native$|^cloudflare:sockets$/,
}),
],
resolve: {
...(incomingWebpackConfig?.resolve || {}),
alias: {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/payload-cloud",
"version": "3.37.0",
"version": "3.35.1",
"description": "The official Payload Cloud plugin",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -1,6 +1,6 @@
{
"name": "payload",
"version": "3.37.0",
"version": "3.35.1",
"description": "Node, React, Headless CMS and Application Framework built on Next.js",
"keywords": [
"admin panel",

View File

@@ -0,0 +1,7 @@
const isLocked = (date: number): boolean => {
if (!date) {
return false
}
return date > Date.now()
}
export default isLocked

View File

@@ -1,6 +0,0 @@
export const isUserLocked = (date: number): boolean => {
if (!date) {
return false
}
return date > Date.now()
}

View File

@@ -138,17 +138,15 @@ export const forgotPasswordOperation = async <TSlug extends CollectionSlug>(
return null
}
const resetPasswordExpiration = new Date(
user.resetPasswordToken = token
user.resetPasswordExpiration = new Date(
Date.now() + (collectionConfig.auth?.forgotPassword?.expiration ?? expiration ?? 3600000),
).toISOString()
user = await payload.update({
id: user.id,
collection: collectionConfig.slug,
data: {
resetPasswordExpiration,
resetPasswordToken: token,
},
data: user,
req,
})

View File

@@ -3,7 +3,6 @@ import type {
AuthOperationsFromCollectionSlug,
Collection,
DataFromCollectionSlug,
SanitizedCollectionConfig,
} from '../../collections/config/types.js'
import type { CollectionSlug } from '../../index.js'
import type { PayloadRequest, Where } from '../../types/index.js'
@@ -22,7 +21,7 @@ import { killTransaction } from '../../utilities/killTransaction.js'
import sanitizeInternalFields from '../../utilities/sanitizeInternalFields.js'
import { getFieldsToSign } from '../getFieldsToSign.js'
import { getLoginOptions } from '../getLoginOptions.js'
import { isUserLocked } from '../isUserLocked.js'
import isLocked from '../isLocked.js'
import { jwtSign } from '../jwt.js'
import { authenticateLocalStrategy } from '../strategies/local/authenticate.js'
import { incrementLoginAttempts } from '../strategies/local/incrementLoginAttempts.js'
@@ -43,32 +42,6 @@ export type Arguments<TSlug extends CollectionSlug> = {
showHiddenFields?: boolean
}
type CheckLoginPermissionArgs = {
collection: SanitizedCollectionConfig
loggingInWithUsername?: boolean
req: PayloadRequest
user: any
}
export const checkLoginPermission = ({
collection,
loggingInWithUsername,
req,
user,
}: CheckLoginPermissionArgs) => {
if (!user) {
throw new AuthenticationError(req.t, Boolean(loggingInWithUsername))
}
if (collection.auth.verify && user._verified === false) {
throw new UnverifiedEmail({ t: req.t })
}
if (isUserLocked(new Date(user.lockUntil).getTime())) {
throw new LockedAuth(req.t)
}
}
export const loginOperation = async <TSlug extends CollectionSlug>(
incomingArgs: Arguments<TSlug>,
): Promise<{ user: DataFromCollectionSlug<TSlug> } & Result> => {
@@ -211,16 +184,21 @@ export const loginOperation = async <TSlug extends CollectionSlug>(
where: whereConstraint,
})
checkLoginPermission({
collection: collectionConfig,
loggingInWithUsername: Boolean(canLoginWithUsername && sanitizedUsername),
req,
user,
})
if (!user) {
throw new AuthenticationError(req.t, Boolean(canLoginWithUsername && sanitizedUsername))
}
if (args.collection.config.auth.verify && user._verified === false) {
throw new UnverifiedEmail({ t: req.t })
}
user.collection = collectionConfig.slug
user._strategy = 'local-jwt'
if (isLocked(new Date(user.lockUntil).getTime())) {
throw new LockedAuth(req.t)
}
const authResult = await authenticateLocalStrategy({ doc: user, password })
user = sanitizeInternalFields(user)

View File

@@ -247,7 +247,6 @@ export const createOperation = async <
let doc
const select = sanitizeSelect({
fields: collectionConfig.flattenedFields,
forceSelect: collectionConfig.forceSelect,
select: incomingSelect,
})

View File

@@ -110,7 +110,6 @@ export const deleteOperation = async <
const fullWhere = combineQueries(where, accessResult)
const select = sanitizeSelect({
fields: collectionConfig.flattenedFields,
forceSelect: collectionConfig.forceSelect,
select: incomingSelect,
})

View File

@@ -168,7 +168,6 @@ export const deleteByIDOperation = async <TSlug extends CollectionSlug, TSelect
}
const select = sanitizeSelect({
fields: collectionConfig.flattenedFields,
forceSelect: collectionConfig.forceSelect,
select: incomingSelect,
})

View File

@@ -102,7 +102,6 @@ export const findOperation = async <
} = args
const select = sanitizeSelect({
fields: collectionConfig.flattenedFields,
forceSelect: collectionConfig.forceSelect,
select: incomingSelect,
})

View File

@@ -87,7 +87,6 @@ export const findByIDOperation = async <
} = args
const select = sanitizeSelect({
fields: collectionConfig.flattenedFields,
forceSelect: collectionConfig.forceSelect,
select: incomingSelect,
})

View File

@@ -11,7 +11,6 @@ import { APIError, Forbidden, NotFound } from '../../errors/index.js'
import { afterRead } from '../../fields/hooks/afterRead/index.js'
import { killTransaction } from '../../utilities/killTransaction.js'
import { sanitizeSelect } from '../../utilities/sanitizeSelect.js'
import { buildVersionCollectionFields } from '../../versions/buildCollectionFields.js'
import { getQueryDraftsSelect } from '../../versions/drafts/getQueryDraftsSelect.js'
export type Arguments = {
@@ -71,10 +70,8 @@ export const findVersionByIDOperation = async <TData extends TypeWithID = any>(
// /////////////////////////////////////
const select = sanitizeSelect({
fields: buildVersionCollectionFields(payload.config, collectionConfig, true),
forceSelect: getQueryDraftsSelect({ select: collectionConfig.forceSelect }),
select: incomingSelect,
versions: true,
})
const versionsQuery = await payload.db.findVersions<TData>({

View File

@@ -72,10 +72,8 @@ export const findVersionsOperation = async <TData extends TypeWithVersion<TData>
const fullWhere = combineQueries(where, accessResults)
const select = sanitizeSelect({
fields: buildVersionCollectionFields(payload.config, collectionConfig, true),
forceSelect: getQueryDraftsSelect({ select: collectionConfig.forceSelect }),
select: incomingSelect,
versions: true,
})
// /////////////////////////////////////

View File

@@ -117,7 +117,6 @@ export const restoreVersionOperation = async <TData extends TypeWithID = any>(
// /////////////////////////////////////
const select = sanitizeSelect({
fields: collectionConfig.flattenedFields,
forceSelect: collectionConfig.forceSelect,
select: incomingSelect,
})

View File

@@ -201,7 +201,6 @@ export const updateOperation = async <
try {
const select = sanitizeSelect({
fields: collectionConfig.flattenedFields,
forceSelect: collectionConfig.forceSelect,
select: incomingSelect,
})

View File

@@ -161,7 +161,6 @@ export const updateByIDOperation = async <
})
const select = sanitizeSelect({
fields: collectionConfig.flattenedFields,
forceSelect: collectionConfig.forceSelect,
select: incomingSelect,
})

View File

@@ -83,13 +83,6 @@ export const addOrderableFieldsAndHook = (
hidden: true,
readOnly: true,
},
hooks: {
beforeDuplicate: [
({ siblingData }) => {
delete siblingData[orderableFieldName]
},
],
},
index: true,
required: true,
// override the schema to make order fields optional for payload.create()
@@ -282,6 +275,5 @@ export const addOrderableEndpoint = (config: SanitizedConfig) => {
if (!config.endpoints) {
config.endpoints = []
}
config.endpoints.push(reorderEndpoint)
}

View File

@@ -1061,7 +1061,6 @@ export type SelectField = {
} & Admin['components']
isClearable?: boolean
isSortable?: boolean
placeholder?: LabelFunction | string
} & Admin
/**
* Customize the SQL table name
@@ -1094,7 +1093,7 @@ export type SelectField = {
Omit<FieldBase, 'validate'>
export type SelectFieldClient = {
admin?: AdminClient & Pick<SelectField['admin'], 'isClearable' | 'isSortable' | 'placeholder'>
admin?: AdminClient & Pick<SelectField['admin'], 'isClearable' | 'isSortable'>
} & FieldBaseClient &
Pick<SelectField, 'hasMany' | 'interfaceName' | 'options' | 'type'>
@@ -1161,11 +1160,10 @@ type RelationshipAdmin = {
>
} & Admin['components']
isSortable?: boolean
placeholder?: LabelFunction | string
} & Admin
type RelationshipAdminClient = AdminClient &
Pick<RelationshipAdmin, 'allowCreate' | 'allowEdit' | 'appearance' | 'isSortable' | 'placeholder'>
Pick<RelationshipAdmin, 'allowCreate' | 'allowEdit' | 'appearance' | 'isSortable'>
export type PolymorphicRelationshipField = {
admin?: {

View File

@@ -200,7 +200,7 @@ export const email: EmailFieldValidation = (
* Supports multiple subdomains (e.g., user@sub.domain.example.com)
*/
const emailRegex =
/^(?!.*\.\.)[\w!#$%&'*+/=?^`{|}~-](?:[\w!#$%&'*+/=?^`{|}~.-]*[\w!#$%&'*+/=?^`{|}~-])?@[a-z0-9](?:[a-z0-9-]*[a-z0-9])?(?:\.[a-z0-9](?:[a-z0-9-]*[a-z0-9])?)*\.[a-z]{2,}$/i
/^(?!.*\.\.)[\w.%+-]+@[a-z0-9](?:[a-z0-9-]*[a-z0-9])?(?:\.[a-z0-9](?:[a-z0-9-]*[a-z0-9])?)*\.[a-z]{2,}$/i
if ((value && !emailRegex.test(value)) || (!value && required)) {
return t('validation:emailAddress')

View File

@@ -53,7 +53,6 @@ export const findOneOperation = async <T extends Record<string, unknown>>(
}
const select = sanitizeSelect({
fields: globalConfig.flattenedFields,
forceSelect: globalConfig.forceSelect,
select: incomingSelect,
})

View File

@@ -11,8 +11,6 @@ import { afterRead } from '../../fields/hooks/afterRead/index.js'
import { deepCopyObjectSimple } from '../../utilities/deepCopyObject.js'
import { killTransaction } from '../../utilities/killTransaction.js'
import { sanitizeSelect } from '../../utilities/sanitizeSelect.js'
import { buildVersionCollectionFields } from '../../versions/buildCollectionFields.js'
import { buildVersionGlobalFields } from '../../versions/buildGlobalFields.js'
import { getQueryDraftsSelect } from '../../versions/drafts/getQueryDraftsSelect.js'
export type Arguments = {
@@ -62,10 +60,8 @@ export const findVersionByIDOperation = async <T extends TypeWithVersion<T> = an
const hasWhereAccess = typeof accessResults === 'object'
const select = sanitizeSelect({
fields: buildVersionGlobalFields(payload.config, globalConfig, true),
forceSelect: getQueryDraftsSelect({ select: globalConfig.forceSelect }),
select: incomingSelect,
versions: true,
})
const findGlobalVersionsArgs: FindGlobalVersionsArgs = {

View File

@@ -70,10 +70,8 @@ export const findVersionsOperation = async <T extends TypeWithVersion<T>>(
const fullWhere = combineQueries(where, accessResults)
const select = sanitizeSelect({
fields: buildVersionGlobalFields(payload.config, globalConfig, true),
forceSelect: getQueryDraftsSelect({ select: globalConfig.forceSelect }),
select: incomingSelect,
versions: true,
})
// /////////////////////////////////////

View File

@@ -246,7 +246,6 @@ export const updateOperation = async <
// /////////////////////////////////////
const select = sanitizeSelect({
fields: globalConfig.flattenedFields,
forceSelect: globalConfig.forceSelect,
select: incomingSelect,
})

View File

@@ -89,10 +89,6 @@ import { traverseFields } from './utilities/traverseFields.js'
export { default as executeAccess } from './auth/executeAccess.js'
export { executeAuthStrategies } from './auth/executeAuthStrategies.js'
export { extractAccessFromPermission } from './auth/extractAccessFromPermission.js'
export { getAccessResults } from './auth/getAccessResults.js'
export { getFieldsToSign } from './auth/getFieldsToSign.js'
export { getLoginOptions } from './auth/getLoginOptions.js'
export interface GeneratedTypes {
authUntyped: {
@@ -981,12 +977,13 @@ interface RequestContext {
// eslint-disable-next-line @typescript-eslint/no-empty-object-type
export interface DatabaseAdapter extends BaseDatabaseAdapter {}
export type { Payload, RequestContext }
export { extractAccessFromPermission } from './auth/extractAccessFromPermission.js'
export { getAccessResults } from './auth/getAccessResults.js'
export { getFieldsToSign } from './auth/getFieldsToSign.js'
export * from './auth/index.js'
export { jwtSign } from './auth/jwt.js'
export { accessOperation } from './auth/operations/access.js'
export { forgotPasswordOperation } from './auth/operations/forgotPassword.js'
export { initOperation } from './auth/operations/init.js'
export { checkLoginPermission } from './auth/operations/login.js'
export { loginOperation } from './auth/operations/login.js'
export { logoutOperation } from './auth/operations/logout.js'
export type { MeOperationResult } from './auth/operations/me.js'
@@ -997,8 +994,6 @@ export { resetPasswordOperation } from './auth/operations/resetPassword.js'
export { unlockOperation } from './auth/operations/unlock.js'
export { verifyEmailOperation } from './auth/operations/verifyEmail.js'
export { JWTAuthentication } from './auth/strategies/jwt.js'
export { incrementLoginAttempts } from './auth/strategies/local/incrementLoginAttempts.js'
export { resetLoginAttempts } from './auth/strategies/local/resetLoginAttempts.js'
export type {
AuthStrategyFunction,
AuthStrategyFunctionArgs,
@@ -1206,7 +1201,6 @@ export {
MissingFile,
NotFound,
QueryError,
UnverifiedEmail,
ValidationError,
ValidationErrorName,
} from './errors/index.js'

View File

@@ -74,7 +74,7 @@ export const getConstraints = (config: Config): Field => ({
},
],
},
relationTo: config.admin?.user ?? 'users', // TODO: remove this fallback when the args are properly typed as `SanitizedConfig`
relationTo: 'users',
},
...(config?.queryPresets?.constraints?.[operation]?.reduce(
(acc: Field[], option: QueryPresetConstraint) => {

View File

@@ -10,7 +10,7 @@ import type { SanitizedConfig } from '../config/types.js'
import type { PayloadRequest } from '../types/index.js'
import type { FileData, FileToSave, ProbedImageSize, UploadEdits } from './types.js'
import { FileRetrievalError, FileUploadError, Forbidden, MissingFile } from '../errors/index.js'
import { FileRetrievalError, FileUploadError, MissingFile } from '../errors/index.js'
import { canResizeImage } from './canResizeImage.js'
import { cropImage } from './cropImage.js'
import { getExternalFile } from './getExternalFile.js'
@@ -85,10 +85,6 @@ export const generateFileData = async <T>({
if (!file && uploadEdits && incomingFileData) {
const { filename, url } = incomingFileData as FileData
if (filename && (filename.includes('../') || filename.includes('..\\'))) {
throw new Forbidden(req.t)
}
try {
if (url && url.startsWith('/') && !disableLocalStorage) {
const filePath = `${staticPath}/${filename}`

View File

@@ -5,28 +5,28 @@ import path from 'path'
import type { PayloadRequest } from '../types/index.js'
const mimeTypeEstimate: Record<string, string> = {
const mimeTypeEstimate = {
svg: 'image/svg+xml',
}
export const getFileByPath = async (filePath: string): Promise<PayloadRequest['file']> => {
if (typeof filePath !== 'string') {
return undefined
if (typeof filePath === 'string') {
const data = await fs.readFile(filePath)
const mimetype = fileTypeFromFile(filePath)
const { size } = await fs.stat(filePath)
const name = path.basename(filePath)
const ext = path.extname(filePath).slice(1)
const mime = (await mimetype)?.mime || mimeTypeEstimate[ext]
return {
name,
data,
mimetype: mime,
size,
}
}
const name = path.basename(filePath)
const ext = path.extname(filePath).slice(1)
const [data, stat, type] = await Promise.all([
fs.readFile(filePath),
fs.stat(filePath),
fileTypeFromFile(filePath),
])
return {
name,
data,
mimetype: type?.mime || mimeTypeEstimate[ext],
size: stat.size,
}
return undefined
}

View File

@@ -1,129 +1,17 @@
import { deepMergeSimple } from '@payloadcms/translations/utilities'
import type { FlattenedField } from '../fields/config/types.js'
import type { SelectIncludeType, SelectType } from '../types/index.js'
import type { SelectType } from '../types/index.js'
import { getSelectMode } from './getSelectMode.js'
// Transform post.title -> post, post.category.title -> post
const stripVirtualPathToCurrentCollection = ({
fields,
path,
versions,
}: {
fields: FlattenedField[]
path: string
versions: boolean
}) => {
const resultSegments: string[] = []
if (versions) {
resultSegments.push('version')
const versionField = fields.find((each) => each.name === 'version')
if (versionField && versionField.type === 'group') {
fields = versionField.flattenedFields
}
}
for (const segment of path.split('.')) {
const field = fields.find((each) => each.name === segment)
if (!field) {
continue
}
resultSegments.push(segment)
if (field.type === 'relationship' || field.type === 'upload') {
return resultSegments.join('.')
}
}
return resultSegments.join('.')
}
const getAllVirtualRelations = ({ fields }: { fields: FlattenedField[] }) => {
const result: string[] = []
for (const field of fields) {
if ('virtual' in field && typeof field.virtual === 'string') {
result.push(field.virtual)
} else if (field.type === 'group' || field.type === 'tab') {
const nestedResult = getAllVirtualRelations({ fields: field.flattenedFields })
for (const nestedItem of nestedResult) {
result.push(nestedItem)
}
}
}
return result
}
const resolveVirtualRelationsToSelect = ({
fields,
selectValue,
topLevelFields,
versions,
}: {
fields: FlattenedField[]
selectValue: SelectIncludeType | true
topLevelFields: FlattenedField[]
versions: boolean
}) => {
const result: string[] = []
if (selectValue === true) {
for (const item of getAllVirtualRelations({ fields })) {
result.push(
stripVirtualPathToCurrentCollection({ fields: topLevelFields, path: item, versions }),
)
}
} else {
for (const fieldName in selectValue) {
const field = fields.find((each) => each.name === fieldName)
if (!field) {
continue
}
if ('virtual' in field && typeof field.virtual === 'string') {
result.push(
stripVirtualPathToCurrentCollection({
fields: topLevelFields,
path: field.virtual,
versions,
}),
)
} else if (field.type === 'group' || field.type === 'tab') {
for (const item of resolveVirtualRelationsToSelect({
fields: field.flattenedFields,
selectValue: selectValue[fieldName],
topLevelFields,
versions,
})) {
result.push(
stripVirtualPathToCurrentCollection({ fields: topLevelFields, path: item, versions }),
)
}
}
}
}
return result
}
export const sanitizeSelect = ({
fields,
forceSelect,
select,
versions,
}: {
fields: FlattenedField[]
forceSelect?: SelectType
select?: SelectType
versions?: boolean
}): SelectType | undefined => {
if (!select) {
if (!forceSelect || !select) {
return select
}
@@ -133,36 +21,5 @@ export const sanitizeSelect = ({
return select
}
if (forceSelect) {
select = deepMergeSimple(select, forceSelect)
}
if (select) {
const virtualRelations = resolveVirtualRelationsToSelect({
fields,
selectValue: select as SelectIncludeType,
topLevelFields: fields,
versions: versions ?? false,
})
for (const path of virtualRelations) {
let currentRef = select
const segments = path.split('.')
for (let i = 0; i < segments.length; i++) {
const isLast = segments.length - 1 === i
const segment = segments[i]
if (isLast) {
currentRef[segment] = true
} else {
if (!(segment in currentRef)) {
currentRef[segment] = {}
currentRef = currentRef[segment]
}
}
}
}
}
return select
return deepMergeSimple(select, forceSelect)
}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-cloud-storage",
"version": "3.37.0",
"version": "3.35.1",
"description": "The official cloud storage plugin for Payload CMS",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -26,7 +26,6 @@ export async function getFilePrefix({
const files = await req.payload.find({
collection: collection.slug,
depth: 0,
draft: true,
limit: 1,
pagination: false,
where: {

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-form-builder",
"version": "3.37.0",
"version": "3.35.1",
"description": "Form builder plugin for Payload CMS",
"keywords": [
"payload",

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-import-export",
"version": "3.37.0",
"version": "3.35.1",
"description": "Import-Export plugin for Payload",
"keywords": [
"payload",

View File

@@ -87,7 +87,7 @@ export const createExport = async (args: CreateExportArgs) => {
let isFirstBatch = true
while (result.docs.length > 0) {
const csvInput = result.docs.map((doc) => flattenObject({ doc, fields }))
const csvInput = result.docs.map((doc) => flattenObject(doc))
const csvString = stringify(csvInput, { header: isFirstBatch })
this.push(encoder.encode(csvString))
isFirstBatch = false
@@ -119,7 +119,7 @@ export const createExport = async (args: CreateExportArgs) => {
result = await payload.find(findArgs)
if (isCSV) {
const csvInput = result.docs.map((doc) => flattenObject({ doc, fields }))
const csvInput = result.docs.map((doc) => flattenObject(doc))
outputData.push(stringify(csvInput, { header: isFirstBatch }))
isFirstBatch = false
} else {

View File

@@ -1,61 +1,23 @@
import type { Document } from 'payload'
type Args = {
doc: Document
fields?: string[]
prefix?: string
}
export const flattenObject = ({ doc, fields, prefix }: Args): Record<string, unknown> => {
export const flattenObject = (obj: any, prefix: string = ''): Record<string, unknown> => {
const result: Record<string, unknown> = {}
const flatten = (doc: Document, prefix?: string) => {
Object.entries(doc).forEach(([key, value]) => {
const newKey = prefix ? `${prefix}_${key}` : key
Object.entries(obj).forEach(([key, value]) => {
const newKey = prefix ? `${prefix}_${key}` : key
if (Array.isArray(value)) {
value.forEach((item, index) => {
if (typeof item === 'object' && item !== null) {
flatten(item, `${newKey}_${index}`)
} else {
result[`${newKey}_${index}`] = item
}
})
} else if (typeof value === 'object' && value !== null) {
flatten(value, newKey)
} else {
result[newKey] = value
}
})
}
flatten(doc, prefix)
if (fields) {
const orderedResult: Record<string, unknown> = {}
const fieldToRegex = (field: string): RegExp => {
const parts = field.split('.').map((part) => `${part}(?:_\\d+)?`)
const pattern = `^${parts.join('_')}`
return new RegExp(pattern)
if (Array.isArray(value)) {
value.forEach((item, index) => {
if (typeof item === 'object' && item !== null) {
Object.assign(result, flattenObject(item, `${newKey}_${index}`))
} else {
result[`${newKey}_${index}`] = item
}
})
} else if (typeof value === 'object' && value !== null) {
Object.assign(result, flattenObject(value, newKey))
} else {
result[newKey] = value
}
fields.forEach((field) => {
if (result[field.replace(/\./g, '_')]) {
const sanitizedField = field.replace(/\./g, '_')
orderedResult[sanitizedField] = result[sanitizedField]
} else {
const regex = fieldToRegex(field)
Object.keys(result).forEach((key) => {
if (regex.test(key)) {
orderedResult[key] = result[key]
}
})
}
})
return orderedResult
}
})
return result
}
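
```ts
// Illustration only — not part of the diff. Both variants of flattenObject
// above join nested keys with underscores and fold array indices into the key;
// the removed variant additionally accepts a `fields` list to order/limit columns.
// The input document below is hypothetical.
const doc = {
  title: 'Hello',
  author: { name: 'Jane' },
  tags: [{ label: 'news' }, { label: 'tech' }],
}

// Expected flattened shape for either implementation:
const expectedFlat = {
  title: 'Hello',
  author_name: 'Jane',
  tags_0_label: 'news',
  tags_1_label: 'tech',
}
```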

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-multi-tenant",
"version": "3.37.0",
"version": "3.35.1",
"description": "Multi Tenant plugin for Payload",
"keywords": [
"payload",

View File

@@ -14,7 +14,6 @@ export const findTenantOptions = async ({
useAsTitle,
user,
}: Args): Promise<PaginatedDocs> => {
const isOrderable = payload.collections[tenantsCollectionSlug]?.config?.orderable || false
return payload.find({
collection: tenantsCollectionSlug,
depth: 0,
@@ -22,9 +21,8 @@ export const findTenantOptions = async ({
overrideAccess: false,
select: {
[useAsTitle]: true,
...(isOrderable ? { _order: true } : {}),
},
sort: isOrderable ? '_order' : useAsTitle,
sort: useAsTitle,
user,
})
}

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-nested-docs",
"version": "3.37.0",
"version": "3.35.1",
"description": "The official Nested Docs plugin for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -22,6 +22,7 @@ type ResaveArgs = {
const resave = async ({ collection, doc, draft, pluginConfig, req }: ResaveArgs) => {
const parentSlug = pluginConfig?.parentFieldSlug || 'parent'
const breadcrumbSlug = pluginConfig.breadcrumbsFieldSlug || 'breadcrumbs'
if (draft) {
// If the parent is a draft, don't resave children

View File

@@ -8,39 +8,47 @@ import type { Breadcrumb, NestedDocsPluginConfig } from '../types.js'
export const resaveSelfAfterCreate =
(pluginConfig: NestedDocsPluginConfig, collection: CollectionConfig): CollectionAfterChangeHook =>
async ({ doc, operation, req }) => {
if (operation !== 'create') {
return undefined
}
const { locale, payload } = req
const breadcrumbSlug = pluginConfig.breadcrumbsFieldSlug || 'breadcrumbs'
const breadcrumbs = doc[breadcrumbSlug] as unknown as Breadcrumb[]
const updateAsDraft =
typeof collection.versions === 'object' &&
collection.versions.drafts &&
doc._status !== 'published'
try {
await payload.update({
if (operation === 'create') {
const originalDocWithDepth0 = await payload.findByID({
id: doc.id,
collection: collection.slug,
data: {
[breadcrumbSlug]:
breadcrumbs?.map((crumb, i) => ({
...crumb,
doc: breadcrumbs.length === i + 1 ? doc.id : crumb.doc,
})) || [],
},
depth: 0,
draft: updateAsDraft,
locale,
req,
})
} catch (err: unknown) {
payload.logger.error(
`Nested Docs plugin has had an error while adding breadcrumbs during document creation.`,
)
payload.logger.error(err)
const updateAsDraft =
typeof collection.versions === 'object' &&
collection.versions.drafts &&
doc._status !== 'published'
try {
await payload.update({
id: doc.id,
collection: collection.slug,
data: {
...originalDocWithDepth0,
[breadcrumbSlug]:
breadcrumbs?.map((crumb, i) => ({
...crumb,
doc: breadcrumbs.length === i + 1 ? doc.id : crumb.doc,
})) || [],
},
depth: 0,
draft: updateAsDraft,
locale,
req,
})
} catch (err: unknown) {
payload.logger.error(
`Nested Docs plugin has had an error while adding breadcrumbs during document creation.`,
)
payload.logger.error(err)
}
}
return undefined
}
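
```ts
// Illustration only — not part of the diff. The breadcrumb rewrite in this hook
// boils down to pointing the last crumb at the newly created document; the crumb
// shape and ids below are hypothetical.
const breadcrumbs = [
  { doc: 'parent-id', label: 'Parent', url: '/parent' },
  { doc: null as null | string, label: 'Child', url: '/parent/child' },
]
const createdId = 'new-child-id'

// The hook maps over the crumbs and sets only the last one to the new id:
const updated = breadcrumbs.map((crumb, i) => ({
  ...crumb,
  doc: breadcrumbs.length === i + 1 ? createdId : crumb.doc,
}))
// updated[1].doc === 'new-child-id'
```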

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-redirects",
"version": "3.37.0",
"version": "3.35.1",
"description": "Redirects plugin for Payload",
"keywords": [
"payload",

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-search",
"version": "3.37.0",
"version": "3.35.1",
"description": "Search plugin for Payload",
"keywords": [
"payload",

View File

@@ -124,15 +124,14 @@ export const generateReindexHandler =
for (let i = 0; i < totalBatches; i++) {
const { docs } = await payload.find({
collection,
depth: 0,
limit: batchSize,
locale: localeToSync,
page: i + 1,
...defaultLocalApiProps,
})
for (const doc of docs) {
await syncDocAsSearchIndex({
const promises = docs.map((doc) =>
syncDocAsSearchIndex({
collection,
doc,
locale: localeToSync,
@@ -140,7 +139,12 @@ export const generateReindexHandler =
operation,
pluginConfig,
req,
})
}),
)
// Sequentially await promises to avoid transaction issues
for (const promise of promises) {
await promise
}
}
}

View File

@@ -64,17 +64,18 @@ export const syncDocAsSearchIndex = async ({
const doSync = syncDrafts || (!syncDrafts && status !== 'draft')
try {
if (operation === 'create' && doSync) {
await payload.create({
collection: searchSlug,
data: {
...dataToSave,
priority: defaultPriority,
},
depth: 0,
locale: syncLocale,
req,
})
if (operation === 'create') {
if (doSync) {
await payload.create({
collection: searchSlug,
data: {
...dataToSave,
priority: defaultPriority,
},
locale: syncLocale,
req,
})
}
}
if (operation === 'update') {
@@ -109,7 +110,6 @@ export const syncDocAsSearchIndex = async ({
const duplicativeDocIDs = duplicativeDocs.map(({ id }) => id)
await payload.delete({
collection: searchSlug,
depth: 0,
req,
where: { id: { in: duplicativeDocIDs } },
})
@@ -134,7 +134,6 @@ export const syncDocAsSearchIndex = async ({
...dataToSave,
priority: foundDoc.priority || defaultPriority,
},
depth: 0,
locale: syncLocale,
req,
})
@@ -149,7 +148,6 @@ export const syncDocAsSearchIndex = async ({
docs: [docWithPublish],
} = await payload.find({
collection,
depth: 0,
draft: false,
limit: 1,
locale: syncLocale,
@@ -177,7 +175,6 @@ export const syncDocAsSearchIndex = async ({
await payload.delete({
id: searchDocID,
collection: searchSlug,
depth: 0,
req,
})
} catch (err: unknown) {
@@ -193,7 +190,6 @@ export const syncDocAsSearchIndex = async ({
...dataToSave,
priority: defaultPriority,
},
depth: 0,
locale: syncLocale,
req,
})

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-sentry",
"version": "3.37.0",
"version": "3.35.1",
"description": "Sentry plugin for Payload",
"keywords": [
"payload",

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-seo",
"version": "3.37.0",
"version": "3.35.1",
"description": "SEO plugin for Payload",
"keywords": [
"payload",

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/plugin-stripe",
"version": "3.37.0",
"version": "3.35.1",
"description": "Stripe plugin for Payload",
"keywords": [
"payload",

View File

@@ -1,6 +1,6 @@
{
"name": "@payloadcms/richtext-lexical",
"version": "3.37.0",
"version": "3.35.1",
"description": "The officially supported Lexical richtext adapter for Payload",
"homepage": "https://payloadcms.com",
"repository": {

View File

@@ -284,22 +284,10 @@ export const InlineBlockComponent: React.FC<Props> = (props) => {
)
// cleanup effect
useEffect(() => {
const isStateOutOfSync = (formData: InlineBlockFields, initialState: FormState) => {
return Object.keys(initialState).some(
(key) => initialState[key] && formData[key] !== initialState[key].value,
)
}
return () => {
// If the component is unmounted (either via removeInlineBlock or via lexical itself) and the form state got changed before,
// we need to reset the initial state to force a re-fetch of the initial state when it gets mounted again (e.g. via lexical history undo).
// Otherwise it would use an outdated initial state.
if (initialState && isStateOutOfSync(formData, initialState)) {
setInitialState(false)
}
abortAndIgnore(onChangeAbortControllerRef.current)
}
}, [formData, initialState])
}, [])
/**
* HANDLE FORM SUBMIT

View File

@@ -56,15 +56,22 @@ export const BlocksPlugin: PluginComponent = () => {
if ($isRangeSelection(selection)) {
const blockNode = $createBlockNode(payload)
// we need to get the focus node before inserting the block node, as $insertNodeToNearestRoot can change the focus node
const { focus } = selection
const focusNode = focus.getNode()
// Insert block node BEFORE potentially removing focusNode, as $insertNodeToNearestRoot errors if the focusNode doesn't exist
$insertNodeToNearestRoot(blockNode)
// Delete the node if it's an empty paragraph
if ($isParagraphNode(focusNode) && !focusNode.__first) {
const { focus } = selection
const focusNode = focus.getNode()
// First, delete currently selected node if it's an empty paragraph and if there are sufficient
// paragraph nodes (more than 1) left in the parent node, so that we don't "trap" the user
if (
$isParagraphNode(focusNode) &&
focusNode.getTextContentSize() === 0 &&
focusNode
.getParentOrThrow()
.getChildren()
.filter((node) => $isParagraphNode(node)).length > 1
) {
focusNode.remove()
}
}

View File

@@ -5,12 +5,6 @@ import type {
SerializedParagraphNode,
SerializedTextNode,
SerializedLineBreakNode,
SerializedHeadingNode,
SerializedListItemNode,
SerializedListNode,
SerializedTableRowNode,
SerializedTableNode,
SerializedTableCellNode,
} from '../../../nodeTypes.js'
import { convertLexicalToPlaintext } from './sync/index.js'
@@ -57,83 +51,7 @@ function paragraphNode(children: DefaultNodeTypes[]): SerializedParagraphNode {
}
}
function headingNode(children: DefaultNodeTypes[]): SerializedHeadingNode {
return {
type: 'heading',
children,
direction: 'ltr',
format: '',
indent: 0,
textFormat: 0,
tag: 'h1',
version: 1,
}
}
function listItemNode(children: DefaultNodeTypes[]): SerializedListItemNode {
return {
type: 'listitem',
children,
checked: false,
direction: 'ltr',
format: '',
indent: 0,
value: 0,
version: 1,
}
}
function listNode(children: DefaultNodeTypes[]): SerializedListNode {
return {
type: 'list',
children,
direction: 'ltr',
format: '',
indent: 0,
listType: 'bullet',
start: 0,
tag: 'ul',
version: 1,
}
}
function tableNode(children: (DefaultNodeTypes | SerializedTableRowNode)[]): SerializedTableNode {
return {
type: 'table',
children,
direction: 'ltr',
format: '',
indent: 0,
version: 1,
}
}
function tableRowNode(
children: (DefaultNodeTypes | SerializedTableCellNode)[],
): SerializedTableRowNode {
return {
type: 'tablerow',
children,
direction: 'ltr',
format: '',
indent: 0,
version: 1,
}
}
function tableCellNode(children: DefaultNodeTypes[]): SerializedTableCellNode {
return {
type: 'tablecell',
children,
direction: 'ltr',
format: '',
indent: 0,
headerState: 0,
version: 1,
}
}
function rootNode(nodes: (DefaultNodeTypes | SerializedTableNode)[]): DefaultTypedEditorState {
function rootNode(nodes: DefaultNodeTypes[]): DefaultTypedEditorState {
return {
root: {
type: 'root',
@@ -154,6 +72,7 @@ describe('convertLexicalToPlaintext', () => {
data,
})
console.log('plaintext', plaintext)
expect(plaintext).toBe('Basic Text')
})
@@ -192,67 +111,4 @@ describe('convertLexicalToPlaintext', () => {
expect(plaintext).toBe('Basic Text\tNext Line')
})
it('ensure new lines are added between paragraphs', () => {
const data: DefaultTypedEditorState = rootNode([
paragraphNode([textNode('Basic text')]),
paragraphNode([textNode('Next block-node')]),
])
const plaintext = convertLexicalToPlaintext({
data,
})
expect(plaintext).toBe('Basic text\n\nNext block-node')
})
it('ensure new lines are added between heading nodes', () => {
const data: DefaultTypedEditorState = rootNode([
headingNode([textNode('Basic text')]),
headingNode([textNode('Next block-node')]),
])
const plaintext = convertLexicalToPlaintext({
data,
})
expect(plaintext).toBe('Basic text\n\nNext block-node')
})
it('ensure new lines are added between list items and lists', () => {
const data: DefaultTypedEditorState = rootNode([
listNode([listItemNode([textNode('First item')]), listItemNode([textNode('Second item')])]),
listNode([listItemNode([textNode('Next list')])]),
])
const plaintext = convertLexicalToPlaintext({
data,
})
expect(plaintext).toBe('First item\nSecond item\n\nNext list')
})
it('ensure new lines are added between tables, table rows, and table cells', () => {
const data: DefaultTypedEditorState = rootNode([
tableNode([
tableRowNode([
tableCellNode([textNode('Cell 1, Row 1')]),
tableCellNode([textNode('Cell 2, Row 1')]),
]),
tableRowNode([
tableCellNode([textNode('Cell 1, Row 2')]),
tableCellNode([textNode('Cell 2, Row 2')]),
]),
]),
tableNode([tableRowNode([tableCellNode([textNode('Cell in Table 2')])])]),
])
const plaintext = convertLexicalToPlaintext({
data,
})
expect(plaintext).toBe(
'Cell 1, Row 1 | Cell 2, Row 1\nCell 1, Row 2 | Cell 2, Row 2\n\nCell in Table 2',
)
})
})

Some files were not shown because too many files have changed in this diff