## Problem

Currently, attempting to run tasks in parallel results in DB errors.

**Example:**

```ts
export const fastParallelTaskWorkflow: WorkflowConfig<'fastParallelTask'> = {
  slug: 'fastParallelTask',
  handler: async ({ inlineTask }) => {
    const taskFunctions = []
    for (let i = 0; i < 20; i++) {
      const idx = i + 1
      taskFunctions.push(async () => {
        return await inlineTask(`parallel task ${idx}`, {
          input: {
            test: idx,
          },
          task: () => {
            return {
              output: {
                taskID: idx.toString(),
              },
            }
          },
        })
      })
    }

    await Promise.all(taskFunctions.map((f) => f()))
  },
}
```

On SQLite, this throws the following error:

```bash
Caught error Error: UNIQUE constraint failed: payload_jobs_log.id
    at Object.next (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/libsql@0.4.7/node_modules/libsql/index.js:335:20)
    at Statement.all (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/libsql@0.4.7/node_modules/libsql/index.js:360:16)
    at executeStmt (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5/node_modules/@libsql/client/lib-cjs/sqlite3.js:285:34)
    at Sqlite3Client.execute (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5/node_modules/@libsql/client/lib-cjs/sqlite3.js:101:16)
    at /Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/libsql/session.ts:288:58
    at LibSQLPreparedQuery.queryWithCache (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/sqlite-core/session.ts:79:18)
    at LibSQLPreparedQuery.values (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/libsql/session.ts:286:21)
    at LibSQLPreparedQuery.all (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/libsql/session.ts:214:27)
    at QueryPromise.all (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/sqlite-core/query-builders/insert.ts:402:26)
    at QueryPromise.execute (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/sqlite-core/query-builders/insert.ts:414:40)
    at QueryPromise.then (/Users/alessio/Documents/GitHub/payload/node_modules/.pnpm/drizzle-orm@0.44.2_@libsql+client@0.14.0_bufferutil@4.0.8_utf-8-validate@6.0.5__@opentelemetr_asjmtflojkxlnxrshoh4fj5f6u/node_modules/src/query-promise.ts:31:15) {
  rawCode: 1555,
  code: 'SQLITE_CONSTRAINT_PRIMARYKEY',
  libsqlError: true
}
```

## Solution

The errors were caused by inefficient DB update calls. After each task completes, we need to update the log array in the `payload-jobs` collection. On Postgres, that array is stored in a separate table. Previously, the update worked like this:

1. Delete every row in the log table
2. Re-insert every single row, including the new one

If multiple processes do this concurrently, the database throws errors, and conflicting writes can drop new log rows.

This PR makes use of the [new db `$push` operation](https://github.com/payloadcms/payload/pull/13453) we recently added to atomically push a new log row to the database in a single round-trip. This not only reduces the number of DB round-trips (=> a faster job queue system) but also allows multiple tasks to perform this DB operation in parallel without conflicts.

---

- To see the specific tasks where the Asana app for GitHub is being used, see below:
  - https://app.asana.com/0/0/1211001438499053
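The delete-and-reinsert race described above can be sketched with a minimal in-memory model. This is not Payload's actual code; the table, row shape, and helper names are made up for illustration. Two writers read the same snapshot, both wipe the "table", and both re-insert their copy, so the second writer hits the unique-id constraint, exactly the failure mode shown in the SQLite trace. An append-only write has no shared snapshot and cannot collide:

```typescript
// Hypothetical in-memory model of the payload_jobs_log table (illustrative only).
type LogRow = { id: number; taskID: string }

// "Table" with a UNIQUE constraint on `id`
const table = new Map<number, LogRow>()

const insert = (row: LogRow): void => {
  if (table.has(row.id)) {
    // Mirrors SQLite's `UNIQUE constraint failed: payload_jobs_log.id`
    throw new Error(`UNIQUE constraint failed: id ${row.id}`)
  }
  table.set(row.id, row)
}

// Old strategy: each writer deletes everything, then re-inserts its
// snapshot plus the new row. Interleave two writers and they collide.
insert({ id: 1, taskID: 'seed' })
const snapshotA = [...table.values()]
const snapshotB = [...table.values()] // both writers read the same snapshot

table.clear() // writer A deletes all rows
table.clear() // writer B deletes all rows
for (const row of [...snapshotA, { id: 2, taskID: 'a' }]) insert(row) // A re-inserts

let conflict = ''
try {
  // Writer B re-inserts the same snapshot -> id 1 already exists again
  for (const row of [...snapshotB, { id: 3, taskID: 'b' }]) insert(row)
} catch (err) {
  conflict = (err as Error).message
}
console.log(conflict) // UNIQUE constraint failed: id 1

// New strategy ($push-style): append only the new row, atomically.
// No delete step and no shared snapshot, so parallel writers cannot collide.
table.clear()
insert({ id: 1, taskID: 'seed' })
insert({ id: 2, taskID: 'a' }) // writer A pushes its row
insert({ id: 3, taskID: 'b' }) // writer B pushes its row
console.log(table.size) // 3
```

Depending on interleaving, the old strategy either throws (as above) or silently loses whichever writer's row was re-inserted first, which is why both symptoms show up in practice.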
60 lines · 1.6 KiB · TypeScript
```ts
import type { Payload } from 'payload'

/* eslint-disable jest/require-top-level-describe */

import assert from 'assert'
import path from 'path'
import { fileURLToPath } from 'url'

import { initPayloadInt } from '../helpers/initPayloadInt.js'
import { withoutAutoRun } from './utilities.js'

const filename = fileURLToPath(import.meta.url)
const dirname = path.dirname(filename)

const describePostgres = process.env.PAYLOAD_DATABASE?.startsWith('postgres')
  ? describe
  : describe.skip

let payload: Payload

describePostgres('queues - postgres logs', () => {
  beforeAll(async () => {
    const initialized = await initPayloadInt(
      dirname,
      undefined,
      undefined,
      'config.postgreslogs.ts',
    )
    assert(initialized.payload)
    assert(initialized.restClient)
    ;({ payload } = initialized)
  })

  afterAll(async () => {
    await payload.destroy()
  })

  it('ensure running jobs uses minimal db calls', async () => {
    await withoutAutoRun(async () => {
      await payload.jobs.queue({
        task: 'DoNothingTask',
        input: {
          message: 'test',
        },
      })

      // Count every console log (= db call)
      const consoleCount = jest.spyOn(console, 'log').mockImplementation(() => {})

      const res = await payload.jobs.run({})

      expect(res).toEqual({
        jobStatus: { '1': { status: 'success' } },
        remainingJobsFromQueried: 0,
      })
      expect(consoleCount).toHaveBeenCalledTimes(14) // Should be 14 sql calls if the optimizations are used. If not, this would be 22 calls
      consoleCount.mockRestore()
    })
  })
})
```