I've got a custom page that includes the ability to back up a collection prior to dropping it and importing new data into it. Kind of at a loss as to where to start here.
Managed to get it working through a lot of trial and error:
import { Endpoint } from 'payload/config'
import { MongoClient } from 'mongodb'
import path from 'path'
import fs from 'fs/promises'

const backupThings: Endpoint = {
  path: '/api/backup-thing',
  method: 'get',
  root: true,
  handler: async (req, res, next) => {
    try {
      // Connect directly to the database backing Payload
      const client = await MongoClient.connect(process.env.MONGODB_URI || '', {})
      const db = client.db(process.env.MONGO_DB || '')
      const collection = db.collection('things')

      // Dump the entire collection to a timestamped JSON file
      const things = await collection.find({}).toArray()
      const backup = JSON.stringify(things)
      const backupPath = path.resolve(__dirname, `../backups/things-${Date.now()}.json`)
      await fs.writeFile(backupPath, backup)
      await client.close()

      res.status(200).send({
        message: 'Backup of things successful',
      })
    } catch (err) {
      console.error(err)
      res.status(500).send({
        message: 'Error occurred during request',
        error: err,
      })
    }
  },
}

export default backupThings
It also required a bunch of webpack fallback settings to get it compiling:
const config: Config = {
  admin: {
    // Stub out server-only modules so they're excluded from the admin bundle
    webpack: (config) => ({
      ...(config || {}),
      resolve: {
        ...(config.resolve || {}),
        fallback: {
          ...(config.resolve.fallback || {}),
          'fs/promises': false,
          'mongodb': false,
          'mssql': false,
          'fs': false,
          'stream': false,
          'timers': false,
          'dns': false,
          'constants': false,
          'dgram': false,
          'zlib': false,
          'kerberos': false,
          '@mongodb-js/zstd': false,
          'snappy': false,
          'aws4': false,
          'mongodb-client-encryption': false,
          '@aws-sdk/credential-providers': false,
        },
      },
    }),
  },
}
One thing I keep running into: as I change aliases/fallbacks, the build log sometimes serves up stale results. Deleting and reinstalling my node_modules folder usually clears it up. Even with this setup I still get a warning because mongodb does a dynamic require of modules.
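If that warning is just noise, webpack 5 has an ignoreWarnings option that can filter it out. A minimal sketch, assuming webpack 5 and guessing at the exact warning text (the regex is an assumption, not something confirmed in this thread):

const config: Config = {
  admin: {
    webpack: (config) => ({
      ...config,
      // Filter out mongodb's "Critical dependency: the request of a dependency
      // is an expression" warning; the regex is a guess at its wording.
      ignoreWarnings: [
        ...(config.ignoreWarnings || []),
        /the request of a dependency is an expression/,
      ],
    }),
  },
}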
Hope someone finds this useful or has some feedback to provide. Maybe I'm just doing something the hard way? lol. time for sleep now... 💤
Is it just data that you're trying to get?
How deeply nested is it, how do you want to back it up exactly and what kind of fields are you using?
Depending on the complexity, you could literally do a REST or GQL fetch, get all the data in JSON form, then store it in a JSON field
Rolling it back from this form isn't the easiest thing ever though, so you may want to create a hidden mirror collection for situations where you need rollbacks.
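For reference, a minimal sketch of that fetch approach against the things collection; the URL, limit, and depth values are assumptions, not anything confirmed in this thread:

const snapshotThings = async (): Promise<string> => {
  // Pull every doc as plain JSON via Payload's REST API. limit is assumed
  // to be high enough to cover the whole collection; depth=0 skips population.
  const res = await fetch('http://localhost:3000/api/things?limit=10000&depth=0')
  const { docs } = await res.json()
  return JSON.stringify(docs)
}

From there the string can be written to disk or stored in a JSON field for later rollback.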
I basically just need to copy a collection for a potential rollback in case something goes wrong with the import process I'm writing.
The depth is at most 2 atm, I think. What I've pasted works though: a new JSON copy of the collection is output every time I hit the endpoint. I read that a JSON file can be imported straight into a collection, so that basically covers my needs (I hope).
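On the rollback side, a sketch of re-importing one of those files with the same driver; the collection name mirrors the endpoint above, everything else (function name, file handling) is an assumption:

import { MongoClient } from 'mongodb'
import fs from 'fs/promises'

// Restore a backup file written by the endpoint above.
const restoreThings = async (backupFile: string) => {
  const client = await MongoClient.connect(process.env.MONGODB_URI || '')
  try {
    const collection = client.db(process.env.MONGO_DB || '').collection('things')
    const docs = JSON.parse(await fs.readFile(backupFile, 'utf8'))
    await collection.deleteMany({}) // wipe whatever the failed import left behind
    if (docs.length) await collection.insertMany(docs)
  } finally {
    await client.close()
  }
}

One caveat: JSON.stringify turns ObjectId _id values into strings, so a restore like this re-inserts them as strings unless you convert them back first.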
def would be ideal if I could get the same data without adding the extra dependencies
Tried out a little refactor to avoid having to use mongodb's MongoClient module. It seems to work, but the keys of the query are a bit different: _id becomes id, and there's no __v field... not sure if that matters. Would hope it doesn't, since it def simplifies the dependency tree.
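For anyone curious, a sketch of what that refactor might look like going through req.payload (the Local API); the limit value and depth: 0 are assumptions:

import { Endpoint } from 'payload/config'
import path from 'path'
import fs from 'fs/promises'

const backupThings: Endpoint = {
  path: '/api/backup-thing',
  method: 'get',
  root: true,
  handler: async (req, res) => {
    // The Local API returns docs keyed by `id` with no `__v`, which matches
    // the difference described above. limit is assumed large enough to
    // cover the whole collection.
    const { docs } = await req.payload.find({
      collection: 'things',
      depth: 0,
      limit: 100000,
    })
    const backupPath = path.resolve(__dirname, `../backups/things-${Date.now()}.json`)
    await fs.writeFile(backupPath, JSON.stringify(docs))
    res.status(200).send({ message: 'Backup of things successful' })
  },
}

export default backupThings

That drops mongodb from the dependency tree entirely, though the fs/path usage presumably still needs the fallbacks above to keep the admin bundle compiling.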
This speaks to having a need for import/export built into Payload. If you could do this through the UI, would you?
In my current case, no. The reason I have the import process set up is to make data from an MSSQL db available to relationship fields in Payload. So I literally have a collection that matches the structure of the incoming data, then I drop the contents and replace them with the new set of data. It's ideal that a backup is automated in case that process fails for whatever reason and we need to roll back.
But if it was built into Payload's local API, that'd be amazing.
I also think a UI for this would just add value.