In DEV environments you typically have test data covering various use cases, but not necessarily all of the data that exists in PROD.
One option is to identify the production data that is suitable for testing and copy only that, using Package Manager, while excluding the (often large) parts that bring no value for testing.
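Such a filtered copy can be driven over Package Manager's HTTP interface with curl. Below is a dry-run sketch that only prints the calls it would make; the host name, credentials, package name, and filter path are placeholders, and the package (with its path filters) is assumed to have been created in Package Manager beforehand.

```shell
#!/usr/bin/env bash
# Dry-run sketch: prints the Package Manager curl calls instead of executing them.
# All host names, credentials, and package/path names below are placeholders.
SRC="http://prod-author:4502"
AUTH="admin:admin"   # use a dedicated service account in practice
# The package is assumed to already exist with a filter such as
# /content/my-site/en/test (only the subtree worth testing).
PKG="etc/packages/my_packages/test-content.zip"

# 1. (Re)build the filtered package on the source instance
BUILD_CMD="curl -u ${AUTH} -X POST ${SRC}/crx/packmgr/service/.json/${PKG}?cmd=build"

# 2. Download the built package for installation elsewhere
DOWNLOAD_CMD="curl -u ${AUTH} -o test-content.zip ${SRC}/${PKG}"

echo "$BUILD_CMD"
echo "$DOWNLOAD_CMD"
```

To actually run the transfer, execute the printed commands (or replace the echoes with the curl calls) and upload/install the downloaded zip on the target instance via the same service endpoint.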
An easy solution could be something like this (I have done it several times):
An automated nightly (scheduled) job backs up content/assets/source under specific paths from one Author instance to another environment.
Basically, we created scripts that Jenkins runs automatically every night: they prepare and build backup packages for the target site (based on specific content/assets/source paths) and upload them to Amazon S3 (or any other storage). This setup is past the test phase for us.
Variables in the scripts allow them to run against dev/qa/stage/prod or any other environment, and they are easily adaptable to additional environment setups as well.
After that, we take the packages from Amazon S3 and install them in the lower environments, so those environments always start clean. In addition, all backups follow their own SCM-style versioning strategy in case we need to restore a specific version.
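The nightly flow above could be sketched roughly as follows. This is again a dry run that only prints the commands (swap the `run` helper for real execution); the environment names, hosts, credentials, bucket, and package name are all assumptions, and the date-stamped S3 prefix stands in for the versioning strategy.

```shell
#!/usr/bin/env bash
# Dry-run sketch of the nightly Jenkins job: build the backup package on the
# source Author, park it in S3 under a dated prefix, then restore it to the
# target environment. All names below are placeholders.
ENVIRONMENT="${1:-dev}"                      # dev/qa/stage/prod, passed in by Jenkins
SRC="http://prod-author:4502"
DST="http://${ENVIRONMENT}-author:4502"
AUTH="admin:admin"                           # use a service account in practice
STAMP="$(date +%Y-%m-%d)"
PKG="etc/packages/backups/site-backup.zip"
BUCKET="s3://my-backup-bucket/aem/${STAMP}"  # dated prefix = restorable versions

run() { echo "+ $*"; }                       # replace body with: "$@" to execute

# Build and download the backup package on the source instance
run curl -u "$AUTH" -X POST "${SRC}/crx/packmgr/service/.json/${PKG}?cmd=build"
run curl -u "$AUTH" -o site-backup.zip "${SRC}/${PKG}"

# Park the package in S3, then fetch it back on the target side
run aws s3 cp site-backup.zip "${BUCKET}/site-backup.zip"
run aws s3 cp "${BUCKET}/site-backup.zip" site-backup.zip

# Upload and install on the target environment
run curl -u "$AUTH" -F file=@site-backup.zip -F force=true -F install=false \
    "${DST}/crx/packmgr/service.jsp"
run curl -u "$AUTH" -X POST "${DST}/crx/packmgr/service/.json/${PKG}?cmd=install"
```

Keeping upload (`install=false`) and install as separate steps gives a natural review point before the content actually lands in the target environment.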
Currently there are differences between Prod and Dev, so we are trying to get the two in sync. We don't have enough space in Dev to copy a large datastore, and doing it often would take a long time, so we are looking for a better way to keep them synchronized.
Thanks. In our case we don't want to copy the datastore, just the instance, so that the configurations are the same in Prod and Dev. If we copy the instance from Prod to Dev and point it at the existing Dev datastore (or an empty one), will it work? Does the segment store hold references to content in the datastore?