Question on validating SVG folder

SOLVED

Hi,

Context:

- Migration work.

- Sage X3 pu9p5, Sage X3 v11p22, and Sage X3 v12p34 (the destination version; note that I haven't tested bringing the v12 console down to lower versions for multi-threaded SVG backup work).

- SQL Server 2022 Standard Edition on the v12 server. Assume SQL Server 2014/2016 Standard Edition otherwise.

- The SVG from a manual backup is 500 GB. Starting at 11:00 pm server local time, the backup took well over 15 hours, exceeding the daily maintenance window of only 3 hours by a wide margin. Audit files have been trimmed in advance by a monthly purge schedule and sit at just 20 GB at the time of writing. Other tables were considered for trimming outside the standard purge code, but they are key operational tables that cannot be trimmed, such as GACCENTRYD, which is 100 GB on its own. Old-data archiving has already been done; the archive folder (H* designation) is 1 TB and growing weekly.

- There are other recurring tasks running after the maintenance period, including ingestion of import template files received in a designated custom Sage X3 volume folder or on demand via the Sage X3 REST API. I have data-consistency concerns here, since data in a mid-import state could be snapshotted into the SVG backup.

The standard Sage X3 approach is a daily DOSSVG recurring task as the main form of backup. However, DOSSVG backs up each table of the endpoint schema as of the moment that table is extracted (tables are processed in letter sequence from A to Z), and the run now extends past the maintenance window into the client's working hours, so:

[1] How can I guarantee that the SVG backup data is consistent as of the backup's start time?

[2] Is there a way to validate an SVG folder, i.e., confirm that all data inside can be imported by the Sage X3 Console with no issues?

I recently observed a single-table import failure in the Sage X3 Console on an SVG folder: an intermittent issue where, while the file was being imported into Sage X3, a record hit a duplicate entry on a unique index. When I checked with the development team to confirm the duplicate, the record could not be found. For reference, that SVG folder was backed up during working hours because the server had been taken down by a Windows Update and was manually restarted by the IT team; as soon as Sage X3 and the Batch Server were up, the recurring task scheduler force-triggered multiple tasks at the same time because they had missed their original schedule.

While I could technically use a SQL Server FULL recovery model backup of the whole Sage X3 database, I am currently migrating the Sage X3 folder to v12 and would prefer not to accidentally overwrite the v12 X3 schema with an older-version X3 schema during an import and conversion run.
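For reference, here is a hedged sketch of the SQL-native route I would consider (all server, database, and path names below are placeholders, not our real environment): a COPY_ONLY backup restored under a different database name, so the live database is never touched:

```powershell
# Hedged sketch: COPY_ONLY does not disturb the FULL-recovery log chain, and
# restoring WITH MOVE under a *different* name avoids overwriting the live
# database. Server, database, and file names are placeholders.
$server = 'SQLSRV01'

# Out-of-band full backup of the source (older-version) database.
& sqlcmd.exe -S $server -E -b -Q "BACKUP DATABASE [x3v11] TO DISK = N'D:\backup\x3v11.bak' WITH COPY_ONLY, COMPRESSION, CHECKSUM;"

# Restore side by side for validation only; the logical file names would come
# from RESTORE FILELISTONLY and are placeholders here.
& sqlcmd.exe -S $server -E -b -Q "RESTORE DATABASE [x3v11_verify] FROM DISK = N'D:\backup\x3v11.bak' WITH MOVE 'x3v11_data' TO N'D:\data\x3v11_verify.mdf', MOVE 'x3v11_log' TO N'D:\data\x3v11_verify.ldf';"

if ($LASTEXITCODE -ne 0) { Write-Error 'Backup or restore step failed.' }
```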

Top Replies

  • +1
    verified answer

    Hi,
    Using the latest console on a former version of X3 is possible, even V6. Just make sure you have at least PowerShell 5.1 installed.
    With the latest console, use bcp parallel import/export instead of SVG. It will go much faster in both export and import.

    There are no application-consistency features with SQL Server. You need the system to be stopped when you migrate.
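    As a rough illustration of the parallel idea (this is not the console's actual generated script; server, database, and table names are placeholders):

    ```powershell
    # Hedged sketch: export several tables concurrently with bcp in native
    # format. Assumes bcp.exe is on PATH; all names below are placeholders.
    $server = 'SQLSRV01'; $db = 'x3v11'; $outDir = 'D:\bcp_export'
    $tables = @('SEED.GACCENTRY', 'SEED.GACCENTRYD', 'SEED.ITMMASTER')

    $jobs = foreach ($t in $tables) {
        Start-Job -ScriptBlock {
            param($server, $db, $table, $outDir)
            $base = Join-Path $outDir ($table.Replace('.', '_'))
            # -n native format, -T trusted connection, -e row-level error file
            & bcp.exe "$db.$table" out "$base.bcp" -S $server -T -n -e "$base.err" > "$base.log"
            $LASTEXITCODE   # emit bcp's exit code as the job result
        } -ArgumentList $server, $db, $t, $outDir
    }

    $codes = $jobs | Wait-Job | Receive-Job
    if ($codes | Where-Object { $_ -ne 0 }) { Write-Error 'At least one export failed.' }
    ```

    Import would be the mirror image (`bcp ... in ...`), ideally with the X3 services stopped, per the consistency point above.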

  • 0 in reply to Bruno Gonzalez

    Hi Mr.  ,

    I am now test-running SQL Server's bcp in a 4-processor parallel run, using the Windows batch command file generated by the Sage X3 Console.

    The backup completed in ~1 hour. I will try the restore method later when I am available to observe.

    Based on the script, it seems that on the backup server I will need:

    - Sage X3 Console.

    - Access to Sage X3 runtime.

    - `cmd` or PowerShell (7 preferred)

    I am curious whether there is a way to keep track of this run, since I need to know whether it passed and is safe to back up, or failed for any reason, and if so what went wrong and when. Should this be handled from [1] the Sage X3 side (I am concerned about how Sage X3 would locate the script, and about security, since Sage X3 could bypass server security and run, as admin, a trojan horse or ransomware script posing as a backup script), or [2] by the server backup team's script inside the Sage X3 server (on my side, the trace lives in the runner's AppData at `C:\Users\[user]\AppData\Roaming\Sage\Console\traces\exportfolder.tra`)?

    Is there any error code I can check from the command line in case of backup failure?
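    For the tracking side, this is roughly what I have in mind (paths are placeholders, and whether the console-generated .bat propagates bcp's exit code is an assumption I still need to verify by forcing a failure):

    ```powershell
    # Hedged sketch: wrap the console-generated batch file, capture its exit
    # code, and independently scan the console trace for error markers.
    $script = 'D:\sage\backup\exportfolder.bat'                      # placeholder path
    $trace  = "$env:APPDATA\Sage\Console\traces\exportfolder.tra"    # per the path above

    & cmd.exe /c $script
    $exit = $LASTEXITCODE

    # The trace's error wording is a guess; adjust the pattern after
    # inspecting a real failed run.
    $traceErrors = Select-String -Path $trace -Pattern 'error|failed'

    if ($exit -ne 0 -or $traceErrors) {
        Write-Error "Backup run failed: exit code $exit; see $trace"
        exit 1
    }
    Write-Output 'Backup run passed; safe to hand off to the backup team.'
    ```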

  • +1 in reply to chunheng
    verified answer

    1 hour vs 15 hours, not too bad 🙂

    The console trace shows the progress. I don't know if there is a specific error code; there might be, so you can try making the backup fail and see. You already have a properly formed trace to analyze.
    The backup script is triggered manually by the console. If you want to use it in batch outside that context, then it becomes your responsibility. If you have a trojan horse on the server infecting the backup script, then you are already dead 🙂. It means the attacker is already inside your system. If Sage were shipping a corrupted script, it would be difficult to do anything about it...

    A bcp export dump can be done manually without X3. If you or the customer need to do something specific, or don't trust the Sage-delivered components, you can always create your own scripts to extract table structures, table contents, and the X3-required directories from the Folders directory tree, as sketched below.
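    A hedged sketch of that do-it-yourself route (server, database, and schema names are placeholders; the Folders directory tree still has to be copied separately at the file-system level):

    ```powershell
    # Hedged sketch: enumerate the folder schema's tables from SQL Server,
    # then dump each table's structure (format file) and contents (native
    # data) with bcp. All names are placeholders.
    $server = 'SQLSRV01'; $db = 'x3v11'; $schema = 'SEED'; $outDir = 'D:\manual_export'

    # -h -1 suppresses column headers, -W trims trailing spaces
    $tables = & sqlcmd.exe -S $server -d $db -E -h -1 -W -Q `
        "SET NOCOUNT ON; SELECT name FROM sys.tables WHERE schema_id = SCHEMA_ID('$schema');"

    foreach ($t in ($tables | Where-Object { $_ })) {
        $base = Join-Path $outDir $t
        # Structure: bcp's format file describes the table's columns.
        & bcp.exe "$db.$schema.$t" format nul -f "$base.fmt" -n -S $server -T
        # Contents: the rows themselves, in native format.
        & bcp.exe "$db.$schema.$t" out "$base.bcp" -n -S $server -T -e "$base.err"
        if ($LASTEXITCODE -ne 0) { Write-Warning "Export failed for $t" }
    }
    ```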