Folder Export in Console using Bulk Copy (BCP)

SUGGESTED

I have a folder that I want to take an SVG export of to refresh into the UAT environment. However, the SVG process takes a very long time to export because of the size of the data (~2 TB database).
Now I am trying the BCP option to see if it can speed up the process, but I get an error message when I check the BCP option. The same option does not work on the SEED folder either.
If anyone has an idea, please help.

We are on V11 P22, SQL Server 2016.

  • 0
    SUGGESTED

    Use the Sage X3 v12 P34 Console component and try the normal export with multithreading:  New Multithreading option now available 

    Note that I have never tried this on a Sage X3 v11 install, but in theory it should work. If you run into any issues, you can simply put the X3 v11 Console back.

    If the backup is meant to cover a DR situation, try Bruno's plan here:  RE: X3 Backup Strategies 

  • 0
    SUGGESTED

    Hi Kudakwashe,

    Do you have the correct path set in 'database.software.dbhome'? Can you confirm that osql.exe is in "D:\Program Files\Microsoft SQL Server\130\Tools\binn"?

    For a 2TB database it is probably better to do a UAT refresh via a SQL backup and restore (the caveat being that folder names would end up being the same on both environments, but you could mitigate that using an alternate endpoint name).
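
    For reference, a refresh along those lines might look like the sketch below. These are plain SQL Server commands, not Console steps, and every name in them is a placeholder rather than a value from this thread: the database names (x3prod, x3uat), the backup share, and the logical file names (x3prod_data, x3prod_log) would all need to be replaced with your own. COPY_ONLY is used so the production backup chain is not disturbed.

      -- Sketch only: all names and paths below are placeholders, not values from this thread.
      -- 1) On the production instance: take a copy-only backup so the normal backup chain is untouched.
      BACKUP DATABASE [x3prod]
        TO DISK = N'\\backupshare\x3prod_refresh.bak'
        WITH COPY_ONLY, COMPRESSION, STATS = 10;

      -- 2) On the UAT instance: list the logical file names held inside the backup.
      RESTORE FILELISTONLY
        FROM DISK = N'\\backupshare\x3prod_refresh.bak';

      -- 3) Restore under the UAT database name, relocating the files to the UAT drives.
      RESTORE DATABASE [x3uat]
        FROM DISK = N'\\backupshare\x3prod_refresh.bak'
        WITH MOVE N'x3prod_data' TO N'D:\SQLDATA\x3uat.mdf',
             MOVE N'x3prod_log'  TO N'E:\SQLLOG\x3uat_log.ldf',
             REPLACE, STATS = 10;

    After the restore, the UAT solution/endpoint still has to be pointed at the restored data, which is where the alternate endpoint name mentioned above would come in.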

  • 0 in reply to Carl Herrmann

    Hi Carl,
    I can see that 'database.software.dbhome' is set to '<database.software.dbhome>D:\Program Files\Microsoft SQL Server\130\Tools</database.software.dbhome>' in the solution file. Yes, osql.exe exists in that directory.

    A SQL refresh would be the best bet, but then we would lose some changes that are on UAT and not yet in production with regard to SCREENS, ACTIONS, FUNCTIONS and TABLES.

  • 0 in reply to chunheng

    Hi Chunheng, sure, let me try to get the P34 Console and test the SVG extract. The problem is that our business runs 24/7, so we don't have off-peak hours in which to perform the activity, and doing it while users are transacting means we will end up with inconsistent data.
    Let me get the latest Console and test this.

  • 0
    SUGGESTED

    On large databases there are several tables that can grow very large over time and can be omitted from the SVG extraction (a quick size check for these is sketched at the end of this post):

    ALISTER, ABATRQT (X3 folder), AESPION, ATMPTRA, AREPORTM, PRTSCRWRK, AWRKHISMES and AWRKHISJOI

    I usually keep on top of these by running the purge functions on a weekly basis, so that only the relevant records are kept for a given length of time.

    ALISTER is usually large if it is never purged, as it otherwise holds a record of every requester ever run.

    Also check that somebody has not set up excessive audit trails that are storing parameter accesses.

    If you also archive other old data to an archive folder, that reduces the amount of data in the live folder that has to be copied to a test folder.
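
    As a rough illustration (not from the reply above), the query below reports the row counts and reserved space of the tables mentioned, so you can see which ones are actually worth purging or omitting. It assumes the X3 folder maps to a schema in the SQL Server database; replace 'PROD' with your own folder name.

      -- Sketch: size report for the tables listed above.
      -- Assumes the X3 folder is a schema in the database (replace 'PROD' with the folder name).
      SELECT s.name AS FolderSchema,
             t.name AS TableName,
             SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS TotalRows,
             SUM(ps.reserved_page_count) * 8 / 1024 AS ReservedMB
      FROM sys.dm_db_partition_stats AS ps
      JOIN sys.tables  AS t ON t.object_id = ps.object_id
      JOIN sys.schemas AS s ON s.schema_id = t.schema_id
      WHERE s.name = 'PROD'
        AND t.name IN ('ALISTER', 'ABATRQT', 'AESPION', 'ATMPTRA', 'AREPORTM',
                       'PRTSCRWRK', 'AWRKHISMES', 'AWRKHISJOI')
      GROUP BY s.name, t.name
      ORDER BY ReservedMB DESC;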

  • 0 in reply to Kingy

    Thanks for the suggestions, we will try that out. Currently we are purging ALISTER on a daily basis, and auditing is only enabled on a few tables such as BPs, AUTILIS, BID and Payments.

    We will probably go with Carl's suggestion to use a SQL refresh, as that will be the quickest approach.