two servers connecting to one database

What is the worst thing that can happen when you connect two servers to a single database?

It's a transitional phase: one server carries the heavy users and the other handles lighter tasks, though the workload on it is increasing here and there.

The end goal is to separate the application server from the database server.

  • Hi, if you connect two Sage 300 application server "sites" to the same SQL server and users attempt to concurrently access a company database, the best thing that can happen is a database lock preventing users from one, or eventually both, sites from logging into that company database. The worst thing that can happen is horrific database corruption. With pre-Enhanced Security sites you're likely to get locks first; at a post-Enhanced Security site, heaven knows what will happen, it could be a complete mess. It's also a violation of site licensing rules.

    There should never be a need to do this, so what is the business case? If you're operating an RDS environment, simply deploy the Workstation Setup components on the second server, NOT the main server installer components. This will spread out the load without risking all the problems of having two server sites.

    Please provide detail on your server infrastructure layout, the problems you're experiencing, and what you want to improve (e.g. migrating to a more powerful server) and we'll do our best to help put a safe technical procedure together. Also, please try to get in touch with your Sage 300 Business Partner, as they're generally your #1 contact for issues like these. Good luck!...Tim.

  • In reply to Tim - Accsys Consulting

    Hi, thank you.

    It's a transitional process for a 24-hour client. The backup server isn't licensed, but it will allow more of a handover-takeover approach rather than a shutdown-and-migrate. They can't afford the downtime.

    The main issue stems from high SQL memory consumption caused by other third-party products, so the best approach is to separate the database server and the application server. By adding another server and migrating the users across, the current box becomes the database server only; Sage will be uninstalled from it so that it houses just SQL and the databases.

    But your advice is highly noted.

  • In reply to Mr_Mayor

    Hi, thank you for providing this information. It's all very "doable" depending on the version of Sage 300 you're using. Can you please let me know which version of Sage 300 and which Product Update you're running? The answer will differ depending on whether you're using an Enhanced Security version or a classic ISAM version.

    Meanwhile, remember that using the Simple Recovery model, together with performing index creation and index rebuild maintenance on your Sage 300 SQL server (if this hasn't been run previously), can slash the overhead and boost performance by 60-70% or more. While planning the migration, organise a short maintenance outage, make the following changes and run the following scripts on the largest test company database first, followed by the largest live company database if the test is successful, one night at a time. After a few nights you may see a significant performance boost.

    I do not know your individual site circumstances, complexity and data structures, so I cannot be sure an adverse effect won't occur. Therefore please consult with your Microsoft SQL DBA and your Sage 300 Consultant and/or Business Partner before executing the following scripts to potentially boost performance:

    Consider changing to the Simple Recovery model:

    If you have a high-transaction-volume Sage 300 database, a significant amount of memory and processor overhead is taken up by transaction recovery logging when Database Properties -> Options -> Recovery Model is set to Full. Provided you have a proper SQL backup routine in place that has been tested and confirmed able to restore successfully, and that takes backups often enough to meet the company's maximum acceptable data loss, you can change the Recovery Model to Simple to stop the logging overhead. You can then shrink the log file to clear it away.

    Recovering data from the SQL transaction logs in Sage 300 is rare and complicated (in my experience); you should instead depend on a proper Full + Differential or Full + Incremental backup policy that allows you to restore fully to an acceptable point in time should something go wrong. This backup process should be verified regularly as working. You should never need to depend on SQL transaction logs in Sage 300 for any sort of roll-back. Of course you should also be sending backups offsite to cloud and physical media. In summary, in conjunction with your SQL DBA and Sage advisory team, you could try changing the recovery model of your Sage 300 databases to Simple.
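    The recovery model change and log shrink described above can be sketched in T-SQL roughly as follows. This is a hedged sketch, not an official Sage script: the database name `SAGE300DB` and the logical log file name `SAGE300DB_log` are placeholders you must replace with your own, and you should confirm the steps with your DBA first.

```sql
-- Sketch only: placeholder names, confirm with your DBA before running.
-- 1. Check the current recovery model of the company database.
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'SAGE300DB';

-- 2. Switch the database to the Simple recovery model.
ALTER DATABASE [SAGE300DB] SET RECOVERY SIMPLE;
GO

-- 3. Find the logical log file name, then shrink the log.
USE [SAGE300DB];
GO
SELECT name FROM sys.database_files WHERE type_desc = 'LOG';
DBCC SHRINKFILE (N'SAGE300DB_log', 1);  -- target size in MB
GO
```

    Only do this after confirming your full backup routine restores successfully, since Simple recovery removes the ability to restore to a point in time between backups.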

    Perform Indexing on your Sage 300 company databases:

    You can execute the following script on your Sage 300 databases (start with a test database on the first night) to suggest and create additional indexes. Note that this process will generally identify tables where read speeds can be increased, with some trade-off in slower bulk insert, update and delete speeds (which Sage 300 doesn't really do in normal use). Commonly, indexes are identified in high-volume areas such as the Bank Reconciliation tables, AP Invoice tables and the GL Posting tables. If you frequently perform large data inserts/updates through file imports via the Sage 300 interface or API (you shouldn't be injecting records directly through SQL), then make sure you test the performance of those inserts/updates after creating the indexes on your test database first. Run the following script in a Query window against your first test database, then copy the Create statements that appear in the results, paste them into a new Query window for the same database, and execute them to create the newly identified indexes:
    Suggest Indexes MSSQL Script
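    The linked script isn't reproduced in this thread, but the general technique it describes can be sketched with SQL Server's missing-index DMVs, which generate candidate CREATE INDEX statements for the current database. This is an assumption-laden sketch, not the exact script from the link; review every suggestion with your DBA before executing it.

```sql
-- Sketch only: builds candidate CREATE INDEX statements from the
-- missing-index DMVs for the current database. Review before running
-- any of the generated statements.
SELECT TOP (25)
    CONVERT(decimal(18,2),
        gs.avg_total_user_cost * gs.avg_user_impact
        * (gs.user_seeks + gs.user_scans)) AS improvement_measure,
    'CREATE INDEX IX_' + OBJECT_NAME(id.object_id)
        + '_' + CONVERT(varchar(10), id.index_handle)
        + ' ON ' + id.statement
        + ' (' + ISNULL(id.equality_columns, '')
        + CASE WHEN id.equality_columns IS NOT NULL
               AND id.inequality_columns IS NOT NULL THEN ',' ELSE '' END
        + ISNULL(id.inequality_columns, '') + ')'
        + ISNULL(' INCLUDE (' + id.included_columns + ')', '')
        AS create_statement
FROM sys.dm_db_missing_index_groups g
JOIN sys.dm_db_missing_index_group_stats gs
    ON gs.group_handle = g.index_group_handle
JOIN sys.dm_db_missing_index_details id
    ON id.index_handle = g.index_handle
WHERE id.database_id = DB_ID()
ORDER BY improvement_measure DESC;
```

    Note the DMV contents reset when SQL Server restarts, so run this after the server has been under normal workload for a while.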

    Rebuild the Indexes on your Sage 300 company databases:

    Now that you have created new indexes, you want to rebuild them, along with the existing old indexes, to make sure they're fully up to date and optimal. As with index creation, it's even more important to ensure no one is accessing the database you're rebuilding, as the rebuild process takes the indexes offline, potentially causing slow performance and locks on records that a user and the rebuild process try to access at the same time. Here is the script to run on the database (test first) where you just created the new indexes:
    Rebuild Indexes MSSQL Script
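    Again, the linked script isn't included in the thread; a minimal sketch of an all-tables index rebuild, assuming you want a simple offline REBUILD during a maintenance window, might look like this. It is not the exact script from the link.

```sql
-- Sketch only: rebuilds every index on every user table in the current
-- database. Run during a maintenance window, as offline rebuilds lock
-- the tables they touch.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql
    + N'ALTER INDEX ALL ON '
    + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name)
    + N' REBUILD;' + CHAR(13)
FROM sys.tables t
JOIN sys.schemas s ON s.schema_id = t.schema_id;

EXEC sp_executesql @sql;

-- Refresh statistics afterwards so the optimizer sees the rebuilt state.
EXEC sp_updatestats;
```

    On large databases, consider rebuilding one table per night rather than all at once, in line with the one-night-at-a-time approach suggested above.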

    Good luck, and make sure you report back with the edition and Product Update you're running so we can advise on the best migration process to reduce downtime. Also, again, I can't stress this enough: please work on this project in conjunction with a Sage 300 Consultant or Business Partner and a DBA (unless you're already one). Good luck...Tim.

  • In reply to Tim - Accsys Consulting

    Wait... am I reading this correctly? OP wants to split the current single box running both the database and Sage 300 into dedicated boxes.

    You 100% can have multiple Sage300 application servers hitting the same db. It's one of the three configuration options in the installation guide. "Client-Server Network: In a client-server configuration, Sage 300 programs and databases are on one or more dedicated servers." - page 9.

    Database must be on a single server. Shared data folder must be on a single server. Sage 300 Program Files can be on multiple servers.

    Install the Sage 300 program files and System Manager on the other box; during the install, set the Shared Data path as a UNC path to the current Shared Data folder, then press Install.

    See Administration / Installation and Admin guide here: https://cdn.na.sage.com/docs/en/customer/300erp/Documentation.htm

    The biggest gotcha is making sure both servers have the same Sage 300 program file versions and patches. Also watch the location of reports if you're not using customised directories pointing to a single path.

  • In reply to SergeB

    Hi, nope, OP wants to run two SITE folders, hence his comment "it's a transitional process of a 24/hr client. the backup server isn't licensed but will allow more of a handover takeover unlike shutdown and migrate. they don't have the downtime." His all-in-one SQL/app server becomes a separate app server plus SQL server, but to avoid downtime he wants both in production briefly so he can move users from one to the other without an outage. Well, at least that was my interpretation, hence I asked for more info. With additional info I was going to suggest he do a Workstation Setup on the second/new machine, make sure it works, then help him get the programs over, basically ending up exactly where you landed with the doco you referenced, as I didn't expect him to have a win with the doco without extensive Sage 300 admin experience.

    However, I then realised the problem is more likely database related than database server related, as the Sage 300 programs on the app server really don't create much overhead or resource utilisation, because all the execution takes place on the end users' RDS sessions or individual workstations. I thought, "he will go to all the effort of introducing a second application server and then find the company databases are still slow." So before OP went to all that trouble, I wanted to give him some SQL optimisation tips, as he may find he can fix SQL performance and eliminate the root cause without significant server changes. Many times I've had clients with slow bank recs and the like believe they need to provision additional resources, only for some indexing and index rebuilds to make the bank rec 5 to 10x faster. Worse still is when a lot of money is spent on additional hardware or cloud capacity only for it not to help, because it was an optimisation issue in the first place. Of course I could be totally wrong and was hoping additional site information would help clarify. So that's the background behind where I was heading with this case.

  • In reply to SergeB

    Hi,

    thank you SergeB. You are both right to some extent.

    The scenario is a migration from one server to the other, but without downtime.

    So my thought process was to stand up a separate server and start migrating people across without downtime. In the end the current Sage box becomes only the database server, while I have a new server altogether. I was thinking of having both the programs and the shared data on the new box, then linking the workstations via ODBC to the current box.

    In the end I will have an application server and a database server on different boxes, as I have huge memory consumption that's making the server freeze and hang. I will then look into Tim's scripts for handling the database slowness and indexing, as I have other third-party applications which I feel are the biggest contributors to the memory-spiking issue; Sage wouldn't be that resource-hungry for the basic Sage modules.