Two servers connecting to one database

What is the worst known thing that can happen when you connect two servers to a single database?

It's a transition phase: one server carries the heavy users and the other handles lesser tasks, though its workload is increasing here and there.

The end goal is to separate the application server from the database server.

Parents
  • Hi, if you connect two Sage 300 application server "sites" to the same SQL server and users attempt to concurrently access a company database, the best thing that can happen is a database lock preventing users from one, or eventually both, sites from logging into that company database. The worst thing that can happen is horrific database corruption. With pre-Enhanced Security sites you're likely to get locks first; at a post-Enhanced Security site, heaven knows what will happen - it could be a complete mess. It's also a violation of site licensing rules.

    There should never be a need to do this - what is the business case? If you're operating an RDS environment, simply deploy the Workstation Setup components on the second server, NOT the main server installer components. This will spread out the load without risking all the problems of having two server sites.

    Please provide detail on your server infrastructure layout, the problems you're experiencing, and what you want to improve (e.g. migrating to a more powerful server), and we'll do our best to help put a safe technical procedure together. Also, please try to get in touch with your Sage 300 Business Partner, as they're generally your #1 contact for issues like these. Good luck!...Tim.

  • 0 in reply to Tim - Accsys Consulting

    Hi, thank you.

    It's a transitional process for a 24-hour client. The backup server isn't licensed, but it will allow more of a handover/takeover approach, unlike shutting down and migrating. They don't have the downtime.

    The main issue emanated from high SQL memory consumption caused by other third-party products. The best option is to separate the database server from the application server. By bringing in another server and migrating the users and everything else, the current server becomes the database server: Sage will be uninstalled from it so that it houses only SQL and the databases.

    But your advice is highly noted.

Children
  • 0 in reply to Mr_Mayor

    Hi, thank you for providing this information. It's all very "doable" depending on the version of Sage 300 you're using. Can you please let me know which version of Sage 300 you're running, and which Product Update? The answer will differ depending on whether you're using an Enhanced Security version or a classic ISAM version.

    Meanwhile, remember that using the Simple Recovery model, in conjunction with performing index creation and index rebuild maintenance on your Sage 300 SQL server (if not run previously), can literally slash the overhead and boost performance by 60-70% or more. While planning the migration, organise a short maintenance outage, make the following changes, and run the following scripts on the largest test company database first, followed by the largest live company database if the test is successful, one night at a time. After a few nights you may see a significant performance boost.

    I do not know your individual site circumstances, complexity and data structures, so I cannot be sure an adverse effect won't occur. Therefore, please consult with your Microsoft SQL DBA and your Sage 300 Consultant and/or Business Partner before executing the following scripts to potentially boost performance:

    Consider changing to the Simple Recovery model:

    If you have a high-transaction-volume Sage 300 database, a significant amount of memory and processor overhead is taken up by transaction recovery logging when Database Properties -> Options -> Recovery Model is set to Full. Provided you have a proper SQL backup routine in place - one that has been tested and confirmed to restore successfully, and that takes backups often enough to meet the company's minimum data loss requirements - you can change the Recovery Model to Simple to stop the logging overhead. You can then shrink the log file to clear it away.

    Recovering data using the SQL transaction logs in Sage 300 is rare and complicated (in my experience); you should instead depend on a proper Full + Differential or Full + Incremental backup policy that lets you restore to an acceptable point in time should something go wrong. This backup process should be verified regularly as working. You should never need to depend on SQL transaction logs in Sage 300 for any sort of roll-back. Of course, you should also be sending backups offsite to cloud and physical media.

    In summary, in conjunction with your SQL DBA and Sage advisory team, you could try changing the recovery model of your Sage 300 databases to Simple.
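    As an illustration only (not part of the original post), the switch itself is just a couple of statements in a Management Studio query window. This is a minimal sketch: SAMLTD (the Sage 300 sample company) and its log file name are placeholder assumptions, so confirm your real database and log file names before running anything, and do it with your DBA.

        -- Minimal sketch: switch one Sage 300 company database to the
        -- Simple recovery model, then shrink the now-inactive log file.
        -- SAMLTD / SAMLTD_log are placeholder names; confirm yours with:
        --   SELECT name, recovery_model_desc FROM sys.databases;
        --   SELECT name, type_desc FROM sys.database_files;  -- run inside the DB
        ALTER DATABASE SAMLTD SET RECOVERY SIMPLE;
        USE SAMLTD;
        DBCC SHRINKFILE (SAMLTD_log, 64);  -- shrink the log to ~64 MB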

    Perform Indexing on your Sage 300 company databases:

    You can execute the following script on your Sage 300 databases (start with a test database the first night) to suggest and create additional indexes. This process will generally identify tables where read speeds can be increased, with some trade-off in slower bulk insert, update and delete speeds (operations Sage 300 doesn't really perform in normal use). Indexes are commonly identified in high-volume areas like the Bank Reconciliation tables, the AP Invoice tables and the GL Posting tables. If you frequently perform large data inserts/updates through file imports via the Sage 300 interface or API (you shouldn't be injecting records directly through SQL), then make sure you test the performance of those inserts/updates after creating the indexes on your test database first.

    Run the following script in a query window against your first test database, then copy the CREATE statements that appear in the results, paste them back into a new query window for the same database, and execute them to create the newly identified indexes:
    Suggest Indexes MSSQL Script
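    The linked script isn't reproduced here. As a rough illustration only - a sketch of the same idea, not the linked script - suggestion scripts of this kind are typically built on SQL Server's missing-index DMVs, along these lines:

        -- Sketch: list missing-index suggestions for the current database
        -- and generate candidate CREATE INDEX statements from them.
        -- Review every suggestion before executing it.
        SELECT TOP 20
            N'CREATE INDEX [IX_' + OBJECT_NAME(d.object_id) + N'_'
                + CONVERT(nvarchar(20), d.index_handle) + N'] ON ' + d.statement
                + N' (' + ISNULL(d.equality_columns, N'')
                + CASE WHEN d.equality_columns IS NOT NULL
                            AND d.inequality_columns IS NOT NULL
                       THEN N', ' ELSE N'' END
                + ISNULL(d.inequality_columns, N'') + N')'
                + ISNULL(N' INCLUDE (' + d.included_columns + N')', N'')
                AS create_index_statement,
            s.user_seeks,          -- how often this index would have helped
            s.avg_user_impact      -- estimated % query cost reduction
        FROM sys.dm_db_missing_index_details AS d
        JOIN sys.dm_db_missing_index_groups AS g
            ON d.index_handle = g.index_handle
        JOIN sys.dm_db_missing_index_group_stats AS s
            ON g.index_group_handle = s.group_handle
        WHERE d.database_id = DB_ID()
        ORDER BY s.avg_user_impact * s.user_seeks DESC;

    Each row's create_index_statement is a candidate index; user_seeks and avg_user_impact help you prioritise which ones are worth creating.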

    Rebuild the Indexes on your Sage 300 company databases:

    Now that you have created new indexes, you want to rebuild them, along with the existing older indexes, to make sure they're fully up to date and optimal. As with index creation, it's even more important to ensure no one is accessing the database during the rebuild, as the rebuild process takes the indexes offline, potentially causing slow performance and locks on records that a user and the rebuild process try to access at the same time. Here is the script you want to run on the database (test first) that you just created the new indexes on:
    Rebuild Indexes MSSQL Script
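    Again, the linked script isn't reproduced here. As a minimal sketch of the same pattern (an assumption about its shape, not the linked script), you can generate and execute an ALTER INDEX ... REBUILD statement for every user table in the current database:

        -- Sketch: offline-rebuild all indexes on every user table in the
        -- current database. Run only during a maintenance window.
        DECLARE @sql nvarchar(max) = N'';
        SELECT @sql += N'ALTER INDEX ALL ON ' + QUOTENAME(s.name) + N'.'
                     + QUOTENAME(t.name) + N' REBUILD;' + CHAR(13)
        FROM sys.tables AS t
        JOIN sys.schemas AS s ON s.schema_id = t.schema_id;
        EXEC sys.sp_executesql @sql;

    On Enterprise Edition you could add WITH (ONLINE = ON) to keep tables available during the rebuild, at the cost of a slower rebuild.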

    Good luck, and make sure you report back with the edition and Product Update you're running so we can advise on the best migration process to reduce the downtime. Also, again, I can't stress this enough: please work on this project in conjunction with a Sage 300 Consultant or Business Partner and a DBA (unless you're already one). Good luck...Tim.