ODBC Error when attempting DELETE query

I am trying to automate archiving of the log files generated by Sage. We generate an average of 15,000 log entries a day, and the Log Viewer in Sage is so slow it is nearly unusable after a month. I am querying the data older than seven days and then inserting it into our SQL Server. Using the same connection string I can query data from the QLM master log, but when I try to delete that data I get an error.
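The archive-then-delete flow described above can be sketched roughly as follows. This is a minimal sketch using pyodbc; the DSN names, table name, and timestamp column are assumptions for illustration only (the real QLM log schema may differ), and the final DELETE is the statement that fails with the 42000 "Access denied" error.

```python
# Sketch of the archive-then-delete process. Table/column/DSN names are
# placeholders, NOT the real QLM schema.
from datetime import datetime, timedelta

ARCHIVE_DAYS = 7

def cutoff_date(days=ARCHIVE_DAYS, now=None):
    """Timestamp before which log rows should be archived."""
    now = now or datetime.now()
    return now - timedelta(days=days)

def build_statements(table="MASTER_LOG", ts_col="LogDate"):
    """Build the parameterized SELECT and DELETE run against the QLM log.
    The table and column names here are hypothetical."""
    select_sql = f"SELECT * FROM {table} WHERE {ts_col} < ?"
    delete_sql = f"DELETE FROM {table} WHERE {ts_col} < ?"
    return select_sql, delete_sql

def archive_and_delete():
    """Copy old rows to SQL Server, then delete them from the QLM log."""
    import pyodbc  # assumed available
    cut = cutoff_date()
    select_sql, delete_sql = build_statements()
    src = pyodbc.connect("DSN=TimberlineData")    # hypothetical DSN
    dst = pyodbc.connect("DSN=ArchiveSqlServer")  # hypothetical DSN
    rows = src.cursor().execute(select_sql, cut).fetchall()
    # ... insert `rows` into the SQL Server archive table via `dst` ...
    # The DELETE below is where ERROR [42000] "Access denied" is raised:
    src.cursor().execute(delete_sql, cut)
    src.commit()
```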

With my Sage user service account:

ERROR [42000] [Simba][SimbaEngine ODBC Driver][DRM File Library]Access denied.

With my personal Sage user account, I was initially failing when opening a second connection, with this error (still not sure why, but not directly relevant):

ERROR [08001] [Simba][SimbaEngine ODBC Driver][DRM File Library]Invalid account name.
ERROR [IM006] [Microsoft][ODBC Driver Manager] Driver's SQLSetConnectAttr failed
ERROR [01000] [Microsoft][ODBC Driver Manager] The driver doesn't support the version of ODBC behavior that the application requested (see SQLSetEnvAttr).

Once I adjusted my code to no longer attempt opening a second connection, I get the same error:

ERROR [42000] [Simba][SimbaEngine ODBC Driver][DRM File Library]Access denied.

Any help would be appreciated. Specifically, what access does the Sage account require to run my DELETE command?

  • In reply to Jason A. Smith:
    I had been working with Jared on this, but we have not achieved our desired solution; we never found a way to delete the data over an ODBC connection.

    Although work on the solution has been put on hold, we did still have a plan. First we would copy the existing log data into a backup in case it is ever needed, then rename the flat files that hold the data the Log Viewer displays. Support was able to direct us to and verify the exact files, but was not willing to work with us on a solution. They also let us know that the log file would be automatically recreated if moved or deleted (we have not tested this, but I expect the information is correct).

    We would then run the process on a schedule, either kicking off nightly when the Log Viewer exceeds x entries or automatically every few days.

    If I remember correctly, these would have been the files and directory:
    \\[servername]\Timberline Office\9.5\ACCOUNTING\Global\Master.QLM
    \\[servername]\Timberline Office\9.5\ACCOUNTING\Global\PVData\master_QLM
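The rotate-by-rename plan above could be scripted along these lines and run from a scheduled task. This is only a sketch under stated assumptions: the UNC paths mirror the ones support identified (with [servername] left as a placeholder), the date-stamped backup naming is my own invention, and the recreate-on-delete behavior is, as noted, untested.

```python
# Sketch of the backup-then-rename log rotation. Paths and naming are
# assumptions; Sage is said (untested) to recreate the files once removed.
import shutil
from datetime import datetime
from pathlib import Path

GLOBAL_DIR = Path(r"\\[servername]\Timberline Office\9.5\ACCOUNTING\Global")
LOG_FILES = [GLOBAL_DIR / "Master.QLM", GLOBAL_DIR / "PVData" / "master_QLM"]

def rotate_name(path, stamp=None):
    """Date-stamped backup name, e.g. Master.QLM.20240101.bak."""
    stamp = stamp or datetime.now().strftime("%Y%m%d")
    return path.with_name(f"{path.name}.{stamp}.bak")

def rotate_logs(files=LOG_FILES):
    """Copy each log file aside, then remove the original so Sage
    can recreate it on next use (per support, untested)."""
    for f in files:
        if f.exists():
            shutil.copy2(f, rotate_name(f))  # backup first
            f.unlink()                       # then remove the original
```

A Windows Task Scheduler job pointing at this script would cover the nightly or every-few-days trigger; the entry-count check would need a separate ODBC query first.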