Slow Database Access Speed When the Inventory Database Is Large

1) Change Cache Memory Configuration on the Server
By reducing the L2 cache (Max Microkernel Memory Usage) from 60% to 0% (disabling it) and increasing the L1 cache (Cache Size) from 256MB to 1.5GB, the PSQL v13.31.014 database engine should now be able to cache just about all of the critical data, reducing the need to load data from disk.  For "most" databases, roughly 60% of the data file size is real data that needs to be cached.  (Some applications have more or less, of course.)  Because of this, the 1.5GB cache may be enough to cover about 2.7GB of data files.  As previously indicated, please monitor the server's "Virtual Bytes" for the W3DBSMGR.EXE process.  If you find this approaching the 4GB maximum allowable address space, then you should consider rebooting the server as soon as possible to release that memory.  If you don't see it going much over 2GB for an extended period of normal use, then you CAN consider increasing the L1 cache size further, if needed.
Note: The server-class PSQL v13 engine does come with PerfMon memory counters that let you monitor cache usage more accurately, but these are not available on the Workgroup Engine.  Instead, you have to compare the actual memory used to the configured cache size to estimate how much of the cache is actually loaded with data.
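As a rough way to automate the "Virtual Bytes" check, here is a minimal sketch in Python, assuming the psutil package is available on the server (the W3DBSMGR.EXE process name comes from the engine described above; psutil's vms figure only approximates PerfMon's "Virtual Bytes" counter):

    import psutil

    # Rough watchdog for the Workgroup Engine's address-space usage.
    LIMIT = 4 * 1024**3          # 32-bit process address-space ceiling (~4GB)
    WARN_AT = int(LIMIT * 0.75)  # warn at ~3GB, leaving time to plan a reboot

    for proc in psutil.process_iter(["name", "memory_info"]):
        if (proc.info["name"] or "").lower() == "w3dbsmgr.exe":
            vbytes = proc.info["memory_info"].vms  # approximates "Virtual Bytes"
            print(f"W3DBSMGR.EXE virtual memory: {vbytes / 1024**3:.2f} GB")
            if vbytes > WARN_AT:
                print("WARNING: approaching the 4GB limit; plan a reboot soon.")

Scheduling something like this to run periodically gives you the same early warning without opening PerfMon by hand.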

2) Change Cache Memory Configuration on the Workstation(s)
The Client Cache Engine (CCE) on each workstation can also benefit from a cache increase.  Increasing the Cache Size from 256MB to 512MB may be plenty to get the system working more efficiently.  Again, there is no easy way to monitor the amount of CCE cache memory actually in use, but you can estimate it from the Task Manager data.  If you find that the cache never gets filled, even after a full day of Sage 50 operations, then you can probably scale this down a bit, too.
Note: Remember that each CCE has to be tracked by the engine on the server.  The more data you have in cache on workstations, the more memory the server's engine will need to track it.  If you ramp up the workstations, you should continue to monitor the Virtual Bytes on the server.

3) Rebuilding Data Files
Rebuilding the files was made possible by copying the RBLDCLI and RBLDGUI tools from a full PSQL v13 installation to the server, and this process certainly saved a ton of disk space.  (I didn't note the actual savings, but one third is typical.)  This space was lost to fragmented index structures and to the free space (called shadow pages) needed to handle database updates, so the growth itself is normal.  The rebuild process shrinks each file down to the smallest possible size by eliminating this wasted space, and it may shrink your database enough that Sage Support doesn't complain too much about helping you.  You can consider doing a periodic rebuild to save space if needed, though doing this more than once a month may not be beneficial.
Note: While PSQL v13 has an Online Database Defragmenter that can shrink files while they are in use, I do not know for sure what is needed to graft the Defrag utility onto the server.  It is a Java-based function and would require a much more complicated setup, so it may simply not work.  If you did want to play with it, you could also try grafting the DBDEFRAG tool onto the server and just using the command-line version.
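If you do settle into a periodic rebuild, a small script can walk the data folder and run the Rebuild CLI against each file.  This is a sketch only, assuming Python is available, rbldcli.exe is on the PATH, and the company data lives in the (hypothetical) folder below; run it only when no users are connected, since the rebuild needs exclusive access to each file:

    import subprocess
    from pathlib import Path

    # Hypothetical company data folder; adjust to your installation.
    DATA_DIR = Path(r"C:\Sage\Company")

    # Rebuild each Btrieve data file in place with the grafted Rebuild CLI.
    for dat in sorted(DATA_DIR.glob("*.DAT")):
        print(f"Rebuilding {dat.name} ...")
        result = subprocess.run(["rbldcli", str(dat)],
                                capture_output=True, text=True)
        if result.returncode != 0:
            print(f"  FAILED: {result.stdout or result.stderr}")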

4) Pre-Caching Files
You can improve performance immediately after a reboot by pre-caching files in the server's memory.  This is done through the BUTIL -CACHE function on the server.  (BUTIL was also grafted into the server-side install from the full PSQL Client.)  In your version, this must be done one file at a time, but a simple script (sketched below) can restore quick access shortly after a server reboot.
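A simple batch file would do; for consistency with the other sketches, here is the same idea in Python (the data folder is hypothetical, and butil.exe is assumed to be on the PATH):

    import subprocess
    from pathlib import Path

    # Hypothetical company data folder; adjust to your installation.
    DATA_DIR = Path(r"C:\Sage\Company")

    # BUTIL -CACHE takes one file per invocation in this version, so loop
    # over every data file to pre-load it into the server engine's L1 cache.
    for dat in sorted(DATA_DIR.glob("*.DAT")):
        subprocess.run(["butil", "-cache", str(dat)], check=False)

Pointing a startup task at this script warms the cache before the first user logs in.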

5) Using the old v11 Monitor
Like the other tools, we grafted the old Monitor into the server and workstation installations to get some hints as to what is going on behind the scenes.  You can use this to see who is connected to the server and to kick users out as needed.  Note, though, that this is a v11 Monitor.  You may find it better to use the PSQL v13 Monitor from the DC server.  That tool is accessed through the PCC/Java environment and may be easier to use.  It also supports sorting on the various columns, so you can easily keep tabs on which user sessions are generating the most data.

6) Slowness in starting up the Manage Inventory Screen
When initially starting up the Manage Inventory screen, the application pre-loads every single part number from the inventory database LINEITEM.DAT, reading each and every key value through a GetNext+Key loop.  (This is a very efficient method, which reads JUST the key values, not the entire record.)  As your current system has 49,861 records in this file, it takes some time to complete this process.  Luckily, it seems to run in the background on startup, so the overall penalty isn't too bad here.  Since Sage 50 enables the CCE by default, this data is pulled from the server the first time, but all subsequent re-loads are handled by the CCE (accessing local memory), which is why we don't see much in the way of database accesses when this screen is loading.  On the flip side, the application STILL has to make some 49,000 database engine calls, which takes quite a bit of time (~30 seconds).  Once this data is loaded, though, the lookup seems to work just fine off of the pre-loaded data.
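To make that loop concrete, here is an illustrative Python sketch.  The InMemoryTable class is a stand-in for the real Btrieve-style API (each get_first/get_next would actually be a call into the engine); all names are hypothetical:

    # Stand-in for the keyed file; the real calls go through the engine.
    class InMemoryTable:
        def __init__(self, rows):
            self._keys = sorted(rows)
            self._pos = -1

        def get_first(self, key_only=True):
            self._pos = 0
            return self._keys[0] if self._keys else None

        def get_next(self, key_only=True):
            self._pos += 1
            return self._keys[self._pos] if self._pos < len(self._keys) else None

    def load_part_numbers(table):
        # One engine call per record: ~49,000 calls for this LINEITEM file,
        # even though only the key values (not full records) come back.
        keys = []
        key = table.get_first(key_only=True)
        while key is not None:
            keys.append(key)
            key = table.get_next(key_only=True)
        return keys

    lineitem = InMemoryTable(["7918-A", "7918-B", "8000-X"])
    print(load_part_numbers(lineitem))  # ['7918-A', '7918-B', '8000-X']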

7) Auto Field Completion
We did find out that this setting in Sage 50 is definitely related to the slowness in the Quote screen, and this points to one of the potential logic errors in the version of Sage 50 2021 that you are running, namely v28.1.01.0176.  To understand this issue, you need to understand what is going on behind the scenes and how auto-completion can work.  Developers really have two possible ways to code something like this, each with its own advantages and disadvantages (both are sketched in code after these descriptions):
A) Re-read the needed database records on each keypress.  This solution works by waiting for the user's first keypress, then passing the characters entered thus far to the database to look up the records starting with this character string.  This process loops, with each successive keypress re-querying the database.  So, to look up a part with the number "7918", the first keypress loads all records starting at "7", the second re-loads all records starting at "79", the third re-loads all records starting at "791", and the fourth reloads again with "7918".  As you can imagine, reloading ALL of the records (to end of file) means that record searches for "A" will take far longer than record searches for "Z".  The developer can minimize the data load time by stopping when the next record cannot match -- for example, when the user enters "7", the first part beginning with an "8" can stop the search.  However, this can be unnerving for users because they don't see ALL parts, and scrolling outside of the entered range is not possible.  The primary advantage is that the application doesn't need to pre-read anything -- it is all done on the fly.  The disadvantage depends on the limiting algorithm used -- such a looping process may read and re-read the same records many times over, increasing the workload senselessly.

B) Read all records into an array in memory.  Unlike the above, this solution works by pre-loading all records from the database into an array in memory (as described in #6 above).  This may be a static array, which can be hard-coded to a specific size, or it can be a dynamic array which is built on the fly.  (It can also be a static array that gets re-dimensioned at runtime.)  The primary advantage here is that the data is read one time, and then the entire array can be presented in the search box.  The primary disadvantage is that the entire data set needs to be loaded prior to displaying the dialog box, which can cause delays in loading the screen.  Secondary issues can arise from memory allocation -- if insufficient space is available (either in the static array or in the application's process space for a dynamic array), then the data cannot be displayed properly and the dialog box cannot be used at all.

(I am ignoring the hybrid approaches, such as waiting for the first keypress and then loading all of the matching records into an array.)
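Here is a minimal sketch of the two approaches in Python, using a plain list in place of the real database and UI (all names are illustrative, not Sage's actual code):

    import bisect

    PARTS = sorted(["7400", "7918", "79Z0", "8000", "A100"])  # stand-in for LINEITEM keys

    def approach_a(prefix):
        # Approach A: re-query on every keypress, with the "limiting
        # algorithm" that stops at the first key that cannot match.
        start = bisect.bisect_left(PARTS, prefix)  # the database's keyed seek
        matches = []
        for key in PARTS[start:]:                  # sequential GetNext reads
            if not key.startswith(prefix):
                break                              # stop once keys can't match
            matches.append(key)
        return matches

    class ApproachB:
        # Approach B: load everything once, then filter the in-memory array.
        def __init__(self):
            self.cache = list(PARTS)  # one up-front full read (startup delay)

        def lookup(self, prefix):
            return [k for k in self.cache if k.startswith(prefix)]  # no engine calls

    b = ApproachB()  # the one-time load
    for typed in ("7", "79", "791", "7918"):
        print(typed, "A:", approach_a(typed), "B:", b.lookup(typed))

Typing "7918" costs four database scans under Approach A but only one up-front read under Approach B; that is exactly the trade-off described above.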

At this time, we surmise that Sage is using the latter (array) approach, since there is an extensive start-up process that reads some 49,000 rows from the LINEITEM table when the Manage Inventory screen loads.  What is not clear, however, is whether the array size is static/fixed or dynamic in nature.  If the size is static, or if it is dynamic within a set range, then we need to ask Sage for the absolute maximum array size supported by the Sage 50 application.  An Internet search has not turned up any specific limits, though I have seen references to 3000, 100000, and 33000 on various pages.  (Certainly, 32768 would be a reasonable maximum for a computer-based data set, as would 65535.  Splitting the difference is 48K, or 49152, only slightly less than the current record count.)  It should be noted that the PSQL v13 database engine has limits of its own, but they are far away: up to 4 billion records per file, and file sizes up to 64GB in the older file format (up to 64TB with code changes).  To better understand the issues here, we need to ask Sage for the exact maximum supported LINEITEM record count.  (As discussed in the Rebuild section above, data size, which Sage keys off of for maximum size, is irrelevant here -- the record count is actually the important component.)
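A quick sanity check of the current record count against those candidate limits (the limits themselves are only guesses gathered from the searches above):

    RECORDS = 49861  # current LINEITEM record count

    for limit in (3000, 32768, 33000, 48 * 1024, 65535, 100000):
        status = "EXCEEDED" if RECORDS > limit else "ok"
        print(f"limit {limit:>6}: {status}")
    # 48 * 1024 == 49152, just below the current count of 49,861.

If the real limit turns out to be 49152 (or anything else below 49,861), that alone could explain an array-allocation failure.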

8) Slowness in starting up the Quote Entry Screen
As with the Manage Inventory screen, the Quote Entry screen also seems to pre-read all of the inventory part numbers into an internal array when Auto Field Completion is enabled.  By itself, this is quite understandable and normal.  However, there is a critical bug in the logic of this module.  After the LINEITEM part numbers are read for the first keypress, the ENTIRE data set is read again for the next keypress, and again for the one after that, and so on.  Essentially, this combines the worst features of both approaches described above.  On a system with 49,861 records, this process can easily take up to 30 seconds for each keypress, rendering the screen unusable.  Disabling the Auto Completion option certainly eliminates this issue, but it also forces the user to enter the entire part number or to use the search screen (which still reads EVERY data record instead of just those matching the entered text string) and then select the part number he or she needs.  For environments with short or easy-to-remember part numbers, or with relatively few parts, this is not much of an issue, but for a complex environment with this part quantity, it becomes untenable.

My best guess as to the slowness of this field is that it is a simple coding error.  While it may be acceptable for the first keypress to trigger a data load, all subsequent keypresses should leverage the data already loaded.  It may be that a flag (which would tell the search window that the data is already loaded) is never set properly, so the process must reload (and reload and reload) on subsequent keys; a sketch of this suspected bug follows.  Another possibility does exist, though: this module could use a smaller array than the one supported by the Manage Inventory screen, forcing the array to be disposed of and reloaded each time.  As we don't know which is the case, we would again need to defer to Sage Support to tell us what the maximum allowable inventory record count should be, and what number is supported in each module.
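Here is what that suspected flag bug would look like, in illustrative Python (Sage's actual code is unknown; every name here is hypothetical):

    PARTS = ["7400", "7918", "79Z0", "8000"]  # stand-in for LINEITEM keys

    def read_all_from_engine():
        # Stand-in for the ~49,000 GetNext calls (~30 seconds on this system).
        print("full data set read from engine")
        return list(PARTS)

    class BuggyAutoComplete:
        def __init__(self):
            self.cache = None
            self.loaded = False  # suspect: this flag never becomes True

        def on_keypress(self, prefix):
            if not self.loaded:              # always False -> reload every time
                self.cache = read_all_from_engine()
                # BUG: "self.loaded = True" is missing (or gets reset),
                # so every keypress repeats the full 30-second load.
            return [k for k in self.cache if k.startswith(prefix)]

    ac = BuggyAutoComplete()
    for typed in ("7", "79", "791", "7918"):
        ac.on_keypress(typed)  # prints the reload message four times

One missing assignment is enough to turn a one-time 30-second load into a 30-second penalty on every keypress, which matches the observed behavior.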

9) Deleting Unused Inventory
While not something that I can address, if there is a nearby limit being exceeded (such as 49152), then it may be possible to isolate some 710 inventory records in the database that either have never been used, or were used so long ago that their transactional data is no longer useful, and remove those records.  Again, this is outside the scope of what I can advise on, as it would require someone knowledgeable about the Sage 50 application and its data structures.  However, reducing the number of inventory records will reduce the data load times AND perhaps address the array issues, if any.

Posted by Guest, Nov 10 2020