Subject Re: Updating buffer info to database file
From john wade <>
Date Fri, 10 Aug 2018 09:00:46 -0400
Newsgroups dbase.getting-started

Akshat Kapoor Wrote:

> On 10/08/2018 11:26, john wade wrote:
> > Ken Mayer Wrote:
> >
> >> On 8/9/2018 1:51 PM, john wade wrote:
> >>> Eventually was able to hold other terminal users at bay while updating a stock file by using a file with one field "Locked"  "Y/N". If the "Locked" field was "Y", the terminals were put into a loop until the "Locked" field was "N". The terminal updating the "stock" database changed the "Locked" field from "Y" to "N" when update was complete.
> >>> Which command is used to write the buffered data to the database file?
> >>> In the multi-user environment, the terminal effects the change, but the file server needs to close databases, and only then are the changes made by the terminal written away.
> >>> I have tried commit(), refresh(), flush(), save(), you name it.
> >>> Any suggestions, as I cannot close databases to update the changes without errors coming up.
> >>
> >> Once the users are locked out, they shouldn't be able to do *anything*. If
> >> you want them to work with transactions, which handle the kind of
> >> buffering I think you're talking about, then that is where commit and
> >> flush come in. Take a look at transaction processing in online help.
> >> However, it may be a huge amount of work (I've never liked it myself so
> >> have avoided it). If I lock a user out of something I don't let them
> >> work with the data at all ... they have to wait until the table is no
> >> longer locked.
> >>
> >> Ken
> >>
> >>
> >> --
> >> *Ken Mayer*
> >> Ken's dBASE Page:
> >> The dUFLP:
> >> dBASE Books:
> >> dBASE Tutorial:
> >
> > Ken,
> >
> > The application is point of sale in particular. Stock is sold and the closing quantity is updated online, in real time. Using the lock loop, I can now process multiple items sold from the stock file while other sales points are processing their own sale transactions.
> > Running the process on the terminal, the closing quantity is updated.
> > It looks as if the data has been written away, but the file server picks up the previous closing quantity and the terminal data is thrown away. If the file server exits the process with a CLOSE DATABASES command,
> > the terminal process updates the closing qty online, in real time.
> > Opening the application on the file server again reflects that the data HAS been written away.
> >
> > The same process on two locations gives two results.
> >
> > While the file server is running the application, as if in a single-user environment, the result is correct. As soon as a terminal effects a change in the databases, the file server picks up the "old" closing qty in the database, and the data is thrown away. (This is my reason for thinking the data is sitting in a buffer somewhere.)
> > Is this because the data is being buffered, and what in your opinion is the best way to write the buffered data to the database online, in real time?
> > If you need a copy of the "test" application, let me know and I will forward to you. Thanks Ken
> > John
> >
> I had suffered this issue earlier and finally traced the fault to the BDE
> settings.
> Open the BDE Administrator. Under the Configuration tab, under System |
> INIT, you will find a setting named LOCAL SHARE.
> Set this to TRUE. Your BDE settings should be similar to the ones in the
> attached screenshot. LOCAL SHARE is the most important one for you to
> check. The rest you may or may not alter as per your requirements.
> Regards
> Akshat
If one looks at the desktop property settings, there is a multi-user block with Lock, Exclusive, Refresh, and Reprocess options.
I am fiddling with these options when compiling, and will have a look at the results.
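For anyone following along, those desktop properties have language-level SET command equivalents, roughly like the following (a sketch only; the numeric values are examples, not recommendations):

```
&& Language-level equivalents of the desktop multi-user properties
SET EXCLUSIVE OFF     && open tables shared rather than exclusive
SET LOCK ON           && enable automatic record locking
SET REFRESH TO 5      && re-read shared data roughly every 5 seconds
SET REPROCESS TO 10   && retry a failed lock 10 times before raising an error
```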
Thanks Akshat, I will also have a look at the BDE settings and will let you
know. I am 99% there; a frustrating 1% to go.
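For the archive, a minimal sketch of the lock-flag technique described at the top of the thread, combined with FLUSH to force buffered writes to disk (table and field names here are placeholders, and this is untested):

```
&& Wait until the other terminal clears the lock flag
USE flags SHARED
DO WHILE flags->locked = "Y"
   INKEY(1)                      && pause about a second between polls
ENDDO

&& Update the stock record under a record lock, then flush buffers
USE stock SHARED
IF RLOCK()                       && lock the current record
   REPLACE qty WITH qty - 1      && e.g. one unit sold
   UNLOCK
ENDIF
FLUSH                            && write buffered changes to the file
```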