Subject: Re: Client with Dbase IV what to do
From: Russ <rfisher@ccnintl.com>
Date: Thu, 14 Oct 2021 08:35:18 -0400
Newsgroups: dbase.getting-started
Akshat Kapoor Wrote:
> I started a good 10+ years late. My first app was in foxpro for dos.
> I switched to foxpro for windows with bare minimum changes.
> And then stopped programming for approx 15+ years
Sounds a lot like me. I had my own consulting / custom programming business for a few years, then went to work for a local hospital as senior programmer and ended up as network manager when the 24/7 on-call and all the free time spent keeping up with certifications took their toll in burnout. I switched careers in 2002 to another passion of mine - woodworking - building custom furniture with a local manufacturer. A workplace injury in 2018 put me back in the I.T. chair (with the same company), maintaining our network, CAD, CAM and custom order entry (the dBASE IV system) that was (and still is) written by our VP. It has been evolving over the past 25 years and, quite frankly, it just works. (Between us, I'm not a fan of his spaghetti coding techniques, but it does the job. :) )
The modules are compiled with Borland's compiler and executed from a file server. Tables are kept to a workable size by exporting inactive records to an archive file, so the only real speed hit anyone sees is if they need to search those files without an index.
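For anyone who hasn't seen that pattern, the archiving job is only a few lines of dBASE. (The table and field names below are made up for illustration and the real routine does more checking, so treat it as a sketch, not our actual code.)

  * Move inactive records from the live table into the archive.
  USE ORD_ARCH EXCLUSIVE
  APPEND FROM ORDERS FOR STATUS = "I"   && pull the inactive records into the archive
  USE ORDERS EXCLUSIVE
  DELETE FOR STATUS = "I"               && flag the same records for removal in the live table
  PACK                                  && physically remove the flagged records
  USE

The live tables stay small, and the history is still sitting in the archive (without indexes, hence the occasional slow search I mentioned).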
Yes, it's old school, but the beauty lies in its simplicity. All the desktops have shortcuts to the various applications. If an application needs to be changed, just edit it, compile it in about 30 seconds, move it to the file server and everyone instantly has those changes.
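Our build goes through Borland's compiler, but just to show how short that loop is, a dot-prompt equivalent would be roughly this (the module name and server path are invented for the example):

  * Recompile the changed module and push the object file to the server.
  COMPILE ORDENTRY.PRG                           && produces ORDENTRY.DBO alongside the source
  COPY FILE ORDENTRY.DBO TO F:\APPS\ORDENTRY.DBO

No installers, no touching the desktops - the next time someone launches the shortcut they're running the new code.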
I'm not sure if something similar could be done with db2019, but I'm sure the complexity grows exponentially.
> With 50+ machines I will recommend using 2019 with a proper backend
> database likely mysql firebird etc.
We've talked about a client-server model. While it's true that data retrieval is more efficient for very large tables, in our scenario, as I said, the table sizes are kept manageable by frequent archiving. The efficiency gained, however, is FAR overshadowed by the complexity of maintaining such a system, and in our situation it's just not warranted.
Why do you think soooo many LARGE financial institutions still run millions of lines of COBOL? It just works and the cost to rewrite is astronomical.
Cheers,
Russ