Subject: Re: complex indexes vs index key fields
From: Mervyn Bick <invalid@invalid.invalid>
Date: Thu, 30 Jul 2020 10:39:47 +0200
Newsgroups: dbase.getting-started
On 2020-07-30 08:54, Akshat Kapoor wrote:
> Just a word of caution,
> MySQL can accept 1000 rows in a single query,
> But the query is being built up as a dBASE memory variable, and it will
> have its own limits.
>
> For rows of small size it will not matter, but what if rows get large,
> e.g. 4k for each row?
4k was the limit for a record in a level 4 .dbf file. The limit is now
32767 bytes. That's a seriously BIG record. :-)
The only limit on a dBASE memory variable is available memory. With
nothing else going on in dBASE, memory(9) returns 2097151 on my computer.
I've just loaded a 1.72MB (1831711 bytes) text file into a memory
variable as a test using a file object's read() method.
cRead = f.read(2000000)
It took 0.02 seconds.
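For reference, the whole test can be sketched in a few lines of dBL using the File class. This is only a sketch of what I ran; the filename here is hypothetical.

```
// Load a large text file into a memory variable and time it.
// "test.txt" is a hypothetical filename - substitute your own.
local f, cRead, nStart
f = new File()
f.open("test.txt", "R")      // open the file read-only
nStart = seconds()
cRead = f.read(2000000)      // read up to 2,000,000 bytes into one variable
? seconds() - nStart         // elapsed time in seconds
? len(cRead)                 // bytes actually read
f.close()
```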
The longest INSERT statement to insert 1000 rows using Gaetano's test
data was just over 73000 bytes. Neither dBASE nor MySQL will even raise
a sweat. :-)
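For anyone curious how a statement gets that long, a multi-row INSERT is just an ordinary string built up in a memory variable. A minimal sketch, with a hypothetical table and columns (not Gaetano's actual test data):

```
// Build a single INSERT covering 1000 rows in one memory variable.
// Table name "mytable" and its columns are hypothetical.
local cSql, n
cSql = "INSERT INTO mytable (id, name) VALUES "
for n = 1 to 1000
   cSql += "(" + ltrim(str(n)) + ", 'row" + ltrim(str(n)) + "')"
   cSql += iif(n < 1000, ", ", "")
next
? len(cSql)   // total length of the statement in bytes
```

The resulting string is then handed to MySQL in one go, so the only sizes that matter are available dBASE memory on one side and MySQL's max_allowed_packet on the other.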
Mervyn.