The number of connections should not matter. We personally know companies with hundreds of users simultaneously connecting to and working with cloud servers. There is no need at all to connect and disconnect every time; that is definitely not an efficient approach.
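The point above can be sketched as keeping one connection open for the whole session instead of reconnecting around every request. This is a minimal illustration only; sqlite3 stands in here for whatever cloud RDBMS the server actually runs, and the table and function names are invented for the demo.

```python
import sqlite3

def open_session(path=":memory:"):
    """Connect ONCE; the caller reuses this connection for all its work."""
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, item TEXT)")
    return conn

def add_order(conn, item):
    """Reuse the already-open connection -- no reconnect per request."""
    cur = conn.execute("INSERT INTO orders (item) VALUES (?)", (item,))
    conn.commit()
    return cur.lastrowid

conn = open_session()
# Hundreds of requests, one connection -- no connect/disconnect churn.
ids = [add_order(conn, "item-%d" % n) for n in range(100)]
conn.close()
```

The same pattern applies with any client library: the expensive step is establishing the session, so amortize it across all the work the user does.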
... good to have a SERIAL primary key. The RDBMS takes care of nextval() or currval(), so it is no extra bother. The RDBMS will safely append even hundreds of records per second. No worries. I wonder how you would "identify" a record without a UNIQUE "id"? We need to have a primary ...
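The SERIAL idea above can be shown in a few lines: the RDBMS hands out the unique "id" itself, so the application never manages it. In this sketch SQLite's INTEGER PRIMARY KEY plays the role that PostgreSQL's SERIAL / nextval() plays; the table and column names are made up for the demo.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# INTEGER PRIMARY KEY: SQLite auto-assigns the id, like SERIAL in PostgreSQL.
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")

for n in range(500):                     # append hundreds of records...
    conn.execute("INSERT INTO customer (name) VALUES (?)", ("name%d" % n,))
conn.commit()

ids = [row[0] for row in conn.execute("SELECT id FROM customer")]
assert len(ids) == len(set(ids))         # ...and every record got a UNIQUE id
```

Every inserted record is identifiable by its id, with no client-side counter logic and no locking code in the application at all.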
Reinaldo, I think that you will be left on your own, and why? The mentality here is to have everything free and open source, but nobody values the hundreds of hours that sometimes go into a product, a system, a solution for your program. I say this without intending to offend anyone, but it is what ...
... Since I'm unsure about the costs, I'm currently using an offline version. For example, at a reception desk with multiple users making requests all day, hundreds of requests could be generated, and using the API could become expensive. At the moment, users seem to expect information to be free, but that ...
... great product!! It seems that sooner or later my business will be gone; it is too extensive and complex to try to move to SQL!! In our situation there are hundreds of programmers!! We need a product like sqlrdd for the conversion to SQL, and in the future to take parts of the code and port them to SQL with optimization ...
... first fill in all the data and then call oDbf:save() once. It should be said that this routine sits inside a loop and can be called for a few, but also for many hundreds of records each time.
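The "fill all the data first, then save once" advice above amounts to batching the writes into a single commit instead of committing inside the loop. A minimal sketch, with sqlite3 standing in for the DBF workarea and all names (table, column) invented for the demo:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dbf (id INTEGER PRIMARY KEY, field1 TEXT)")

# Fill all the data first...
rows = [("value%d" % n,) for n in range(500)]   # a few, or many hundreds, of records

# ...then write it in ONE transaction -- the analogue of one Save() call,
# instead of a commit per record inside the loop.
with conn:
    conn.executemany("INSERT INTO dbf (field1) VALUES (?)", rows)

count = conn.execute("SELECT COUNT(*) FROM dbf").fetchone()[0]
```

One commit for the whole batch is dramatically cheaper than one per record, because each commit forces a flush to storage.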
... there are fields for each file), increment it by one, then save the new value. Locking applies. Again, it takes milliseconds to do this, and with hundreds of thousands of records written each year, my clients have NEVER encountered a problem. Some locations have up to 20 workstations sharing data ...
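The read-increment-save scheme described above only stays duplicate-free because of the lock. A minimal sketch of why: in the DBF world the lock would be a record lock on the counter field; here `threading.Lock` stands in for it, and the class and method names are invented for the demo.

```python
import threading

class Counter:
    """Emulates the shared 'next id' field guarded by a record lock."""
    def __init__(self):
        self._value = 0
        self._lock = threading.Lock()

    def next_id(self):
        with self._lock:          # "Locking applies."
            self._value += 1      # read the value, increment it by one...
            return self._value    # ...then hand back the new value

counter = Counter()
ids = []

def worker():
    for _ in range(1000):
        ids.append(counter.next_id())

# Roughly the "20 workstations sharing data" scenario.
threads = [threading.Thread(target=worker) for _ in range(20)]
for t in threads:
    t.start()
for t in threads:
    t.join()

assert len(set(ids)) == 20000     # no duplicate ids were ever handed out
```

Remove the `with self._lock:` and two workstations can read the same value before either saves, producing duplicate keys; with it, the increment is atomic.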
... with this? It seems it would be much easier and more efficient to write a program that provides whatever answer the client needs (without storing hundreds of thousands of rows of mostly redundant data). Looking at hundreds of thousands of rows in Excel would make my eyes bleed...
... run it from the command line, where it will print a message. So what happens if you have a bunch of different include and library directories, or hundreds of source files? As you can imagine, the command line for this would be huge. You have two choices: wrap all of these commands up in a batch ...
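To see why the command line explodes, consider assembling it by hand: one `-I` per include directory, one `-L` per library directory, one entry per source file. A small sketch (every path below is invented for the illustration):

```python
# Build the full compiler invocation for a project with many parts.
include_dirs = ["include/mod%d" % n for n in range(10)]
lib_dirs = ["lib", "lib/extra"]
sources = ["src/file%03d.c" % n for n in range(300)]   # "hundreds of source files"

cmd = (["cc"]
       + ["-I%s" % d for d in include_dirs]            # one flag per include dir
       + ["-L%s" % d for d in lib_dirs]                # one flag per library dir
       + sources                                       # every source file, spelled out
       + ["-o", "app"])

flat = " ".join(cmd)
print(len(flat))   # the flattened command line runs to thousands of characters
```

That is exactly the line you would otherwise have to retype or wrap in a batch file, which is why build tools generate and manage it for you.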
... So now, everywhere you need to use the customers database, you just do: oCustomers := TCustomers:New() Simple! That alone will probably eliminate hundreds of lines of code in your program. The variable oCustomers should be declared as LOCAL. Note that you can still pass oCustomers to a function ...
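The idea behind TCustomers:New() translates directly to other languages: wrap all the open/seek/close boilerplate in one class, so every call site shrinks to a constructor plus a method call. A Python analogue, with sqlite3 standing in for the customers table and all method names invented for the sketch:

```python
import sqlite3

class TCustomers:
    """All the database plumbing lives here, not at the call sites."""
    def __init__(self, path=":memory:"):
        self._conn = sqlite3.connect(path)
        self._conn.execute(
            "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)")

    def add(self, name):
        self._conn.execute("INSERT INTO customers (name) VALUES (?)", (name,))
        self._conn.commit()

    def find(self, name):
        row = self._conn.execute(
            "SELECT id FROM customers WHERE name = ?", (name,)).fetchone()
        return row[0] if row else None

    def close(self):
        self._conn.close()

# Every call site is now just this -- the analogue of oCustomers := TCustomers:New()
oCustomers = TCustomers()
oCustomers.add("Acme Ltd")
cid = oCustomers.find("Acme Ltd")   # -> 1, the id the database assigned
oCustomers.close()
```

As in the original advice, the object can be passed to any function that needs the customers data, and none of those functions repeats the open/lookup boilerplate.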