Re: HTTPAPI performance problems
Hello Radu,
I don't know of any reason why HTTPAPI would not perform well when run in
parallel in multiple jobs. I would definitely use a separate debug log
for each job (http_debug has a second parameter to specify the filename;
use a different name for each job.) This will not affect performance, but
multiple jobs writing to the same log file may make it unreadable.
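For instance, you can make the log name unique by appending the job
number from the program status data structure. A sketch in free-format
RPG -- the /tmp path is just an illustration, and the copybook name may
differ on your installation:

```rpgle
**free
ctl-opt dftactgrp(*no) bnddir('HTTPAPI');

/copy HTTPAPI_H

dcl-ds pgmStat psds qualified;
  jobNbr char(6) pos(264);    // job number from the PSDS
end-ds;

dcl-s logFile varchar(200);

// give each job its own debug log, e.g. /tmp/httpapi_123456.txt
logFile = '/tmp/httpapi_' + pgmStat.jobNbr + '.txt';
http_debug(*ON : logFile);
```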
I can tell you that a web browser on a PC will normally limit the
number of simultaneous HTTP requests. Typically, it will allow up to 5
requests to be made simultaneously to a server; if there are more
than that, it will queue them so that there are never more than 5
simultaneous requests. I do not know the research that went into this
or their findings, but there's a good chance that they studied it and
determined that more than about 5 at a time resulted in worse performance.
Another thing that you can try if you choose to run them in series
(rather than parallel) is using the http_persist_xxx routines. These
routines allow you to connect to the HTTP server once and submit
multiple requests without disconnecting. This can greatly improve
performance if the network connections/disconnections are a significant
part of the time things are taking. I don't know if this helps you or
not, but it might be worth coding up a test and trying it out.
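The connect-once pattern looks roughly like this. The server URL and
IFS paths are placeholders, and the exact http_persist_post() parameter
list varies between HTTPAPI versions, so check the prototypes in
HTTPAPI_H on your system before compiling:

```rpgle
**free
ctl-opt dftactgrp(*no) bnddir('HTTPAPI');

/copy HTTPAPI_H

dcl-s comm pointer;
dcl-s rc int(10);
dcl-s i int(10);

// connect to the server once...
comm = http_persist_open('http://example.com');

// ...send all 10 requests over the same connection...
for i = 1 to 10;
  rc = http_persist_post( comm
                        : '/labelservice'
                        : '/tmp/response' + %char(i) + '.xml'
                        : '/tmp/request' + %char(i) + '.xml' );
  if rc <> 1;
    // http_error() retrieves the failure message
  endif;
endfor;

// ...and disconnect once at the end
http_persist_close(comm);
```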
Good luck, and let us know what you discover.
-SK
On 9/15/2015 2:54 AM, Radu Botescu wrote:
Hello, I've implemented HTTPAPI successfully, but I have some
performance problems when making many calls in parallel.
Let me explain, and sorry if it is long...
I call an external web service in order to get a shipping label. One
label per parcel.
So if I have 10 parcels on a pallet, I need to call the same
web service 10 times.
There are 2 ways to do it:
1. In sequence: parcel 1, parcel 2, ... parcel 10. At 3-4 seconds
per label, the last label is printed after 30-40 seconds.
2. In parallel. This is the current solution: I submit one batch job
per parcel/label, so I have 10 jobs running in parallel. One label is
very fast, 2-3 seconds. The longest takes 20-25 seconds. So overall it is
better, but I do have a big problem because for at least one label I wait
too long (20-25 seconds). After 2-3 labels, the processing time starts
to get longer...
How it works:
1. I call Program A to create an XML file in the IFS (I need to log
the files and keep a history...).
2. I call Program B to consume this XML file with
"http_url_post_xml". I parse the answer. I also write the label (ZPL
code) to an IFS file.
3. I write some tracking information and print the label.
The bottleneck is in step 2.
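(For reference, the step-2 call follows HTTPAPI's expat-style XML
interface, roughly as below. The URL and request body are placeholders,
and the handler parameter lists are based on the HTTPAPI example
programs -- they may differ by version, so compare against HTTPAPI_H:)

```rpgle
**free
ctl-opt dftactgrp(*no) bnddir('HTTPAPI');

/copy HTTPAPI_H

dcl-s rc int(10);
dcl-s request varchar(32767);

// placeholder request body and URL
request = '<getLabel><parcel>1</parcel></getLabel>';

rc = http_url_post_xml( 'http://example.com/labelservice'
                      : %addr(request) + 2     // skip varying-length prefix
                      : %len(request)
                      : %paddr(StartOfElement)
                      : %paddr(EndOfElement)
                      : *NULL );
if rc <> 1;
  // http_error() retrieves the failure message
endif;

dcl-proc StartOfElement;
  dcl-pi *n;
    userData pointer value;
    depth int(10) value;
    name varchar(1024) const;
    path varchar(24576) const;
    attrs pointer dim(32767) const options(*varsize);
  end-pi;
end-proc;

dcl-proc EndOfElement;
  dcl-pi *n;
    userData pointer value;
    depth int(10) value;
    name varchar(1024) const;
    path varchar(24576) const;
    value varchar(65535) const;
    attrs pointer dim(32767) const options(*varsize);
  end-pi;
  // parse the answer here, e.g. capture the ZPL label
  // data from the relevant element and write it to the IFS
end-proc;
```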
The debug is *ON. I prefer to keep all the logs for at least several
months.
I've spoken with my client, and on his side the processing time is
between 1 and 2 seconds.
I do not understand where the bottleneck is... too many IFS operations
in parallel? Is there something in HTTPAPI which forces several
parallel jobs to be processed in sequence? I do not think so...
Any idea will be much appreciated :)
Thanks,
Radu
--
R.
-----------------------------------------------------------------------
This is the FTPAPI mailing list. To unsubscribe, please go to:
http://www.scottklement.com/mailman/listinfo/ftpapi
-----------------------------------------------------------------------