mvic (Jedi; Joined: 09 Mar 2004; Posts: 2080)
Posted: Wed Oct 31, 2007 8:04 am
PeterPotkay wrote:
OK, that makes more sense. DBs are better for infrequent writes where that data is then read lots and lots of times, often chosen by multiple criteria.
But for put it once / get it once with a simple selector (like Correl ID) MQ makes more sense.
I guess the MQ-only solution also makes more sense if they already have MQ but did not buy a database.
But even if a database is not the right answer, it's still not obvious that MQ is the right answer for this requirement. With so many remote systems, the security planning and implementation in MQ would be significant. And, while admittedly this is not a show stopper, the usage scheme here is not a messaging pattern. It's one "message" per day: i.e. a data retrieval pattern.
One (potentially cleaner and simpler?) way of implementing such a data retrieval pattern is a shared disk hosted by (say) a Windows 2003 system. Each system that wants its data at the beginning of its day says:
- (I am system northwest110239x, user nw110239x)
- (Today is 20071031)
- Map network drive to M: specifying username/password for nw110239x
- Copy file M:\files\northwest110239.20071031.txt to local disk
- Un-map network drive M:
- Process local copy of file northwest110239.txt
All just IMHO, based on what I (think I) know of one requirement. My suggestion might be totally wrong for the full set of requirements.
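The steps above (minus the drive mapping itself, which would happen outside Java via "net use") can be sketched as a small Java routine. This is only an illustration of the naming convention and copy step; the system id, date format, and directory layout are all hypothetical, taken from the example names above:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

// Hypothetical sketch of the daily file-retrieval steps. Assumes the
// share is already mapped (the "net use" step is outside Java); the
// names mirror the example above (northwest110239, M:\files\, .txt).
public class DailyFileFetch {

    // Builds the dated remote file name, e.g. northwest110239.20071031.txt
    static String remoteFileName(String systemId, LocalDate day) {
        // BASIC_ISO_DATE formats as yyyyMMdd, matching 20071031 above.
        return systemId + "." + day.format(DateTimeFormatter.BASIC_ISO_DATE) + ".txt";
    }

    // Copies today's file from the mapped share to a local working copy
    // named <systemId>.txt, overwriting yesterday's copy if present.
    static Path fetch(Path shareDir, Path localDir, String systemId, LocalDate day)
            throws IOException {
        Path remote = shareDir.resolve(remoteFileName(systemId, day));
        Path local = localDir.resolve(systemId + ".txt");
        return Files.copy(remote, local, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        // Temp directories stand in for M:\files\ and the local disk.
        Path share = Files.createTempDirectory("share");
        Path local = Files.createTempDirectory("local");
        LocalDate day = LocalDate.of(2007, 10, 31);
        Files.writeString(share.resolve(remoteFileName("northwest110239", day)), "prices...");

        Path copy = fetch(share, local, "northwest110239", day);
        System.out.println(Files.readString(copy)); // then process the local copy
    }
}
```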
sebastia (Grand Master; Joined: 07 Oct 2004; Posts: 1003)
Posted: Thu Nov 08, 2007 8:06 am
Mr. mvic - there is a missing requirement that (maybe) rules out your implementation: the remote site is not a PC but a "Sales Point" machine, and I don't know whether it can do a "NET USE .... ".
Thanks anyway.
mvic (Jedi; Joined: 09 Mar 2004; Posts: 2080)
Posted: Thu Nov 08, 2007 8:47 am
sebastia wrote:
Thanks anyway.

You're welcome. I hope you get a good solution.
Vitor (Grand High Poobah; Joined: 11 Nov 2005; Posts: 26093; Location: Texas, USA)
Posted: Thu Nov 08, 2007 9:17 am
I'm guessing (but only guessing) that this is connected to this:
http://www.mqseries.net/phpBB2/viewtopic.php?t=40418
There are now 2 initially unconnected threads where a database has been recommended as a solution. I urge you to reconsider using a queue as a database, especially with your predicted volumes of 3,000 - 5,000 messages. Scanning the queue (as indicated by your other thread) and removing the specific message is going to perform very poorly.
_________________
Honesty is the best policy.
Insanity is the best defence.
sebastia (Grand Master; Joined: 07 Oct 2004; Posts: 1003)
Posted: Thu Nov 08, 2007 9:33 am
Vitor - you are right: both threads are about the same design.
The shop goes to the queue once every 15 minutes and searches for new prices. It has to be this way - prices change whenever the "central" site decides.
The remote machines are not PCs. They use Java to access the central site via MQ Client. We cannot install a database client. We have 3,000 boxes spread around Europe - we can't change this.
I did run some testing: I had 10 PCs browsing in a loop for a specific Message ID (connect + open + get/browse, close, disconnect - back to the beginning). Then we ran AMQSGETC to extract all 3,500 messages - it took 2 minutes.
Not bad - or "good enough" ...
mvic (Jedi; Joined: 09 Mar 2004; Posts: 2080)
Posted: Thu Nov 08, 2007 9:34 am
Vitor wrote:
especially with your predicted volumes of 3,000 - 5,000 messages. Scanning (as indicated by your other thread) and removing the specific message is going to be a very poor performer.

I don't think that's so clear if the usage is very low (from elsewhere in the thread there is one put and one get per client per day). If we're talking about a single MQGET-by-CorrelId on a 5000-depth queue, then I'd expect it to complete in sub-millisecond times on a quiet queue manager.
Much deeper queues, very busy systems, or many apps simultaneously putting to / getting from the same queue are all factors that will lengthen the times experienced by each app.
EDIT: Just saw sebastia's clarification above that the queue is read every 15 mins. Is there a new message for the client every 15 mins? Does the client do a destructive MQGET every 15 minutes?
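To make the cost model concrete, here is a pure-Java simulation (not the MQ API, and not how a queue manager is actually implemented internally) of a destructive get-by-CorrelId against a deep queue. The point it illustrates: a single matched get over a 5,000-deep queue is at worst one linear scan, which is cheap in isolation, but the cost grows with depth and with the number of clients scanning concurrently:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Simulation of an MQGET-by-CorrelId: scan entries until the
// correlation id matches, then remove that one message (destructive
// get). All names here are illustrative, not MQ API calls.
public class CorrelIdGetSim {

    record Msg(String correlId, String body) {}

    // Removes and returns the first message whose correlId matches,
    // or null if none is on the queue (the real MQGET would return
    // MQRC_NO_MSG_AVAILABLE).
    static Msg getByCorrelId(List<Msg> queue, String correlId) {
        Iterator<Msg> it = queue.iterator();
        while (it.hasNext()) {
            Msg m = it.next();
            if (m.correlId().equals(correlId)) {
                it.remove(); // destructive get
                return m;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        // A 5,000-deep queue, one message per shop.
        List<Msg> queue = new ArrayList<>();
        for (int i = 0; i < 5000; i++) {
            queue.add(new Msg("shop" + i, "prices for shop " + i));
        }
        // Worst case: the wanted message is last, so the whole queue is scanned.
        Msg mine = getByCorrelId(queue, "shop4999");
        System.out.println(mine.body() + ", depth now " + queue.size());
    }
}
```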
sebastia (Grand Master; Joined: 07 Oct 2004; Posts: 1003)
Posted: Thu Nov 08, 2007 9:49 am
The consumer tries to browse/get a message every 15 minutes. After a browse succeeds, a destructive get follows.
The producer sends 3,000 messages a day (max), maybe twice a day, all at the same time.
jefflowrey (Grand Poobah; Joined: 16 Oct 2002; Posts: 19981)
Posted: Thu Nov 08, 2007 9:53 am
I know we've worked through this design a bit already.
Can you implement it as request/reply instead, where each store sends a request for current catalog changes, rather than the central location doing a batch update?
Then you could do a simple match on CorrelID to retrieve the message intended for the store, instead of having to wade through the entire queue.
And the queue typically won't have a qdepth > 0.
_________________
I am *not* the model of the modern major general.
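The request/reply convention being suggested can be sketched in plain Java (again a simulation, not the MQ API): the replier copies the request's MsgId into the reply's CorrelId, so each store retrieves exactly its own reply with a simple CorrelId match, and the reply queue drains back toward depth 0. All class and method names here are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.UUID;

// Simulation of the MQ request/reply correlation convention:
// reply.CorrelId = request.MsgId, then the requester does a
// get-by-CorrelId for its own reply. Not IBM MQ API code.
public class RequestReplySim {

    record Msg(String msgId, String correlId, String body) {}

    // Store side: build a request with a fresh MsgId.
    static Msg newRequest(String store) {
        return new Msg(UUID.randomUUID().toString(), null,
                "catalog changes request from " + store);
    }

    // Central side: reply correlated to the request's MsgId.
    static Msg reply(Msg request, String body) {
        return new Msg(UUID.randomUUID().toString(), request.msgId(), body);
    }

    // Store side again: destructive get of the one matching reply.
    static Msg getReply(List<Msg> replyQueue, Msg request) {
        for (Msg m : new ArrayList<>(replyQueue)) {
            if (request.msgId().equals(m.correlId())) {
                replyQueue.remove(m);
                return m;
            }
        }
        return null; // no reply yet
    }

    public static void main(String[] args) {
        List<Msg> replyQueue = new ArrayList<>();
        Msg req = newRequest("shop0042");
        replyQueue.add(reply(req, "2 price changes"));
        Msg rep = getReply(replyQueue, req);
        System.out.println(rep.body() + ", depth now " + replyQueue.size());
    }
}
```

In real MQ the same effect is usually achieved by the replier honoring the request's report options so the reply carries the right correlation id; the net result is the one-to-one retrieval shown above.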
sebastia (Grand Master; Joined: 07 Oct 2004; Posts: 1003)
Posted: Tue Nov 27, 2007 9:36 am
Jeff - I shall try: I will propose it.
But I bet they will say: "we shall look for a new design, but meanwhile, let's go ahead with this one"!!
Thanks.