Gearman via XHR

Hi,
in a page I have to run a large task
(I'm still in two minds whether to keep it as one job or break it into more chunks; this test uses multiple tasks),
so I'm thinking of doing it with Gearman, triggered via XHR.
I ended up with this code
(just a test):
// in the page


// fire off the command script (starts the worker), then the client script
$.ajax({ url: "/command.php", success: function(data){} });
$.ajax({ url: "/client.php", success: function(data){
    alert(data); // shows the raw response from client.php
}});

I get
["ee9b24108865c41a2e17b220b62389cdf3bd6cd9"]["b9cb8a5965902b88167879d73bfde302a7028f44"]["5e5faffa858ab925fd02a53d4553498106d6c91c"]["212d3959622e8d7aaf3869fd3fc1de2b134fb82d"]["7040c8535161903c755267ac01c6d27ddfda2ba1"]

// command


exec('php worker.php');
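
One thing worth flagging: exec() waits for the command to finish, and worker.php loops forever, so as written this first XHR would never come back (and the parallel request to client.php might even run before the worker has registered). A sketch of one common workaround, assuming a Unix-like host: redirect the output and background the process so exec() returns immediately.

// command (non-blocking variant): detach the worker so this request can return
exec('php worker.php > /dev/null 2>&1 &');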

// worker


$worker = new GearmanWorker();
$worker->addServer(); // defaults to 127.0.0.1:4730
$worker->addFunction("hash", "my_hash_function");
while ($worker->work()); // keep taking jobs until an error occurs


function my_hash_function($job)
{
    // return the SHA-1 of the job payload as a JSON-encoded array
    return json_encode(array(sha1($job->workload())));
}

//client


$client = new GearmanClient();
$client->addServer();
$client->setOptions(GEARMAN_CLIENT_FREE_TASKS); // free each task after its callbacks run
$client->setCompleteCallback("done");
$tasks = array();
foreach (range(1, 5) as $i) {
    $tasks[] = $client->addTask("hash", "ABC123" . $i);
}
$result = $client->runTasks(); // blocks until all five tasks have completed

function done($task) {
    echo $task->data(); // the JSON string returned by the worker
    flush();
}

Do you think it's a good idea or not?
Is there a better way?
How could I use the data from the XHR?
(For now the only thing I can think of is ugly: appending a comma after each JSON string and splitting on it…)
Bye.

PS
In the real world I have to run a heavy SELECT query,
and I need the data in the page or at least in a session.
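
To avoid the comma-splitting hack, one option (a sketch, not from the thread; assumes PHP 5.3+ for the closure) is to have client.php collect each task's result and emit a single JSON array at the end, so the page can parse the XHR response in one go:

// client (alternative): buffer results, emit one parseable JSON payload
$results = array();

$client = new GearmanClient();
$client->addServer();
$client->setCompleteCallback(function ($task) use (&$results) {
    // each worker response is a JSON string; decode and collect it
    $results[] = json_decode($task->data(), true);
});
foreach (range(1, 5) as $i) {
    $client->addTask("hash", "ABC123" . $i);
}
$client->runTasks();

header('Content-Type: application/json');
echo json_encode($results);

On the page side, data is then a single JSON string that JSON.parse (or jQuery with dataType: "json") can handle directly.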

EDIT
A better way, IMHO:

//client


session_start();

$client = new GearmanClient();
$client->addServer();
$client->setOptions(GEARMAN_CLIENT_FREE_TASKS);
$client->setCompleteCallback("done");
$tasks = array();
foreach (range(1, 5) as $i) {
    $tasks[] = $client->addTask("hash", "ABC123" . $i);
}
$result = $client->runTasks();

function done($task) {
    // accumulate each result in the session as it arrives
    if (!isset($_SESSION['count'])) {
        $_SESSION['count'] = 0;
        $_SESSION['data'] = array();
    }
    $_SESSION['count']++;
    $_SESSION['data'][] = json_decode($task->data());
    echo $task->data();
    flush();
}
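
To get the stored results back into the page, a follow-up XHR could read them out of the session. A minimal sketch, where status.php is a hypothetical endpoint (note that with PHP's default file-based sessions, the session is locked per request, so this call will wait until client.php finishes unless client.php releases the lock with session_write_close()):

// status.php (hypothetical): report whatever the client run has stored so far
session_start();
header('Content-Type: application/json');
echo json_encode(array(
    'count' => isset($_SESSION['count']) ? $_SESSION['count'] : 0,
    'data'  => isset($_SESSION['data'])  ? $_SESSION['data']  : array(),
));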

Uppish :slight_smile:
Put it this way: in an action I have to
run a query that can retrieve from
1,000 up to 300,000 records, and
the server goes nuts (memory exhausted, with memory_limit = 1024MB), so I thought
of Gearman to balance the server load.
I could put the client directly in the
controller/action, but that way I'd have
to run the worker from the command line.

Sorry, I don’t speak Gearman. :wink:

On a serious note, you might get a better response on the Gearman site itself.

That sounds like a problem that Gearman won’t solve, unless you’re farming out the heavy lifting onto other servers running the workers. What problem(s) are you looking to get solved?

Thanks for the reply.
[What problem(s) are you looking to get solved?]

I have to run a heavy query that can retrieve up to about
300,000 records; that's the real problem.
I raised the memory limit to 512MB, but the server
still responds with a memory-exhausted error.
Maybe the only thing I could try is to optimize the query, but
it's very messy old code and the logic is quite ugly :frowning:

Out of curiosity:
does Gearman handle the server load better than the current script?
Be patient, it's the first time I've tried to use Gearman.

Why on earth would you need to load so many records at once? Can’t you process them in smaller chunks?

If you absolutely must, can you process them in a pipeline fashion (e.g. one at a time)? In that case, you can use a database cursor (if your RDBMS supports it) to load records as you process them.
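
For the MySQL/PDO case, the pipeline idea can look like this. A minimal sketch, assuming a PDO connection (credentials and table names are made up); turning off buffered queries makes the driver stream rows from the server instead of copying the whole result set into client memory first:

// sketch: stream a large result set one row at a time
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query('SELECT id, email FROM user');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    process($row); // only one row is held in PHP at a time
}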

Thanks for the advice.
Yeah, I absolutely have to load all the records at once.
For the other approach, could you give me a little example or
a link to a tutorial, please?
The query is built at runtime from a form,
so it's not always the same.

I'm wondering if creating a view that filters records
could help,
like this:


CREATE VIEW myuser AS
SELECT id, email, first_name, last_name
FROM user
WHERE status = 'confirmed';

and then I could run the SQL statement with additional WHERE clauses against myuser.

No, my idea isn't a good one :frowning:
http://www.sitepoint.com/forums/showthread.php?p=4755996
Sorry for the cross-posting (though it's not quite the same topic), but I'm just
in a little bit of a hurry :slight_smile:

I decided to make a stored procedure that
creates a new table on which to run the heavy query.
Thanks, everybody, for the help.
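
For the record, the materialize-then-query idea could look roughly like this. A sketch with hypothetical names (build_report_table and report_work are not from the thread), using the same ZF adapter as the later examples:

// run the (hypothetical) stored procedure that fills a smaller work table
$db->query('CALL build_report_table()');

// then run the dynamic per-form filters against the smaller table
$stmt = $db->query('SELECT * FROM report_work WHERE last_name LIKE ?', array('S%'));
while ($row = $stmt->fetch()) {
    process($row);
}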

@kyberfabrikken
I think I didn't fully understand your point,
so could you give me a little example or
a link to a tutorial, please?

I was being vague on purpose. What I mean is that you could perhaps take code like this:


// fetch the whole result set into memory, then iterate
$result = db_query("select * from foo");
foreach ($result->fetchAll() as $row) {
  process($row);
}

and turn it into something like this:


// fetch one row at a time; the full result set never becomes a PHP array
$result = db_query("select * from foo");
while ($row = $result->fetch()) {
  process($row);
}

I.e. only fetch one row from the result set at a time. MySQL has a concept called unbuffered queries, where it sends back each row (or a small buffer of rows) to the client (PHP) as they are read, rather than loading everything into memory at once.

There are some calculations that can’t be done like this, but those are fairly rare. What is it that you’re trying to do?

Thanks for the pointer.
I'm using ZF, and I'm used to doing it like this:


$stmt = $db->query('SELECT * FROM bugs');
$rows = $stmt->fetchAll(); // pulls every row into one big array
foreach ($rows as $row) {
  process($row);
}

but now I'll give this a try:


$stmt = $db->query('SELECT * FROM bugs');

while ($row = $stmt->fetch()) { // one row at a time
  process($row);
}

and, miracle, it works and PHP is still alive :slight_smile:

Just thanks a ton for your help.