Laravel - Running Jobs in Sequence

I am learning Laravel, working on a project that runs Horizon so I can learn about jobs. I am stuck at a point where I need to run the same job several times, one after another.

Here is what I am currently doing

<?php

namespace App\Http\Controllers;

use App\Jobs\SendMailJob;
use App\Models\Subscriptions;

class MailController extends Controller
{
    public function sendEmail()
    {
        Subscriptions::all()
            ->each(function ($subscription) {
                SendMailJob::dispatch($subscription);
            });
    }
}

This works fine, except that it runs the jobs across several workers and not in a guaranteed order. Is there any way to run the jobs one after another?

Everything depends on how many queue workers you will run.

If you run a single queue worker, the jobs will be processed in the order they were queued. However, if you run multiple queue workers, they will obviously run at the same time. This is how queues are meant to work: you add tasks, and they may run concurrently, in no guaranteed order.

Of course, if you want to make sure there is a pause between those jobs, you could add a sleep() inside each one. But assuming you are dispatching from a controller (which might not be a good idea anyway: what happens when you have a million subscriptions?), that is probably not the best solution.
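A lighter alternative to sleeping inside each job is to stagger the dispatches with increasing delays. A rough sketch (the 5-second spacing is an arbitrary choice, and note that delays space jobs out but still do not guarantee strict ordering once multiple workers are consuming the queue):

```php
// Stagger dispatches instead of calling sleep() inside each job.
Subscriptions::all()->each(function ($subscription, $index) {
    SendMailJob::dispatch($subscription)
        ->delay(now()->addSeconds($index * 5)); // 5s apart, arbitrary spacing
});
```

This keeps the controller non-blocking: the delay is recorded on the queued job itself rather than holding the request open.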

What you are looking for, as you mention in your question, is job chaining.

From the Laravel docs

Job chaining allows you to specify a list of queued jobs that should be run in sequence. If one job in the sequence fails, the rest of the jobs will not be run. To execute a queued job chain, you may use the withChain method on any of your dispatchable jobs:

ProcessPodcast::withChain([
    new OptimizePodcast,
    new ReleasePodcast
])->dispatch();

So in your example above

$mailJobs = Subscriptions::all()
    ->map(function ($subscription) {
        return new SendMailJob($subscription);
    });

// withChain() expects an array, so unwrap the Collection
Job::withChain($mailJobs->all())->dispatch();

Should give the expected result!

Update

If you do not want to use an initial job to chain from (as shown in the documentation example above), you should be able to make an empty job class that uses the Dispatchable trait. Then you can use my example above.
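For instance, a minimal chain-starter job might look like this (the class name StartMailChain is made up; any empty dispatchable job will do):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

// A no-op job that exists only so a chain can hang off it.
class StartMailChain implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function handle()
    {
        // Intentionally empty.
    }
}
```

You would then dispatch the chain with `StartMailChain::withChain($mailJobs->all())->dispatch();`.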

What you need is Job Chaining.

You can read all about it on the Laravel website: Chain

Good luck

Comments
  • Thanks for the answer. I was looking for a solution like dispatch->chain([new MailJob($item1), new MailJob($item2), new MailJob($item3)])
  • Chaining does what you describe it doing in your question, but you state that is not what you want. This answer provides the functionality you asked for. It should be the chosen answer.
  • Thanks for the answer. Noob question: where is the Job class coming from?
  • Hey josh, your solution worked well. But when one job fails, the entire chain stops.
  • Hi Vish, yes this is an unfortunate downside. It might be acceptable still if you engineer your jobs not to fail. What I mean by this is: you should treat a failed job as a bug in the application (rather than an expected exception). So you can use try, catch, validation etc... to avoid the job actually failing. Of course, this is a design decision and you will have to consider what is best for your situation! :)
  • Thanks, I am also trying Redis throttling.
  • A specific feature of chaining is that the next job in the chain is only run if the previous job does not fail. That's not a disadvantage; that is designed and expected behaviour. To run a series of jobs in sequence with no dependency on success, send them all to the same queue with a single worker running on that queue. The worker will only be able to process one job at a time.
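The approach in that last comment can be sketched as follows (the queue name "mail" is an arbitrary choice):

```php
// Push every mail job onto a dedicated queue...
Subscriptions::all()->each(function ($subscription) {
    SendMailJob::dispatch($subscription)->onQueue('mail');
});

// ...then run exactly one worker for that queue, so jobs are
// processed one at a time in FIFO order, regardless of whether
// any individual job fails:
//
//     php artisan queue:work --queue=mail
```

Unlike chaining, a failed job here does not stop the rest: the single worker simply moves on to the next job in the queue.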