One Comment

    Gray

    My question is: if two or more scheduler threads are creating tasks and submitting them to the executor service concurrently, the above throws a RejectedExecutionException with one scheduler task not finished (i.e. the file not sent).

    Let’s take a look at the javadocs for ExecutorService.execute(...)

    RejectedExecutionException – if this task cannot be accepted for execution.

    In looking at the ThreadPoolExecutor (and associated) code, the jobs get rejected for 2 reasons:

    • The queue for the jobs is full (this doesn’t apply to you because the queues are by default unbounded)
    • The executor service is no longer running (ding ding ding)

    I believe that your executor service has been shut down, most likely because the first of your threads called validateTasksExecution() before the 2nd thread called executeJob(...). Your code is incorrect if you are trying to reuse that thread-pool. The fact that you are also closing the connectionManager() makes me wonder whether you want to re-use the SftpTaskExecutor at all.
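
    To see that failure mode in isolation, here's a minimal, self-contained sketch (nothing to do with your classes) showing that a pool which has been shut down rejects any further submissions:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.RejectedExecutionException;

    public class ShutdownRejectionDemo {
        public static void main(String[] args) {
            ExecutorService pool = Executors.newFixedThreadPool(2);
            // accepted: the pool is still running
            pool.submit(() -> System.out.println("first file sent"));
            // shut the pool down ...
            pool.shutdown();
            try {
                // ... and any submit after that is rejected
                pool.submit(() -> System.out.println("never runs"));
            } catch (RejectedExecutionException e) {
                System.out.println("rejected: the executor is no longer running");
            }
        }
    }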

    If you want each thread to see if its operation is done but have the thread-pool stay running, then you need to save the Future(s) returned by the ExecutorService.submit(...) method and call get() on them. That will tell you when the jobs are done.

    Something like:

    public Future<Void> createTask(Report report) {
        return sftpTaskExecutor.executeJob(new JobProcessorTask(...));
    }

    public void validateTasksExecution(Future<Void> future) {
        try {
            // blocks until this particular job has finished
            future.get();
        } catch (InterruptedException e) {
            // restore the interrupt flag and bail out
            Thread.currentThread().interrupt();
        } catch (ExecutionException e) {
            // the job itself threw; e.getCause() is the real problem
        }
    }

    public void shutdown() {
        sftpTaskExecutor.shutdown();
        connectionManager.disconnect();
    }

    ...

    public Future<Void> executeJob(JobProcessorTask jobProcessorTask) {
        return executorService.submit(jobProcessorTask);
    }
    

    If you need to monitor multiple jobs then you should store their Futures in a collection and call get() on each one serially; the jobs themselves will still be running in parallel. Something like the sketch below.
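
    Roughly (sendAll(...) is just a name I made up; it assumes a List<Report> and the createTask(...) sketch above):

    public void sendAll(List<Report> reports) throws InterruptedException, ExecutionException {
        List<Future<Void>> futures = new ArrayList<>();
        for (Report report : reports) {
            // submit everything up front; the jobs run in parallel
            futures.add(createTask(report));
        }
        for (Future<Void> future : futures) {
            // wait for each job in turn; get() throws if that job failed
            future.get();
        }
    }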

    The alternative would be for you to have a separate ExecutorService for each transaction, which is wasteful but maybe not so bad considering that it is managing sftp calls.
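
    Roughly (sendReport(...) is again just an illustrative name):

    public void sendReport(Report report) throws InterruptedException, ExecutionException {
        // throwaway executor for this one transaction
        ExecutorService executorService = Executors.newSingleThreadExecutor();
        try {
            executorService.submit(new JobProcessorTask(...)).get();
        } finally {
            executorService.shutdown();
        }
    }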

    As for waiting like this:

    while (!sftpTaskExecutor.getExecutorService().isTerminated()) ;
    

    Yeah, you don’t want to spin like that. See the awaitTermination(...) javadocs; something like the sketch below would be better.
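
    A minimal sketch (the 10-minute timeout is just a placeholder):

    ExecutorService executorService = sftpTaskExecutor.getExecutorService();
    executorService.shutdown();
    try {
        // blocks without spinning until every submitted job has finished,
        // or gives up after the timeout
        if (!executorService.awaitTermination(10, TimeUnit.MINUTES)) {
            // timed out; decide whether to cancel the remaining jobs (shutdownNow())
        }
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }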
