quarkus-jberet's Introduction

Quarkus JBeret Extension


The Quarkus JBeret Extension adds support for JSR-352 (Batch Applications for the Java Platform). JBeret is an implementation of JSR-352.

Usage

To use the extension, add the dependency to the target project:

<dependency>
  <groupId>io.quarkiverse.jberet</groupId>
  <artifactId>quarkus-jberet</artifactId>
  <version>2.2.0</version>
</dependency>

ℹ️ Recommended Quarkus version: 3.9.0 or higher
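
For Gradle builds, the equivalent dependency declaration would be (a sketch derived from the Maven coordinates above):

implementation("io.quarkiverse.jberet:quarkus-jberet:2.2.0")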

The Batch API and Runtime will be available out of the box. Please refer to the Batch documentation or the JBeret documentation to learn about Batch Applications.

Configuration

The JBeret Quarkus extension supports the following configuration:

quarkus.jberet.repository
  The repository type to store JBeret and Job data. A jdbc type requires a JDBC datasource.
  Type: in-memory, jdbc. Default: in-memory

quarkus.jberet.repository.jdbc.datasource
  The datasource name.
  Type: string. Default: <default>

quarkus.jberet.jobs.includes
  A list of patterns to match batch files to include.
  Type: list of string

quarkus.jberet.jobs.excludes
  A list of patterns to match batch files to exclude.
  Type: list of string

quarkus.jberet.job."job-name".cron
  A cron-style expression in Quartz format to schedule the job.
  Type: string

quarkus.jberet.job."job-name".params."param-key"
  A parameter to start the job with.
  Type: string

quarkus.jberet.max-async
  Controls the number of threads that can be used by JBeret. An additional thread for JBeret coordination is always added, so setting this to 1 provides one thread for job executions.
  Type: string. Default: based on available cores
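
For example, a jdbc job repository backed by a named datasource could be configured like this (a sketch; the datasource name is illustrative and the datasource itself must be defined separately via the usual quarkus.datasource properties):

quarkus.jberet.repository=jdbc
quarkus.jberet.repository.jdbc.datasource=batch
quarkus.jberet.max-async=4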

Non-standard Features

Simplified Configuration

The Batch API requires the @BatchProperty annotation to inject specific configuration from the batch definition file. Instead, you can use the @ConfigProperty annotation, which Quarkus already uses to inject configuration properties via the MicroProfile Config API, keeping things consistent:

@Inject
@BatchProperty(name = "job.config.name")
String batchConfig;

// This is equivalent to the @BatchProperty injection
@ConfigProperty(name = "job.config.name")
Optional<String> mpConfig;

There is a slight limitation, though: since job configuration is mostly dynamic and only injected on job execution, Quarkus may fail to start due to invalid configuration (the Job configuration values cannot be found). In this case, configuration injection points with the @ConfigProperty annotation need to set a default value or use an Optional.
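
For example (a sketch; the property name mirrors the snippet above):

// Providing a default avoids the startup failure when the job is not yet running
@ConfigProperty(name = "job.config.name", defaultValue = "fallback")
String mpConfig;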

CDI Beans

The Batch APIs JobOperator and JobRepository are available as CDI beans, so they can be injected directly into any code:

@Inject
JobOperator jobOperator;
@Inject
JobRepository jobRepository;

void start() {
    long executionId = jobOperator.start("batchlet", new Properties());
    JobExecution jobExecution = jobRepository.getJobExecution(executionId);
}

It is possible to provide a Job definition via a CDI producer (instead of using XML):

@ApplicationScoped
public static class JobProducer {
    @Produces
    @Named
    public Job job() {
        return new JobBuilder("job")
                .step(new StepBuilder("step").batchlet("batchlet", new String[] {}).build())
                .build();
    }
}

A Job registered with CDI is named by the value provided in the @Named annotation or, if no value is provided, by the method name. The @Named annotation is required regardless.
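
A produced Job like the one above can then be started by its registered name, for instance (a sketch reusing the injected JobOperator from the previous example):

long executionId = jobOperator.start("job", new Properties());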

Additional Beans

A Quarkus-specific implementation is available in QuarkusJobOperator, which can also be injected directly:

@Inject
QuarkusJobOperator jobOperator;

void start() {
    Job job = new JobBuilder("programmatic")
            .step(new StepBuilder("programmaticStep")
                    .batchlet("programmaticBatchlet")
                    .build())
            .build();

    long executionId = jobOperator.start(job, new Properties());
    JobExecution jobExecution = jobOperator.getJobExecution(executionId);
}

With QuarkusJobOperator, it is possible to define and start programmatic Jobs using the JBeret Programmatic Job Definition.

Scheduler

The JBeret Scheduler is integrated out of the box in this extension.

To schedule a Job execution, please refer to the quarkus.jberet.job."job-name".cron and
quarkus.jberet.job."job-name".params."param-key" configurations.
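
For example, to run a hypothetical job named my-job every 10 seconds with a single parameter (a sketch; the cron expression uses the Quartz format):

quarkus.jberet.job."my-job".cron=0/10 * * * * ?
quarkus.jberet.job."my-job".params."my-param"=my-value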

A Job can also be scheduled programmatically, using the JobScheduler API and the Quarkus startup event:

@ApplicationScoped
public class Scheduler {
    @Inject
    JobScheduler jobScheduler;

    void onStart(@Observes StartupEvent startupEvent) {
        final JobScheduleConfig scheduleConfig = JobScheduleConfigBuilder.newInstance()
                .jobName("scheduler")
                .initialDelay(0)
                .build();

        jobScheduler.schedule(scheduleConfig);
    }
}

The JobScheduler does not support persistent schedules.

REST API

JBeret REST is integrated as a separate extension that can easily be added to the target project with the following dependency:

<dependency>
  <groupId>io.quarkiverse.jberet</groupId>
  <artifactId>quarkus-jberet-rest</artifactId>
  <version>2.0.0</version>
</dependency>

The JBeret REST API provides REST resources for several operations around the Batch API: starting and stopping jobs, querying the status of a job, scheduling a job, and many more. The extension includes a REST client to simplify the REST API calls:

@Inject
BatchClient batchClient;

void start() throws Exception {
    JobExecutionEntity jobExecutionEntity = batchClient.startJob("batchlet", new Properties());
}
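
The same REST resources can also be called directly over HTTP; for example, starting a job with curl (a sketch, assuming the default JBeret REST paths; host, port, and job name are illustrative):

curl -X POST 'http://localhost:8080/jobs/batchlet/start' \
  -H 'Content-Type: application/json' \
  -d '{}'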

Example Applications

Example applications can be found inside the integration-tests folder:

  • chunk - A simple Job that reads, processes, and stores data from a file.
  • jdbc-repository - A Job that uses a jdbc datasource to store JBeret and Job metadata.
  • scheduler - Schedules a Job to run every 10 seconds.

Or take a look at the World of Warcraft Auctions - Batch Application. It downloads the World of Warcraft Auction House data and provides statistics about item prices.

Native Image Limitations

The Quarkus JBeret Extension fully supports GraalVM Native Image, with the following exceptions:

  • Scripting Languages. While JavaScript should work, it is unlikely that other scripting languages will be supported in GraalVM via JSR-223.

Contributors ✨

Thanks goes to these wonderful people (emoji key):


Roberto Cortez

💻 🚧

This project follows the all-contributors specification. Contributions of any kind welcome!



quarkus-jberet's Issues

Error during startup of Quarkus 3.0.0.Alpha6

I am using version 2.0.0 of the extension. My project uses Quarkus 3.0.0.Alpha6, and in dev mode I get the following error during Quarkus startup.

2023-03-17 17:44:30,421 ERROR [io.qua.dev.dep.DevUIProcessor] (build-2) Failed to process extension descriptor jar:file:///home/nyko/.m2/repo/io/quarkiverse/jberet/quarkus-jberet/2.0.0/quarkus-jberet-2.0.0.jar!/META-INF/quarkus-extension.yaml: java.lang.ClassCastException: class java.lang.String cannot be cast to class java.util.List (java.lang.String and java.util.List are in module java.base of loader 'bootstrap')
	at io.quarkus.devui.deployment.DevUIProcessor.lambda$getAllExtensions$1(DevUIProcessor.java:397)
	at io.quarkus.runtime.util.ClassPathUtils.lambda$consumeAsPath$0(ClassPathUtils.java:121)
	at io.quarkus.runtime.util.ClassPathUtils.processAsPath(ClassPathUtils.java:154)
	at io.quarkus.runtime.util.ClassPathUtils.consumeAsPath(ClassPathUtils.java:120)
	at io.quarkus.runtime.util.ClassPathUtils.consumeAsPaths(ClassPathUtils.java:104)
	at io.quarkus.runtime.util.ClassPathUtils.consumeAsPaths(ClassPathUtils.java:85)
	at io.quarkus.devui.deployment.DevUIProcessor.getAllExtensions(DevUIProcessor.java:358)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at io.quarkus.deployment.ExtensionLoader$3.execute(ExtensionLoader.java:909)
	at io.quarkus.builder.BuildContext.run(BuildContext.java:282)
	at org.jboss.threads.ContextHandler$1.runWith(ContextHandler.java:18)
	at org.jboss.threads.EnhancedQueueExecutor$Task.run(EnhancedQueueExecutor.java:2513)
	at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1538)
	at java.base/java.lang.Thread.run(Thread.java:829)
	at org.jboss.threads.JBossThread.run(JBossThread.java:501)

Any idea what the reason for this error is?

Support @StepScoped / @JobScoped

JBeret's CDI extension provides the @StepScoped / @JobScoped annotations. Since Quarkus does not support portable CDI extensions, they don't work: the annotations are on the classpath but do nothing, and Quarkus fails at build time because any dependency using @StepScoped / @JobScoped beans won't work.

Reference documentation to implement custom contexts: https://quarkus.io/guides/cdi-integration#custom_context

This also seems to be quite useful:
https://stackoverflow.com/questions/60546069/how-to-create-a-custom-scope-in-quarkus/60725194#60725194

Here is a naive, unfinished implementation:

import io.quarkus.arc.InjectableBean;
import io.quarkus.arc.InjectableContext;
import jakarta.batch.runtime.context.StepContext;
import jakarta.enterprise.context.ContextNotActiveException;
import jakarta.enterprise.context.spi.Contextual;
import jakarta.enterprise.context.spi.CreationalContext;
import org.jberet.cdi.StepScoped;
import org.jberet.creation.ArtifactCreationContext;

import java.lang.annotation.Annotation;
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;
import java.util.stream.Collectors;

public class StepScope implements InjectableContext {

    private final Map<StepContext, StepContextState> stepStateMap = new ConcurrentHashMap<>();

    @SuppressWarnings({"rawtypes", "unchecked"})
    @Override
    public void destroy() {
        stepStateMap.values()
                .forEach(StepContextState::destroy);
    }

    @Override
    public void destroy(Contextual<?> contextual) {
        stepStateMap.values()
                .forEach(stepContextState -> stepContextState.destroy(contextual));
    }


    @Override
    public ContextState getState() {
        return getStepContextState();
    }

    private StepContextState getStepContextState() {
        StepContext stepContext = getCurrentStepContext();
        if (stepContext == null) {
            throw new ContextNotActiveException("StepContext not available (step not running), StepScope is not active.");
        }
        return stepStateMap.computeIfAbsent(stepContext, absentStepContext -> new StepContextState());
    }

    @Override
    public Class<? extends Annotation> getScope() {
        return StepScoped.class;
    }

    @Override
    public <T> T get(Contextual<T> contextual, CreationalContext<T> creationalContext) {
        StepContextState stepContextState = getStepContextState();
        return stepContextState.get(contextual, creationalContext);
    }

    @Override
    public <T> T get(Contextual<T> contextual) {
        StepContextState stepContextState = getStepContextState();
        return stepContextState.get(contextual);
    }

    @Override
    public boolean isActive() {
        ArtifactCreationContext currentArtifactCreationContext = ArtifactCreationContext.getCurrentArtifactCreationContext();
        return currentArtifactCreationContext.getStepContext() != null;
    }

    private StepContext getCurrentStepContext() {
        return ArtifactCreationContext.getCurrentArtifactCreationContext().getStepContext();
    }

    static class StepContextState implements ContextState {

        private final Map<InjectableBean<?>, ScopedInstance<?>> injectableBeanScopedInstanceMap = new ConcurrentHashMap<>();

        @Override
        public Map<InjectableBean<?>, Object> getContextualInstances() {
            return injectableBeanScopedInstanceMap.entrySet()
                    .stream()
                    .collect(Collectors.toMap(Map.Entry::getKey, entry -> entry.getValue().instance));
        }

        @SuppressWarnings({"unchecked"})
        public void destroy() {
            injectableBeanScopedInstanceMap.forEach((contextual, scopedInstance) -> destroy((Contextual) contextual, scopedInstance));
        }

        @SuppressWarnings({"rawtypes", "unchecked"})
        public void destroy(Contextual<?> contextual) {
            getScopedInstanceOptional(contextual)
                    .ifPresent(scopedInstance -> destroy((Contextual) contextual, scopedInstance));
        }

        private static <T> void destroy(Contextual<T> contextual, ScopedInstance<T> scopedInstance) {
            CreationalContext<T> creationalContext = scopedInstance.creationalContext;
            T instance = scopedInstance.instance;
            contextual.destroy(instance, creationalContext);
        }

        public <T> T get(Contextual<T> contextual) {
            return getScopedInstanceOptional(contextual)
                    .map(ScopedInstance::instance)
                    .orElse(null);
        }

        @SuppressWarnings("unchecked")
        public <T> T get(Contextual<T> contextual, CreationalContext<T> creationalContext) {
            InjectableBean<T> injectableBean = (InjectableBean<T>) contextual;
            return (T) injectableBeanScopedInstanceMap.computeIfAbsent(injectableBean, absentInjectableBean -> createScopedInstance(contextual, creationalContext)).instance;
        }

        private <T> ScopedInstance<T> createScopedInstance(Contextual<T> contextual, CreationalContext<T> creationalContext) {
            T instance = contextual.create(creationalContext);
            return new ScopedInstance<>(instance, creationalContext);
        }

        @SuppressWarnings("unchecked")
        private <T> Optional<ScopedInstance<T>> getScopedInstanceOptional(Contextual<T> contextual) {
            ScopedInstance<T> scopedInstance = (ScopedInstance<T>) injectableBeanScopedInstanceMap.get(contextual);
            return Optional.ofNullable(scopedInstance);
        }
    }

    public record ScopedInstance<T>(
            T instance,
            CreationalContext<T> creationalContext
    ) {

    }
}
import io.quarkus.arc.deployment.ContextRegistrationPhaseBuildItem;
import io.quarkus.arc.deployment.ContextRegistrationPhaseBuildItem.ContextConfiguratorBuildItem;
import io.quarkus.arc.deployment.CustomScopeBuildItem;
import io.quarkus.deployment.annotations.BuildStep;
import org.jberet.cdi.StepScoped;
import org.jboss.jandex.DotName;

public class JBeretCustomBuildStep {

    @BuildStep
    ContextConfiguratorBuildItem registerContext(ContextRegistrationPhaseBuildItem phase) {
        return new ContextConfiguratorBuildItem(phase.getContext().configure(StepScoped.class).normal().contextClass(StepScope.class));
    }

    @BuildStep
    CustomScopeBuildItem customScope() {
        return new CustomScopeBuildItem(DotName.createSimple(StepScoped.class.getName()));
    }

}

Scheduler fires job execution on Quarkus startup

I configured a cron expression for my job in application.properties like:

quarkus.jberet.job."jobname".cron=0 0 19 ? * *

Expected:
This cron should start the job every day at 7pm.

Current situation:
Job is started after Quarkus starts AND every day at 7pm.

Why is the job started when Quarkus is launched?

Enlisted connection without active transaction in later Quarkus version

Description

I tried to upgrade a working example I had from Quarkus 3.4.0 and Quarkus JBeret 2.0.0 to the latest and greatest (3.9.5 and 2.3.0), but I started getting the error below.

I tried to lower the versions to see where the error is coming from, and the highest I got working is:
Quarkus: 3.8.4
Quarkus JBeret: 2.3.0 (2.3.1 seems to need Quarkus 3.9.0, or else it gives me some Hibernate config converter errors).

The exception:

2024-05-04 10:22:10,850 WARN  [com.arj.ats.jta] (executor-thread-2) ARJUNA016045: attempted rollback of < formatId=131077, gtrid_length=35, bqual_length=36, tx_uid=0:ffffc0a80106:a4bf:6635e220:1b,
 node_name=quarkus, branch_uid=0:ffffc0a80106:a4bf:6635e220:22, subordinatenodename=null, eis_name=0 > (io.agroal.narayana.LocalXAResource@34af3c75) failed with exception code XAException.XAER_RME
RR: javax.transaction.xa.XAException: Error trying to transactionRollback local transaction: Enlisted connection used without active transaction
        at io.agroal.narayana.XAExceptionUtils.xaException(XAExceptionUtils.java:20)
        at io.agroal.narayana.XAExceptionUtils.xaException(XAExceptionUtils.java:8)
        at io.agroal.narayana.LocalXAResource.rollback(LocalXAResource.java:89)
        at com.arjuna.ats.internal.jta.resources.arjunacore.XAResourceRecord.topLevelAbort(XAResourceRecord.java:338)
        at com.arjuna.ats.internal.jta.transaction.arjunacore.TransactionImple.enlistResource(TransactionImple.java:644)
        at com.arjuna.ats.internal.jta.transaction.arjunacore.TransactionImple.enlistResource(TransactionImple.java:398)
        at io.agroal.narayana.NarayanaTransactionIntegration.associate(NarayanaTransactionIntegration.java:120)
        at io.agroal.pool.ConnectionPool.getConnection(ConnectionPool.java:257)
        at io.agroal.pool.DataSource.getConnection(DataSource.java:86)
        at io.quarkus.hibernate.orm.runtime.customized.QuarkusConnectionProvider.getConnection(QuarkusConnectionProvider.java:23)
        at org.hibernate.internal.NonContextualJdbcConnectionAccess.obtainConnection(NonContextualJdbcConnectionAccess.java:46)
        at org.hibernate.resource.jdbc.internal.LogicalConnectionManagedImpl.acquireConnectionIfNeeded(LogicalConnectionManagedImpl.java:113)
        at org.hibernate.resource.jdbc.internal.LogicalConnectionManagedImpl.getPhysicalConnection(LogicalConnectionManagedImpl.java:143)
        at org.hibernate.engine.jdbc.internal.StatementPreparerImpl.connection(StatementPreparerImpl.java:54)
        at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$5.doPrepare(StatementPreparerImpl.java:153)
        at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$StatementPreparationTemplate.prepareStatement(StatementPreparerImpl.java:183)
        at org.hibernate.engine.jdbc.internal.StatementPreparerImpl.prepareQueryStatement(StatementPreparerImpl.java:155)
        at org.hibernate.sql.exec.spi.JdbcSelectExecutor.lambda$list$0(JdbcSelectExecutor.java:85)
        at org.hibernate.sql.results.jdbc.internal.DeferredResultSetAccess.executeQuery(DeferredResultSetAccess.java:231)
        at org.hibernate.sql.results.jdbc.internal.DeferredResultSetAccess.getResultSet(DeferredResultSetAccess.java:167)
        at org.hibernate.sql.results.jdbc.internal.JdbcValuesResultSetImpl.advanceNext(JdbcValuesResultSetImpl.java:218)
        at org.hibernate.sql.results.jdbc.internal.JdbcValuesResultSetImpl.processNext(JdbcValuesResultSetImpl.java:98)
        at org.hibernate.sql.results.jdbc.internal.AbstractJdbcValues.next(AbstractJdbcValues.java:19)
        at org.hibernate.sql.results.internal.RowProcessingStateStandardImpl.next(RowProcessingStateStandardImpl.java:66)
        at org.hibernate.sql.results.spi.ListResultsConsumer.consume(ListResultsConsumer.java:202)
        at org.hibernate.sql.results.spi.ListResultsConsumer.consume(ListResultsConsumer.java:33)
        at org.hibernate.sql.exec.internal.JdbcSelectExecutorStandardImpl.doExecuteQuery(JdbcSelectExecutorStandardImpl.java:209)
        at org.hibernate.sql.exec.internal.JdbcSelectExecutorStandardImpl.executeQuery(JdbcSelectExecutorStandardImpl.java:83)
        at org.hibernate.sql.exec.spi.JdbcSelectExecutor.list(JdbcSelectExecutor.java:76)
        at org.hibernate.sql.exec.spi.JdbcSelectExecutor.list(JdbcSelectExecutor.java:65)
        at org.hibernate.query.sqm.internal.ConcreteSqmSelectQueryPlan.lambda$new$2(ConcreteSqmSelectQueryPlan.java:137)
        at org.hibernate.query.sqm.internal.ConcreteSqmSelectQueryPlan.withCacheableSqmInterpretation(ConcreteSqmSelectQueryPlan.java:362)
        at org.hibernate.query.sqm.internal.ConcreteSqmSelectQueryPlan.performList(ConcreteSqmSelectQueryPlan.java:303)
        at org.hibernate.query.sqm.internal.QuerySqmImpl.doList(QuerySqmImpl.java:509)
        at org.hibernate.query.spi.AbstractSelectionQuery.list(AbstractSelectionQuery.java:427)
        at org.hibernate.query.Query.getResultList(Query.java:120)
        at io.quarkus.hibernate.orm.panache.common.runtime.CommonPanacheQueryImpl.list(CommonPanacheQueryImpl.java:280)
        at io.quarkus.hibernate.orm.panache.runtime.PanacheQueryImpl.list(PanacheQueryImpl.java:149)
        at org.acme.entity.Expense.findAllExpensesOfMonth(Expense.java:51)
        at org.acme.batch.bill.ExpenseItemReader.open(ExpenseItemReader.java:42)
        at org.jberet.runtime.runner.ChunkRunner.run(ChunkRunner.java:195)
        at org.jberet.runtime.runner.StepExecutionRunner.runBatchletOrChunk(StepExecutionRunner.java:223)
        at org.jberet.runtime.runner.StepExecutionRunner.run(StepExecutionRunner.java:142)
        at org.jberet.runtime.runner.CompositeExecutionRunner.runStep(CompositeExecutionRunner.java:170)
        at org.jberet.runtime.runner.CompositeExecutionRunner.runFromHeadOrRestartPoint(CompositeExecutionRunner.java:94)
        at org.jberet.runtime.runner.JobExecutionRunner.run(JobExecutionRunner.java:58)
        at org.jberet.spi.JobExecutor$1.run(JobExecutor.java:100)
        at io.smallrye.context.impl.wrappers.SlowContextualRunnable.run(SlowContextualRunnable.java:19)
        at io.quarkus.vertx.core.runtime.VertxCoreRecorder$14.runWith(VertxCoreRecorder.java:582)
        at org.jboss.threads.EnhancedQueueExecutor$Task.run(EnhancedQueueExecutor.java:2513)
        at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1512)
        at org.jboss.threads.DelegatingRunnable.run(DelegatingRunnable.java:29)
        at org.jboss.threads.ThreadLocalResettingRunnable.run(ThreadLocalResettingRunnable.java:29)
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.sql.SQLException: Enlisted connection used without active transaction
        at io.agroal.pool.ConnectionHandler.verifyEnlistment(ConnectionHandler.java:381)
        at io.agroal.pool.ConnectionHandler.transactionRollback(ConnectionHandler.java:352)
        at io.agroal.narayana.LocalXAResource.rollback(LocalXAResource.java:86)
        ... 52 more

My setup

The batch-job.xml looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<job id="billing-job" xmlns="http://xmlns.jcp.org/xml/ns/javaee" version="1.0">
    <step id="createBillItems" >
        <chunk>
            <reader ref="expenseItemReader"/>
            <processor ref="billItemGenerator"/>
            <writer ref="billItemWriter"/>
        </chunk>    
    </step>
</job>

My reader:

package org.acme.batch.bill;

import java.io.Serializable;
import java.util.List;
import java.util.Properties;

import org.acme.entity.Expense;
import org.eclipse.microprofile.config.inject.ConfigProperty;

import jakarta.batch.api.chunk.AbstractItemReader;
import jakarta.batch.operations.JobOperator;
import jakarta.batch.runtime.context.JobContext;
import jakarta.enterprise.context.Dependent;
import jakarta.inject.Inject;
import jakarta.inject.Named;

@Dependent
@Named
public class ExpenseItemReader extends AbstractItemReader {

  @Inject
  private JobOperator jobOperator;

  @Inject
  private JobContext jobContext;

  @Inject
  @ConfigProperty(name = "month", defaultValue = "1")
  private int month;
  @Inject
  @ConfigProperty(name = "year", defaultValue = "2020")
  private int year;

  private List<Expense> items;
  private int currentIndex = 0;

  @Override
  public void open(Serializable checkpoint) throws Exception {
    Properties properties = jobOperator.getParameters(jobContext.getExecutionId());
    month = Integer.parseInt(properties.getProperty("month"));
    year = Integer.parseInt(properties.getProperty("year"));
    items = Expense.findAllExpensesOfMonth(month, year);
  }

  @Override
  public Object readItem() throws Exception {
    if (currentIndex < items.size()) {
      return items.get(currentIndex++);
    } else {
      return null;
    }
  }
}

My writer:

package org.acme.batch.bill;

import java.util.List;

import org.acme.entity.BillItem;

import jakarta.batch.api.chunk.AbstractItemWriter;
import jakarta.enterprise.context.Dependent;
import jakarta.inject.Named;
import jakarta.transaction.Transactional;

@Dependent
@Named
public class BillItemWriter extends AbstractItemWriter {

  @Transactional
  @Override
  public void writeItems(List<Object> items) throws Exception {
    for (Object item : items) {
      if (item instanceof BillItem billItem) {
        billItem.persist();
      } else if (item instanceof List list) {
        writeItems(list);
      }
    }
  }
}

Cannot restart a failed job on quarkus-app restart (jdbc repository)

  • Started a quarkus app (mvnw quarkus:dev) with jberet jdbc-repository.
  • Submitted a chunk based job, which results in failure.
  • Stopped quarkus app
  • Start quarkus app, and restart the job using the jberet-rest endpoint (/lms/jobexecutions/{jobExecutionId}/restart)
  • We get a java.lang.UnsupportedOperationException
    snippet of stack trace
    at io.quarkiverse.jberet.runtime.QuarkusBatchEnvironment.getJobXmlResolver(QuarkusBatchEnvironment.java:69) at org.jberet.operations.AbstractJobOperator.restartFailedOrStopped(AbstractJobOperator.java:402) at org.jberet.operations.AbstractJobOperator.restart(AbstractJobOperator.java:253) at org.jberet.operations.AbstractJobOperator.restart(AbstractJobOperator.java:225) at io.quarkiverse.jberet.runtime.QuarkusJobOperator.restart(QuarkusJobOperator.java:81) at org.jberet.operations.DelegatingJobOperator.restart(DelegatingJobOperator.java:73) at org.jberet.rest.service.JobService.restart(JobService.java:166) at org.jberet.rest.resource.JobExecutionResource.restart(JobExecutionResource.java:150)

This happens because we cannot find a Job in AbstractJobOperator at
https://github.com/jberet/jsr352/blob/738f1805a872ec3a2618c1cd2b1726b6afe8c458/jberet-core/src/main/java/org/jberet/operations/AbstractJobOperator.java#L387
which triggers a batchEnvironment.getJobXmlResolver()
at https://github.com/jberet/jsr352/blob/738f1805a872ec3a2618c1cd2b1726b6afe8c458/jberet-core/src/main/java/org/jberet/operations/AbstractJobOperator.java#L402

If we restart the job without restarting our quarkus-app, we don't face the above issue.

GraalVM 20.3.0: ServiceLoader can not find JobOperator

Hello,
I use quarkus-jberet 0.0.2 in a project with 10 splitting steps. Everything runs fine with any Java packaging (fast-jar, uber-jar). If I package this project natively with GraalVM, I see the warning "The ServiceLoader was unable to find an implementation for JobOperator. Check classpath for META-INF/services/javax.batch.operations.JobOperator file." The first instantiation of a JobOperator then ends with a NullPointerException. What can I do?

JdbcRepository customization

Is there any way to specify custom properties or use a custom DDL for table creation?
I'm asking because I want to create the tables with a specific schema, but I haven't found any solution yet.
JBeretRepositoryFactory.getJobRepository() is the static factory method used to create the JobRepository instance, but at line 20 an empty Properties object is passed to the JdbcRepository constructor, so no customization is possible.
A possible solution would be to add support in JBeretConfig for the properties managed by JdbcRepository, or to integrate JBeretConfig with the jberet.properties content.

context accessible: module java.base does not "opens java.security" to unnamed module

Hello, I'm just trying to make a very simple command that launches a jberet job.

But even the most basic code has problems.
The example is here: https://github.com/sekaijin/jberet-quarkus/blob/main/src/main/java/fr/sekaijin/jberet/Main.java

and the error is:

2022-06-28 14:58:08,244 ERROR [io.qua.run.boo.StartupActionImpl] (Quarkus Main Thread) Error running Quarkus: java.lang.reflect.InvocationTargetException
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:568)
        at io.quarkus.runner.bootstrap.StartupActionImpl$1.run(StartupActionImpl.java:103)
        at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.ExceptionInInitializerError
        at org.jberet.runtime.JobExecutionImpl.<init>(JobExecutionImpl.java:75)
        at org.jberet.repository.InMemoryRepository.createJobExecution(InMemoryRepository.java:195)
        at org.jberet.operations.AbstractJobOperator.startJobExecution(AbstractJobOperator.java:445)
        at org.jberet.operations.AbstractJobOperator.access$000(AbstractJobOperator.java:60)
        at org.jberet.operations.AbstractJobOperator$1.invoke(AbstractJobOperator.java:146)
        at org.jberet.operations.AbstractJobOperator$1.invoke(AbstractJobOperator.java:142)
        at org.jberet.operations.AbstractJobOperator.invokeTransaction(AbstractJobOperator.java:465)
        at org.jberet.operations.AbstractJobOperator.start(AbstractJobOperator.java:142)
        at org.jberet.operations.AbstractJobOperator.start(AbstractJobOperator.java:119)
        at fr.sekaijin.jberet.Main.start(Main.java:28)
        at fr.sekaijin.jberet.Main_Subclass.start$$superforward1(Unknown Source)
        at fr.sekaijin.jberet.Main_Subclass$$function$$4.apply(Unknown Source)
        at io.quarkus.arc.impl.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:53)
        at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.proceed(InvocationInterceptor.java:62)
        at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.monitor(InvocationInterceptor.java:51)
        at io.quarkus.arc.runtime.devconsole.InvocationInterceptor_Bean.intercept(Unknown Source)
        at io.quarkus.arc.impl.InterceptorInvocation.invoke(InterceptorInvocation.java:41)
        at io.quarkus.arc.impl.AroundInvokeInvocationContext.perform(AroundInvokeInvocationContext.java:40)
        at io.quarkus.arc.impl.InvocationContexts.performAroundInvoke(InvocationContexts.java:32)
        at fr.sekaijin.jberet.Main_Subclass.start(Unknown Source)
        at fr.sekaijin.jberet.Main.run(Main.java:34)
        at fr.sekaijin.jberet.Main_Subclass.run$$superforward1(Unknown Source)
        at fr.sekaijin.jberet.Main_Subclass$$function$$3.apply(Unknown Source)
        at io.quarkus.arc.impl.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:53)
        at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.proceed(InvocationInterceptor.java:62)
        at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.monitor(InvocationInterceptor.java:51)
        at io.quarkus.arc.runtime.devconsole.InvocationInterceptor_Bean.intercept(Unknown Source)
        at io.quarkus.arc.impl.InterceptorInvocation.invoke(InterceptorInvocation.java:41)
        at io.quarkus.arc.impl.AroundInvokeInvocationContext.perform(AroundInvokeInvocationContext.java:40)
        at io.quarkus.arc.impl.InvocationContexts.performAroundInvoke(InvocationContexts.java:32)
        at fr.sekaijin.jberet.Main_Subclass.run(Unknown Source)
        at fr.sekaijin.jberet.Main_ClientProxy.run(Unknown Source)
        at io.quarkus.runtime.ApplicationLifecycleManager.run(ApplicationLifecycleManager.java:124)
        at io.quarkus.runtime.Quarkus.run(Quarkus.java:67)
        at io.quarkus.runtime.Quarkus.run(Quarkus.java:41)
        at io.quarkus.runner.GeneratedMain.main(Unknown Source)
        ... 6 more
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make field private java.security.ProtectionDomain[] java.security.AccessControlContext.context accessible: module java.base does not "opens java.security" to unnamed module @298eb393
        at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:354)
        at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
        at java.base/java.lang.reflect.Field.checkCanSetAccessible(Field.java:178)
        at java.base/java.lang.reflect.Field.setAccessible(Field.java:172)
        at org.wildfly.security.manager.GetAccessibleDeclaredFieldAction.run(GetAccessibleDeclaredFieldAction.java:56)
        at org.wildfly.security.manager.GetAccessibleDeclaredFieldAction.run(GetAccessibleDeclaredFieldAction.java:34)
        at java.base/java.security.AccessController.doPrivileged(AccessController.java:318)
        at org.wildfly.security.manager.WildFlySecurityManager.<clinit>(WildFlySecurityManager.java:107)
        ... 42 more

Thanks.

Compilation error when there is a groovy script job

Hi,
I get the following compilation error when I have defined a Job with an external Groovy script in a step.

Job definition: helloGroovyJob.xml

<script src="groovy/HelloGroovyBachlet.groovy" />

Compilation error after executing mvnw package:
D:\desa\eclipse-dev\quarkus-picocli-example>mvnw package
[INFO] Scanning for projects...
[INFO]
[INFO] --------------< quarkus-examples:quarkus-picocli-example >--------------
[INFO] Building quarkus-picocli-example 0.0.1-SNAPSHOT
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- quarkus-maven-plugin:1.12.0.Final:generate-code (default) @ quarkus-picocli-example ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ quarkus-picocli-example ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.8.1:compile (default-compile) @ quarkus-picocli-example ---
[INFO] Changes detected - recompiling the module!
[INFO] Using Groovy-Eclipse compiler to compile both Java and Groovy files
[INFO]
[INFO] --- quarkus-maven-plugin:1.12.0.Final:generate-code-tests (default) @ quarkus-picocli-example ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ quarkus-picocli-example ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory D:\desa\eclipse-dev\quarkus-picocli-example\src\test\resources
[INFO]
[INFO] --- maven-compiler-plugin:3.8.1:testCompile (default-testCompile) @ quarkus-picocli-example ---
[INFO] Changes detected - recompiling the module!
[INFO] Using Groovy-Eclipse compiler to compile both Java and Groovy files
[INFO]
[INFO] --- maven-surefire-plugin:2.22.1:test (default-test) @ quarkus-picocli-example ---
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ quarkus-picocli-example ---
[INFO] Building jar: D:\desa\eclipse-dev\quarkus-picocli-example\target\quarkus-picocli-example-0.0.1-SNAPSHOT.jar
[INFO]
[INFO] --- quarkus-maven-plugin:1.12.0.Final:build (default) @ quarkus-picocli-example ---
[INFO] [org.jboss.threads] JBoss Threads version 3.2.0.Final
[INFO] [org.jberet] JBERET000030: Resolved job file:/D:/desa/eclipse-dev/quarkus-picocli-example/target/classes/META-INF/batch-jobs/helloGroovyJob.xml
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19.478 s
[INFO] Finished at: 2021-03-03T12:31:29+01:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal io.quarkus:quarkus-maven-plugin:1.12.0.Final:build (default) on project quarkus-picocli-example: Failed to build quarkus application: io.quarkus.builder.BuildException: Build failure: Build failed due to errors
[ERROR] [error]: Build step io.quarkus.deployment.steps.MainClassBuildStep#build threw an exception: java.lang.RuntimeException: Failed to record call to method public void io.quarkiverse.jberet.runtime.JBeretRecorder.registerJobs(java.util.List)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.writeBytecode(BytecodeRecorderImpl.java:452)
[ERROR] at io.quarkus.deployment.steps.MainClassBuildStep.writeRecordedBytecode(MainClassBuildStep.java:437)
[ERROR] at io.quarkus.deployment.steps.MainClassBuildStep.build(MainClassBuildStep.java:170)
[ERROR] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[ERROR] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[ERROR] at java.base/java.lang.reflect.Method.invoke(Method.java:566)
[ERROR] at io.quarkus.deployment.ExtensionLoader$2.execute(ExtensionLoader.java:920)
[ERROR] at io.quarkus.builder.BuildContext.run(BuildContext.java:277)
[ERROR] at org.jboss.threads.EnhancedQueueExecutor$Task.run(EnhancedQueueExecutor.java:2415)
[ERROR] at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1452)
[ERROR] at java.base/java.lang.Thread.run(Thread.java:834)
[ERROR] at org.jboss.threads.JBossThread.run(JBossThread.java:501)
[ERROR] Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: Cannot serialise field 'type' on object 'org.jberet.job.model.Script@3b6a1f53' as the property is read only
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadComplexObject(BytecodeRecorderImpl.java:1371)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadObjectInstanceImpl(BytecodeRecorderImpl.java:956)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadObjectInstance(BytecodeRecorderImpl.java:548)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadComplexObject(BytecodeRecorderImpl.java:1114)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadObjectInstanceImpl(BytecodeRecorderImpl.java:956)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadObjectInstance(BytecodeRecorderImpl.java:548)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.writeBytecode(BytecodeRecorderImpl.java:447)
[ERROR] ... 12 more
[ERROR] Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: Cannot serialise field 'type' on object 'org.jberet.job.model.Script@3b6a1f53' as the property is read only
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadComplexObject(BytecodeRecorderImpl.java:1371)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadObjectInstanceImpl(BytecodeRecorderImpl.java:956)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadObjectInstance(BytecodeRecorderImpl.java:548)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadComplexObject(BytecodeRecorderImpl.java:1114)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadObjectInstanceImpl(BytecodeRecorderImpl.java:956)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadObjectInstance(BytecodeRecorderImpl.java:548)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadComplexObject(BytecodeRecorderImpl.java:1336)
[ERROR] ... 18 more
[ERROR] Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: Cannot serialise field 'type' on object 'org.jberet.job.model.Script@3b6a1f53' as the property is read only
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadComplexObject(BytecodeRecorderImpl.java:1371)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadObjectInstanceImpl(BytecodeRecorderImpl.java:956)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadObjectInstance(BytecodeRecorderImpl.java:548)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadComplexObject(BytecodeRecorderImpl.java:1336)
[ERROR] ... 24 more
[ERROR] Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Cannot serialise field 'type' on object 'org.jberet.job.model.Script@3b6a1f53' as the property is read only
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadComplexObject(BytecodeRecorderImpl.java:1301)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadObjectInstanceImpl(BytecodeRecorderImpl.java:956)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadObjectInstance(BytecodeRecorderImpl.java:548)
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadComplexObject(BytecodeRecorderImpl.java:1336)
[ERROR] ... 27 more
[ERROR] Caused by: java.lang.RuntimeException: Cannot serialise field 'type' on object 'org.jberet.job.model.Script@3b6a1f53' as the property is read only
[ERROR] at io.quarkus.deployment.recording.BytecodeRecorderImpl.loadComplexObject(BytecodeRecorderImpl.java:1292)
[ERROR] ... 30 more
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

Environment:
JDK Version AdoptOpenJDK 11.0.8.10-hotspot
IDE: Eclipse 2020-12
maven: 3.6.3
OS: Windows 10 Pro

I have also tried putting type="groovy" in the script reference, but I get the same error.

Sincerely,
Fernando

Custom annotation dropped on classes implementing ItemReader, Writer, Processor, etc.

Created sample application to show the problem:

  • Uses JBeret 0.0.5 extension
  • Created a custom annotation (HostName)
  • Created interface TestInterface that extends ItemProcessor
  • Created class HostNameMMS that implements TestInterface and has HostName custom annotation plus Dependent and Named annotations
  • Created class ApplicationScopedTest which is annotated with @ApplicationScoped and observes StartupEvent. It also injects any TestInterface

In the init method of ApplicationScopedTest I am printing out the annotations of the injected class.
Only Dependent is printed; the HostName annotation is gone.

If I remove extends ItemProcessor and run the app again, all 3 annotations are printed.

Result:
With extends ItemProcessor
@javax.enterprise.context.Dependent()
2021-06-12 06:12:19,856 ERROR [org.acm.ApplicationScoredTest_Subclass] (Quarkus Main Thread) Annotation NOT found

Removing extends ItemProcessor
@javax.enterprise.context.Dependent()
@javax.inject.Named(value="")
@org.acme.HostName(name="MMS")
2021-06-12 06:13:37,107 INFO [org.acm.ApplicationScoredTest_Subclass] (Quarkus Main Thread) Annotation HostName found

This works OK up to Quarkus 1.12.2. The zip with the code points to 1.12.2; if you change to 1.13.0 (gradle.properties) or above, you will see the problem.
custom-annotation.zip

Hot reload `batch.xml`

Currently, batch.xml, the file that sets batch artifacts if you don't want to use CDI, is not monitored for changes. We need to add it.

Support ServiceLoader injection for JobXmlResolver in JBeretProcessor

Hi everyone,

at the moment it is not possible to replace the usage of MetaInfBatchJobsJobXmlResolver with a custom implementation within the JBeretProcessor, because it is instantiated directly inside the class.

Codepointer of relevance:
https://github.com/quarkiverse/quarkus-jberet/blob/main/core/deployment/src/main/java/io/quarkiverse/jberet/deployment/JBeretProcessor.java#L242

It would be nice to be able to register a custom implementation of the JobXmlResolver SPI, since the resolved XML files are cached after application startup.
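
For illustration, the requested support would allow something roughly like this (a sketch; the resolver class, resource path, and job name are hypothetical, and the JobXmlResolver method signatures should be double-checked against the jberet-core sources):

// src/main/resources/META-INF/services/org.jberet.spi.JobXmlResolver
// com.example.CustomJobXmlResolver

package com.example;

import java.io.IOException;
import java.io.InputStream;
import java.util.Collection;
import java.util.List;

import org.jberet.spi.JobXmlResolver;

public class CustomJobXmlResolver implements JobXmlResolver {

    // Resolve job XML content from a custom location instead of META-INF/batch-jobs
    @Override
    public InputStream resolveJobXml(final String jobXml, final ClassLoader classLoader) throws IOException {
        return classLoader.getResourceAsStream("jobs/" + jobXml);
    }

    // Advertise the job XML names this resolver knows about
    @Override
    public Collection<String> getJobXmlNames(final ClassLoader classLoader) {
        return List.of("custom-job.xml");
    }
}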

JBERET000651: The requested permits (2) is greater than the maximum number of permits (1) allowed in the thread pool.

We got this error on OpenShift 4.0 with Quarkus 1.11.0-Final and the latest quarkus-jberet:

org.jboss.threads.JBossThread.run(JBossThread.java:501)
: java.lang.IllegalStateException: JBERET000651: The requested permits (2) is greater than the maximum number of permits (1) allowed in the thread pool.

Is this something we can solve in application.properties, or in the OpenShift environment?

Backport #189 prior to Quarkus 3.0.0 Update

Would it be possible to release the changes from #189 as a patch for the version of this extension before 2.0.0? For example, a 1.2.2 release, as we need the change but have not migrated, and cannot yet migrate, to Quarkus 3.0.0.
Happy to support where possible.

Support named jberet repository definition in application.properties

I've set my persistence unit as a named persistence unit in application.properties (this can also be the case when using multiple persistence units).

e.g.

quarkus.datasource."mydb".db-kind=mariadb
quarkus.datasource."mydb".jdbc.url=${DATABASE_URL:jdbc:mariadb://127.0.0.1:3306/test}
quarkus.datasource."mydb".username=xxxxx
quarkus.datasource."mydb".password=xxxxx

quarkus.jberet.repository.type=JDBC

The above configuration gives the following error

2021-04-02 14:43:22,299 ERROR [io.qua.dep.dev.IsolatedDevModeMain] (main) Failed to start quarkus: java.lang.RuntimeException: io.quarkus.builder.BuildException: Build failure: Build failed due to errors
	[error]: Build step io.quarkiverse.jberet.deployment.JBeretProcessor#init threw an exception: io.quarkus.deployment.configuration.ConfigurationError: TODO

at io.quarkiverse.jberet.deployment.JBeretProcessor.validateRepository(JBeretProcessor.java:301)
at io.quarkiverse.jberet.deployment.JBeretProcessor.init(JBeretProcessor.java:176)

Environment
Quarkus: 1.12.2.Final
jberet: 0.0.5

Workaround
I'm working around this by adding a default persistence unit to my properties file

quarkus.datasource.db-kind=mariadb
quarkus.datasource.jdbc.url=${DATABASE_URL:jdbc:mariadb://127.0.0.1:3306/test}
quarkus.datasource.username=xxxxx
quarkus.datasource.password=xxxxx

Expected Behavior
We should be able to define the jberet repository when using a named persistence unit or in the case of multiple persistence units, e.g.:

quarkus.jberet.repository.name=mydb
quarkus.jberet.repository.type=JDBC

ItemReader/Writer can not be declared using producer methods

When I declare an ItemReader or ItemWriter using @Produces factory method:

    @Produces
    @Named("myReader")
    @Dependent
    CustomItemReader myReader() {
        return new ...;
    }

Quarkus JBeret is unable to create the Reader/Writer instance:

Caused by: java.lang.IllegalArgumentException: Type class org.acme.MyJobProvider is not a bean type of PRODUCER_METHOD bean [class=org.acme.MyJobProvider, id=005ac9d0e79901ee6b2aaacef5e95b54cc3a17f4]; its bean types are: [class java.lang.Object, interface jakarta.batch.api.chunk.ItemReader, class jakarta.batch.api.chunk.AbstractItemReader, class org.acme.CustomItemReader]
	at io.quarkus.arc.impl.BeanManagerImpl.getReference(BeanManagerImpl.java:60)
	at io.quarkiverse.jberet.runtime.QuarkusBatchEnvironment$QuarkusArtifactFactory.create(QuarkusBatchEnvironment.java:97)
	at org.jberet.creation.ArtifactFactoryWrapper.create(ArtifactFactoryWrapper.java:39)
	at org.jberet.runtime.context.JobContextImpl.createArtifact(JobContextImpl.java:195)
	... 16 more

In the attached reproducer project batch-provides.zip:

  • myJob / testRun uses @Produces factory method -> Job fails
  • myJob2 / testRun2 uses bean level class -> Job succeeds

In the QuarkusBatchEnvironment class:

        @Override
        public Object create(String ref, Class<?> cls, ClassLoader classLoader) {
            BeanManager bm = Arc.container().beanManager();
            Bean<?> bean = bm.resolve(bm.getBeans(aliases.getOrDefault(ref, ref)));
            return bean == null ? null : bm.getReference(bean, bean.getBeanClass(), bm.createCreationalContext(bean));
        }

bean.getBeanClass() contains the factory class (MyJobProvider here), not the target bean class (CustomItemReader here).

Start a job defined in Java (programmatic) using the REST API

Hi, I'd like to start jobs using the REST API (jberet-rest), but this does not work for programmatic jobs:

curl -X 'POST' \
  'http://localhost:8080/jobs/programaticJob/start' \
  -H 'Content-Type: application/json' \
  -d '{
        "myTestCase" : "SUCCESS"
  }'

16:45:19 WARN [or.jb.rest-api] (executor-thread-1) JBERET070500: Exception occurred when accessing JBeret Rest API:: jakarta.batch.operations.NoSuchJobException: Job with xml name programaticJob was not found

My class

import org.jberet.job.model.Job;
import org.jberet.job.model.JobBuilder;
import org.jberet.job.model.StepBuilder;

import jakarta.enterprise.context.Dependent;
import jakarta.enterprise.inject.Produces;
import jakarta.inject.Named;

@Dependent
public class ProgramaticJob {

        public static final String JOB_NAME = "programaticJob";

        @Produces
        @Named(JOB_NAME)
        public Job create() {

                // More examples:
                // https://jberet.gitbooks.io/jberet-user-guide/content/programmatic_job_definition_with_java/

                String batchlet1Name = "demoJobStep";

                return new JobBuilder(JOB_NAME)
                                .restartable(false)
                                .property("myTestCase", "SUCCESS")

                                .step(new StepBuilder("step-01")
                                                .batchlet(batchlet1Name, new String[] { "myTestCase", "SUCCESS" })
                                                .nextOn("*").to("step-02")
                                                .build())

                                .step(new StepBuilder("step-02")
                                                .batchlet(batchlet1Name, new String[] { "myTestCase", "DELAY" })
                                                .stopOn("STOP").restartFrom("step-01").exitStatus()
                                                .endOn("END").exitStatus()
                                                .failOn("FAIL").exitStatus()
                                                .build())

                                .build();
        }

}

Exception when injecting BatchClient

It's not possible to inject BatchClient as stated in the documentation. When trying to use:

@Inject
BatchClient batchClient;

I got the following error:

Caused by: javax.enterprise.inject.UnsatisfiedResolutionException: Unsatisfied dependency for type org.jberet.rest.client.BatchClient and qualifiers [@Default]
	- java member: com.vibertronic.GreetingResource#batchClient
	- declared on CLASS bean [types=[java.lang.Object, com.vibertronic.GreetingResource], qualifiers=[@Default, @Any], target=com.vibertronic.GreetingResource]
	at io.quarkus.arc.processor.Beans.resolveInjectionPoint(Beans.java:484)
	at io.quarkus.arc.processor.BeanInfo.init(BeanInfo.java:378)
	at io.quarkus.arc.processor.BeanDeployment.init(BeanDeployment.java:247)
	... 12 more

	at io.quarkus.builder.Execution.run(Execution.java:116)
	at io.quarkus.builder.BuildExecutionBuilder.execute(BuildExecutionBuilder.java:79)
	at io.quarkus.deployment.QuarkusAugmentor.run(QuarkusAugmentor.java:153)
	at io.quarkus.runner.bootstrap.AugmentActionImpl.runAugment(AugmentActionImpl.java:306)
	... 9 more
Caused by: javax.enterprise.inject.spi.DeploymentException: javax.enterprise.inject.UnsatisfiedResolutionException: Unsatisfied dependency for type org.jberet.rest.client.BatchClient and qualifiers [@Default]
	- java member: com.vibertronic.GreetingResource#batchClient
	- declared on CLASS bean [types=[java.lang.Object, com.vibertronic.GreetingResource], qualifiers=[@Default, @Any], target=com.vibertronic.GreetingResource]
	at io.quarkus.arc.processor.BeanDeployment.processErrors(BeanDeployment.java:1078)
	at io.quarkus.arc.processor.BeanDeployment.init(BeanDeployment.java:255)
	at io.quarkus.arc.processor.BeanProcessor.initialize(BeanProcessor.java:129)
	at io.quarkus.arc.deployment.ArcProcessor.validate(ArcProcessor.java:428)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at io.quarkus.deployment.ExtensionLoader$2.execute(ExtensionLoader.java:920)
	at io.quarkus.builder.BuildContext.run(BuildContext.java:277)
	at org.jboss.threads.EnhancedQueueExecutor$Task.run(EnhancedQueueExecutor.java:2415)
	at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1452)
	at java.base/java.lang.Thread.run(Thread.java:834)
	at org.jboss.threads.JBossThread.run(JBossThread.java:501)

Looking at the source code, JBeretRestProcessor does not add any additional beans the way JBeretProcessor.additionalBeans() does. Maybe there is a missing method in JBeretRestProcessor?:

    @BuildStep
    public void additionalBeans(
            BuildProducer<AdditionalBeanBuildItem> additionalBeans) {

        additionalBeans.produce(new AdditionalBeanBuildItem(JBeretRestProducer.class));
    }

quarkus-jberet in openshift pod

I went through the documentation but did not find any information. How is quarkus-jberet supposed to work when there is more than one pod running the same job (JBeret's scaling model is thread based, but OpenShift scalability recommends using pods), or if one pod dies and is restarted?

RequestScoped Context no longer works since the 2.20+ (Java 17/Quarkus 3.7+) Release

Prior to release 2.20, the RequestScoped context functioned as expected for an injected dependency on a Batchlet: the request context started when the job was started/triggered, was torn down when the job completed, and subsequent jobs running with the same dependency behaved the same way. However, since release 2.20, when a job runs with a Batchlet that injects a RequestScoped dependency, the following error occurs:

2024-06-19 16:26:02,644 WARN [org.jberet] (executor-thread-2) JBERET000001: Failed to run batchlet org.jberet.job.model.RefArtifact@429e1416: jakarta.enterprise.context.ContextNotActiveException: RequestScoped context was not active when trying to obtain a bean instance for a client proxy of CLASS bean [class={classHere}, id={idHere}] - you can activate the request context for a specific method using the @ActivateRequestContext interceptor binding

Sometimes the job will work the first time, and then subsequent calls to the job get the above error. It's fairly simple to recreate by injecting a RequestScoped dependency into any batch job and then running that job and using that dependency, as in the sketch below.
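
A minimal reproducer along these lines could look like this (a sketch; class names are hypothetical and the two classes would live in separate files):

import jakarta.batch.api.AbstractBatchlet;
import jakarta.enterprise.context.Dependent;
import jakarta.enterprise.context.RequestScoped;
import jakarta.inject.Inject;
import jakarta.inject.Named;

@RequestScoped
public class RequestScopedHelper {
    public String value() {
        return "hello";
    }
}

@Dependent
@Named
public class ScopedBatchlet extends AbstractBatchlet {

    @Inject
    RequestScopedHelper helper;

    @Override
    public String process() {
        // Per the report above, this call fails with ContextNotActiveException on recent releases
        return helper.value();
    }
}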

Transaction timeout not working

Hi

Using the latest version of quarkus-jberet and Quarkus 3.0.0.RC2, I'm having trouble with the transaction timeout.

I specify a timeout of 300 seconds using the step property "jakarta.transaction.global.timeout" (also tried with "javax.transaction.global.timeout"), but when I have a big chunk, the commit fails after 60 seconds:


JBERET000007: Failed to run job job, updateValues, org.jberet.job.model.Chunk@27a222f3: jakarta.transaction.RollbackException: ARJUNA016102: The transaction is not active! Uid is 0:ffffc0a82f3c:cbca:644a2fa7:18
at com.arjuna.ats.internal.jta.transaction.arjunacore.TransactionImple.commitAndDisassociate(TransactionImple.java:1285)
at com.arjuna.ats.internal.jta.transaction.arjunacore.BaseTransaction.commit(BaseTransaction.java:128)
at io.quarkus.narayana.jta.runtime.NotifyingTransactionManager.commit(NotifyingTransactionManager.java:70)
at org.jberet.runtime.runner.ChunkRunner.doCheckpoint(ChunkRunner.java:596)
at org.jberet.runtime.runner.ChunkRunner.readProcessWriteItems(ChunkRunner.java:358)
at org.jberet.runtime.runner.ChunkRunner.run(ChunkRunner.java:206)
at org.jberet.runtime.runner.StepExecutionRunner.runBatchletOrChunk(StepExecutionRunner.java:223)
at org.jberet.runtime.runner.StepExecutionRunner.run(StepExecutionRunner.java:142)
at org.jberet.runtime.runner.CompositeExecutionRunner.runStep(CompositeExecutionRunner.java:170)
at org.jberet.runtime.runner.CompositeExecutionRunner.runJobElement(CompositeExecutionRunner.java:134)
at org.jberet.runtime.runner.StepExecutionRunner.run(StepExecutionRunner.java:200)
at org.jberet.runtime.runner.CompositeExecutionRunner.runStep(CompositeExecutionRunner.java:170)
at org.jberet.runtime.runner.CompositeExecutionRunner.runJobElement(CompositeExecutionRunner.java:134)
at org.jberet.runtime.runner.StepExecutionRunner.run(StepExecutionRunner.java:200)
at org.jberet.runtime.runner.CompositeExecutionRunner.runStep(CompositeExecutionRunner.java:170)
at org.jberet.runtime.runner.CompositeExecutionRunner.runJobElement(CompositeExecutionRunner.java:134)
at org.jberet.runtime.runner.StepExecutionRunner.run(StepExecutionRunner.java:200)
at org.jberet.runtime.runner.CompositeExecutionRunner.runStep(CompositeExecutionRunner.java:170)
at org.jberet.runtime.runner.CompositeExecutionRunner.runJobElement(CompositeExecutionRunner.java:134)
at org.jberet.runtime.runner.StepExecutionRunner.run(StepExecutionRunner.java:200)
at org.jberet.runtime.runner.CompositeExecutionRunner.runStep(CompositeExecutionRunner.java:170)
at org.jberet.runtime.runner.CompositeExecutionRunner.runFromHeadOrRestartPoint(CompositeExecutionRunner.java:94)
at org.jberet.runtime.runner.JobExecutionRunner.run(JobExecutionRunner.java:58)
at org.jberet.spi.JobExecutor$1.run(JobExecutor.java:100)
at io.smallrye.context.impl.wrappers.SlowContextualRunnable.run(SlowContextualRunnable.java:19)
at io.quarkus.vertx.core.runtime.VertxCoreRecorder$14.runWith(VertxCoreRecorder.java:576)
at org.jboss.threads.EnhancedQueueExecutor$Task.run(EnhancedQueueExecutor.java:2513)
at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1538)
at org.jboss.threads.DelegatingRunnable.run(DelegatingRunnable.java:29)
at org.jboss.threads.ThreadLocalResettingRunnable.run(ThreadLocalResettingRunnable.java:29)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Thread.java:833)

The default timeout for Quarkus-managed transactions is well known to be 60 seconds.

Is there a problem with how the transaction timeout is handled/overridden?

Thank you!
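For reference, a minimal sketch of where such a step-level property could be declared with JBeret's Java job-builder API. The job name, step name, and chunk artifact refs are hypothetical, and the property(), reader(), and writer() builder methods are assumed here from that API; in a job XML definition, the same property goes in a <property> element inside the step's <properties>:

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.inject.Produces;
import jakarta.inject.Named;
import org.jberet.job.model.Job;
import org.jberet.job.model.JobBuilder;
import org.jberet.job.model.StepBuilder;

@ApplicationScoped
public class TimeoutJobProducer {

    @Produces
    @Named
    public Job updateValuesJob() {
        return new JobBuilder("updateValues")
                .step(new StepBuilder("chunkStep")
                        // Step-level property: 300 seconds instead of the 60-second default.
                        .property("jakarta.transaction.global.timeout", "300")
                        .reader("itemReader")   // hypothetical chunk artifacts
                        .writer("itemWriter")
                        .build())
                .build();
    }
}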

Is it possible to resolve JBeret "quarkified" artifacts with the Universe BOM only?

In Gradle, I usually only have to declare implementation(enforcedPlatform("io.quarkus:quarkus-universe-bom:3.X.Y")) in order to resolve all Quarkus dependencies released with the Quarkus version in question.

This doesn't work with io.quarkiverse.jberet:quarkus-jberet.

Would it be possible to make the JBeret "quarkified" releases resolvable this way?

StepListener cannot be declared using producer methods

When I declare a StepListener using a @Produces factory method:

@Produces
@Named
@Dependent
CustomStepListener myListener() {
    return new CustomStepListener();
}

Quarkus JBeret is unable to use the listener.

This is linked to issue #223.

In the QuarkusBatchEnvironment class:

@Override
public Class<?> getArtifactClass(String ref, ClassLoader classLoader) {
    BeanManager bm = Arc.container().beanManager();
    Bean<?> bean = bm.resolve(bm.getBeans(aliases.getOrDefault(ref, ref)));
    return bean == null ? null : bean.getBeanClass();
}

bean.getBeanClass() returns the factory (producer-declaring) class, not the target bean class (CustomStepListener here). This class is then used in JBeret's StepExecutionRunner to register the corresponding StepListener:

            final Class<?> cls = jobContext.getArtifactClass(ref);

            //a class can implement multiple listener interfaces, so need to check it against all listener types
            //even after previous matches
            if (StepListener.class.isAssignableFrom(cls)) {
                final Object o = jobContext.createArtifact(ref, null, listener.getProperties(), batchContext);
                stepListeners.add((StepListener) o);
            }

Support @BatchProperty of type Enum

Given a job property injected as an enum type named BatchMode:

@BatchProperty(name = "batchMode") BatchMode batchMode

I get an injection error

Caused by: java.lang.RuntimeException: io.quarkus.builder.BuildException: Build failure: Build failed due to errors
	[error]: Build step io.quarkus.arc.deployment.ArcProcessor#validate threw an exception: jakarta.enterprise.inject.spi.DeploymentException: jakarta.enterprise.inject.UnsatisfiedResolutionException: Unsatisfied dependency for type com.mycompany.myapp.batch.BatchMode and qualifiers [@BatchProperty(name = "batchMode")]
	- java member: com.mycompany.myapp.batch.user.UserDqlReader():batchMode
	- declared on CLASS bean [types=[jakarta.batch.api.chunk.ItemReader, com.mycompany.myapp.batch.DqlItemReader<com.mycompany.myapp.model.User>, jakarta.batch.api.chunk.AbstractItemReader, com.mycompany.myapp.batch.user.UserDqlReader, java.lang.Object], qualifiers=[@Default, @Any, @Named("UserDqlReader")], target=com.mycompany.myapp.batch.user.UserDqlReader]
	at io.quarkus.arc.processor.BeanDeployment.processErrors(BeanDeployment.java:1435)
	at io.quarkus.arc.processor.BeanDeployment.init(BeanDeployment.java:310)
	at io.quarkus.arc.processor.BeanProcessor.initialize(BeanProcessor.java:155)
	at io.quarkus.arc.deployment.ArcProcessor.validate(ArcProcessor.java:469)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at io.quarkus.deployment.ExtensionLoader$3.execute(ExtensionLoader.java:864)
	at io.quarkus.builder.BuildContext.run(BuildContext.java:282)
	at org.jboss.threads.ContextHandler$1.runWith(ContextHandler.java:18)
	at org.jboss.threads.EnhancedQueueExecutor$Task.run(EnhancedQueueExecutor.java:2513)
	at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1538)
	at java.base/java.lang.Thread.run(Thread.java:833)
	at org.jboss.threads.JBossThread.run(JBossThread.java:501)
Caused by: jakarta.enterprise.inject.UnsatisfiedResolutionException: Unsatisfied dependency for type com.mycompany.myapp.batch.BatchMode and qualifiers [@BatchProperty(name = "batchMode")]
	- java member: com.mycompany.myapp.batch.user.UserDqlReader():batchMode
	- declared on CLASS bean [types=[jakarta.batch.api.chunk.ItemReader, com.mycompany.myapp.batch.DqlItemReader<com.mycompany.myapp.model.User>, jakarta.batch.api.chunk.AbstractItemReader, com.mycompany.myapp.batch.user.UserDqlReader, java.lang.Object], qualifiers=[@Default, @Any, @Named("UserDqlReader")], target=com.mycompany.myapp.batch.user.UserDqlReader]
	at io.quarkus.arc.processor.Beans.resolveInjectionPoint(Beans.java:477)
	at io.quarkus.arc.processor.BeanInfo.init(BeanInfo.java:624)
	at io.quarkus.arc.processor.BeanDeployment.init(BeanDeployment.java:298)
	... 13 more

Even though it's not described in the Jakarta Batch 2.1 spec, JBeret has a ValueConverter that should be able to do the String-to-enum conversion:
https://github.com/jberet/jsr352/blob/main/jberet-core/src/main/java/org/jberet/creation/ValueConverter.java
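Until that conversion is wired up, a possible workaround is to inject the property as a String and convert it manually. A minimal sketch, in which the reader class and the BatchMode values are hypothetical:

import java.io.Serializable;

import jakarta.batch.api.BatchProperty;
import jakarta.batch.api.chunk.AbstractItemReader;
import jakarta.enterprise.context.Dependent;
import jakarta.inject.Inject;
import jakarta.inject.Named;

@Dependent
@Named
public class EnumPropertyReader extends AbstractItemReader {

    // Hypothetical enum; the real BatchMode values are application-specific.
    enum BatchMode { FULL, INCREMENTAL }

    // Injected as a String, which @BatchProperty supports today.
    @Inject
    @BatchProperty(name = "batchMode")
    String batchModeValue;

    private BatchMode batchMode;

    @Override
    public void open(Serializable checkpoint) {
        // Manual String-to-enum conversion until enum injection is supported.
        batchMode = BatchMode.valueOf(batchModeValue);
    }

    @Override
    public Object readItem() {
        // Reading logic omitted; this sketch only shows the property conversion.
        return null;
    }
}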

Question: Way to limit the number of parallel running Jobs

I am developing a Quarkus application that gathers data from users, who can then request a report of that data. I want to use JBeret for the report generation, but I also want to limit the number of jobs running in parallel, so that they do not interfere with the data gathering and full functionality is preserved.

What would be the way to achieve this behaviour with JBeret/this extension?

If it requires code changes I would be happy to contribute given some guidance.

Unable to activate batch custom scopes

I'm trying to activate a bean annotated with @JobScoped, but the annotation is ignored during the deployment phase (the bean is not registered at all):

@JobScoped
@Named
public class MyDTO {...}

@Dependent
@Named
public class MyBatchlet implements Batchlet {
  @Inject MyDTO jobScopedInstance;
}

This results in a deployment error due to an unresolvable injection point.
Using a @Produces producer gets past the deployment phase, but at runtime jobScopedField is managed as a @Dependent-scoped bean:

public class ObjectProducer
{
	@Produces
	@JobScoped
	@Named("jobscoped")
	MyDTO jobScopedField = new MyDTO();
}

JBeret registers its custom scopes through the service loader via BatchCDIExtension; I suppose the same should be done during the deployment phase.
I attached a reproducer; remove the comments from the ObjectProducer class to let the (failing) test run.

Best regards
Luca Basso Ricci
quarkus-jberet-scope-problem.zip

Add a <description> to the runtime artifact's pom.xml

The generated metadata reuses the <description> from the parent POM. It should have its own:

    {
      "name": "JBeret - Batch Processing",
      "description": "Parent POM for Quarkiverse projects that includes the default release and artifact publishing related\n    configuration",
      "metadata": {
        "keywords": [
          "batch",
          "jberet",
          "jsr352"
        ],
        "guide": "https://quarkus.io/guides/jberet",
        "categories": [
          "data"
        ],
        "status": "experimental",
        "built-with-quarkus-core": "1.13.0.Final"
      }
    }

Starting a job defined using a Java job definition and a @Named bean

Given a job defined using the Java DSL (instead of the XML DSL):

  @Produces
  @Named("myJob")
  public Job job() {
    return new JobBuilder("myJob")....build();
  }

I can launch it using the public long start(final Job jobDefined, final Properties jobParameters) method, but that requires me to inject or resolve the Job bean instance (as sketched after the list below).
I would like public long start(String jobXMLName, Properties jobParameters, String user) to use the jobXMLName to first look for the Job in the XML configuration and then look for it among CDI-managed beans of type Job annotated with @Named.
To avoid breaking the JBatch start method contract, alternatives could be:

  • to add a Job getJob(String jobName) in QuarkusJobOperator
  • to be able to inject all named Jobs in a Map: @Inject @All Map<String, Job> namedJobs
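A sketch of the current workaround mentioned above: resolve the @Named Job bean and pass it to the start(Job, Properties) overload. The cast assumes the injected JobOperator bean is (or unwraps to) the extension's QuarkusJobOperator, and the import location of that class is an assumption of this sketch:

import java.util.Properties;

import jakarta.batch.operations.JobOperator;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import org.jberet.job.model.Job;

import io.quarkiverse.jberet.runtime.QuarkusJobOperator; // assumed package/location

@ApplicationScoped
public class MyJobLauncher {

    // Resolves the producer-defined Job bean shown above.
    @Inject
    @Named("myJob")
    Job myJob;

    @Inject
    JobOperator jobOperator;

    public long launch() {
        // start(Job, Properties) is the overload mentioned in the report;
        // the cast assumes the injected bean is a QuarkusJobOperator.
        return ((QuarkusJobOperator) jobOperator).start(myJob, new Properties());
    }
}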
