In this post, we will show you how to configure a Spring Batch job to read data from a MongoDB database and write it to an XML file.
Project structure
This is the directory structure of a standard Gradle project.
Project dependencies
buildscript {
    repositories {
        mavenLocal()
        jcenter()
    }
    dependencies {
        classpath "org.springframework.boot:spring-boot-gradle-plugin:1.5.2.RELEASE"
    }
}

apply plugin: 'java'
apply plugin: 'eclipse'
apply plugin: 'org.springframework.boot'

sourceCompatibility = 1.8

repositories {
    mavenLocal()
    mavenCentral()
}

dependencies {
    compile 'org.springframework:spring-oxm:4.3.7.RELEASE'
    compile 'org.springframework.data:spring-data-mongodb:1.9.8.RELEASE'
    compileOnly 'org.projectlombok:lombok:1.16.12'
    compile 'org.springframework.boot:spring-boot-starter-batch:1.5.2.RELEASE'
    testCompile 'org.springframework.boot:spring-boot-starter-test:1.5.2.RELEASE'
}

task wrapper(type: Wrapper) {
    gradleVersion = '3.2.1'
}
application.properties file
spring.data.mongodb.host=127.0.0.1
spring.data.mongodb.port=27017
spring.data.mongodb.database=springbatch
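The job expects a report collection in the springbatch database. One way to seed it from the mongo shell is sketched below; the field names follow the Report model used in this post, and the sample values mirror the output shown at the end. Note that Spring Data MongoDB stores BigDecimal values as strings by default, which is why earning is quoted.

```
// Run inside the mongo shell (sketch; sample data only)
use springbatch
db.report.insert([
  { _id: 1, date: ISODate("2017-03-27T18:30:00Z"), impression: NumberLong(139237), clicks: 50, earning: "220.90" },
  { _id: 2, date: ISODate("2017-03-28T18:30:00Z"), impression: NumberLong(339100), clicks: 60, earning: "320.88" },
  { _id: 3, date: ISODate("2017-03-29T18:30:00Z"), impression: NumberLong(431436), clicks: 86, earning: "270.80" }
])
```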
Spring Batch Jobs
Create a job that reads from MongoDB and writes to an XML file.
package com.walking.techie.mongotoxml.jobs;
import com.walking.techie.mongotoxml.model.Report;
import java.util.HashMap;
import java.util.Map;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.data.MongoItemReader;
import org.springframework.batch.item.xml.StaxEventItemWriter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.oxm.xstream.XStreamMarshaller;
@Configuration
@EnableBatchProcessing
public class ReadFromDB {

  @Autowired
  private JobBuilderFactory jobBuilderFactory;

  @Autowired
  private StepBuilderFactory stepBuilderFactory;

  @Autowired
  private MongoTemplate mongoTemplate;

  @Bean
  public Job readReport() throws Exception {
    return jobBuilderFactory.get("readReport").flow(step1()).end().build();
  }

  @Bean
  public Step step1() throws Exception {
    return stepBuilderFactory.get("step1").<Report, Report>chunk(10).reader(reader())
        .writer(writer()).build();
  }

  // Reads every document from the report collection, newest first.
  @Bean
  public MongoItemReader<Report> reader() {
    MongoItemReader<Report> reader = new MongoItemReader<>();
    reader.setTemplate(mongoTemplate);
    reader.setSort(new HashMap<String, Sort.Direction>() {{
      put("_id", Direction.DESC);
    }});
    reader.setTargetType(Report.class);
    reader.setQuery("{}");
    return reader;
  }

  // Writes each Report item as a <report> element under the root tag.
  @Bean
  public StaxEventItemWriter<Report> writer() {
    StaxEventItemWriter<Report> writer = new StaxEventItemWriter<>();
    writer.setResource(new FileSystemResource("xml/mongo.xml"));
    writer.setMarshaller(reportMarshaller());
    writer.setRootTagName("report");
    return writer;
  }

  // Maps the Report class to the <report> element name during marshalling.
  @Bean
  public XStreamMarshaller reportMarshaller() {
    XStreamMarshaller marshaller = new XStreamMarshaller();
    Map<String, Class<?>> aliases = new HashMap<>();
    aliases.put("report", Report.class);
    marshaller.setAliases(aliases);
    return marshaller;
  }
}
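Note that reader() fetches the entire collection because of setQuery("{}"). MongoItemReader accepts any Mongo JSON query string, so the job can be restricted to a subset of documents. A hypothetical filter (a fragment for illustration, not part of the job above) might look like this:

```
// Hypothetical: only read reports with more than 50 clicks
reader.setQuery("{ \"clicks\" : { \"$gt\" : 50 } }");
```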
This job maps each record from the MongoDB report collection to a Report object and writes it to the XML file.
A Java model class
package com.walking.techie.mongotoxml.model;
import java.math.BigDecimal;
import java.util.Date;
import lombok.Data;
import org.springframework.data.mongodb.core.mapping.Document;
@Data
@Document(collection = "report")
public class Report {

  private int id;
  private Date date;
  private long impression;
  private int clicks;
  private BigDecimal earning;
}
Run Application
package com.walking.techie;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;
@SpringBootApplication(exclude = DataSourceAutoConfiguration.class)
public class Application {

  public static void main(String[] args) {
    SpringApplication.run(Application.class, args);
  }
}
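With the Spring Boot Gradle plugin applied, the application can be launched from the project root (assuming MongoDB is running on 127.0.0.1:27017 as configured above); the batch job runs automatically at startup.

```
# Launches the Spring Boot application, which runs the readReport job
gradle bootRun
```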
Output
The output of the application is stored in xml/mongo.xml. Below is the formatted output.
<?xml version="1.0" encoding="UTF-8"?>
<report>
  <report>
    <id>3</id>
    <date>2017-03-29 18:30:00.0 UTC</date>
    <impression>431436</impression>
    <clicks>86</clicks>
    <earning>270.80</earning>
  </report>
  <report>
    <id>2</id>
    <date>2017-03-28 18:30:00.0 UTC</date>
    <impression>339100</impression>
    <clicks>60</clicks>
    <earning>320.88</earning>
  </report>
  <report>
    <id>1</id>
    <date>2017-03-27 18:30:00.0 UTC</date>
    <impression>139237</impression>
    <clicks>50</clicks>
    <earning>220.90</earning>
  </report>
</report>
Console output
2017-03-29 13:02:57.916 INFO 37073 --- [ main] o.s.b.c.l.support.SimpleJobLauncher : Job: [FlowJob: [name=readReport]] launched with the following parameters: [{}]
2017-03-29 13:02:57.935 INFO 37073 --- [ main] o.s.batch.core.job.SimpleStepHandler : Executing step: [step1]
2017-03-29 13:02:58.003 INFO 37073 --- [ main] org.mongodb.driver.connection : Opened connection [connectionId{localValue:2, serverValue:394}] to 127.0.0.1:27017
2017-03-29 13:02:58.075 INFO 37073 --- [ main] o.s.b.c.l.support.SimpleJobLauncher : Job: [FlowJob: [name=readReport]] completed with the following parameters: [{}] and the following status: [COMPLETED]
Note: This code was compiled and run on a Mac notebook with IntelliJ IDEA.