Spring Boot application shuts down after the MapReduce job completes, with no error

Asked by ffscu2ro on 2021-06-02 in Hadoop

I'm building a portfolio project with Spring Boot + Hadoop. Exactly as the title says: the Spring Boot application shuts down after the MapReduce job completes, with no error message at all. I ran into a similar shutdown once before while writing a Jsoup crawler and managed to fix it by trial and error, but this time the same approach hasn't worked.
Machine and software versions:
- MacBook Pro (macOS High Sierra)
- Hadoop 2.7.3
- Spring Boot 2.0.0
- spring-data-hadoop-boot 2.5.0
Here is the console log:

] o.s.b.w.servlet.ServletRegistrationBean  : Servlet dispatcherServlet mapped to [/]
2018-05-08 15:58:31.701  INFO 59820 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean   : Mapping filter: 'characterEncodingFilter' to: [/*]
2018-05-08 15:58:31.702  INFO 59820 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean   : Mapping filter: 'hiddenHttpMethodFilter' to: [/*]
2018-05-08 15:58:31.702  INFO 59820 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean   : Mapping filter: 'httpPutFormContentFilter' to: [/*]
2018-05-08 15:58:31.702  INFO 59820 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean   : Mapping filter: 'requestContextFilter' to: [/*]
2018-05-08 15:58:31.969  INFO 59820 --- [  restartedMain] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Starting...
2018-05-08 15:58:32.444  INFO 59820 --- [  restartedMain] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Start completed.
2018-05-08 15:58:32.482  INFO 59820 --- [  restartedMain] j.LocalContainerEntityManagerFactoryBean : Building JPA container EntityManagerFactory for persistence unit 'default'
2018-05-08 15:58:32.496  INFO 59820 --- [  restartedMain] o.hibernate.jpa.internal.util.LogHelper  : HHH000204: Processing PersistenceUnitInfo [
    name: default
    ...]
2018-05-08 15:58:32.582  INFO 59820 --- [  restartedMain] org.hibernate.Version                    : HHH000412: Hibernate Core {5.2.14.Final}
2018-05-08 15:58:32.584  INFO 59820 --- [  restartedMain] org.hibernate.cfg.Environment            : HHH000206: hibernate.properties not found
2018-05-08 15:58:32.629  INFO 59820 --- [  restartedMain] o.hibernate.annotations.common.Version   : HCANN000001: Hibernate Commons Annotations {5.0.1.Final}
2018-05-08 15:58:32.714  INFO 59820 --- [  restartedMain] org.hibernate.dialect.Dialect            : HHH000400: Using dialect: org.hibernate.dialect.MySQL5InnoDBDialect
2018-05-08 15:58:33.131  INFO 59820 --- [  restartedMain] j.LocalContainerEntityManagerFactoryBean : Initialized JPA EntityManagerFactory for persistence unit 'default'
2018-05-08 15:58:33.448  INFO 59820 --- [  restartedMain] o.h.h.i.QueryTranslatorFactoryInitiator  : HHH000397: Using ASTQueryTranslatorFactory
2018-05-08 15:58:34.226  INFO 59820 --- [  restartedMain] s.w.s.m.m.a.RequestMappingHandlerAdapter : Looking for @ControllerAdvice: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@2acb939f: startup date [Tue May 08 15:58:29 KST 2018]; root of context hierarchy
2018-05-08 15:58:34.276  WARN 59820 --- [  restartedMain] aWebConfiguration$JpaWebMvcConfiguration : spring.jpa.open-in-view is enabled by default. Therefore, database queries may be performed during view rendering. Explicitly configure spring.jpa.open-in-view to disable this warning
2018-05-08 15:58:34.333  INFO 59820 --- [  restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/dc/list],methods=[GET]}" onto public void com.logan.controller.DcController.list(com.logan.domain.Dc_base,org.springframework.ui.Model,org.springframework.data.domain.Pageable)
2018-05-08 15:58:34.334  INFO 59820 --- [  restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/dc/view],methods=[GET]}" onto public void com.logan.controller.DcController.view(java.lang.Long,com.logan.domain.Dc_base,org.springframework.ui.Model)
2018-05-08 15:58:34.335  INFO 59820 --- [  restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/dc/crawl],methods=[GET]}" onto public java.lang.String com.logan.controller.DcController.crawl() throws java.io.IOException
2018-05-08 15:58:34.335  INFO 59820 --- [  restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/dc/hadoop],methods=[GET]}" onto public void com.logan.controller.DcController.hadoop() throws java.lang.Exception
2018-05-08 15:58:34.342  INFO 59820 --- [  restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/error]}" onto public org.springframework.http.ResponseEntity<java.util.Map<java.lang.String, java.lang.Object>> org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.error(javax.servlet.http.HttpServletRequest)
2018-05-08 15:58:34.344  INFO 59820 --- [  restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/error],produces=[text/html]}" onto public org.springframework.web.servlet.ModelAndView org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.errorHtml(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse)
2018-05-08 15:58:34.422  INFO 59820 --- [  restartedMain] o.s.w.s.handler.SimpleUrlHandlerMapping  : Mapped URL path [/webjars/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]
2018-05-08 15:58:34.422  INFO 59820 --- [  restartedMain] o.s.w.s.handler.SimpleUrlHandlerMapping  : Mapped URL path [/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]
2018-05-08 15:58:34.516  INFO 59820 --- [  restartedMain] o.s.w.s.handler.SimpleUrlHandlerMapping  : Mapped URL path [/**/favicon.ico] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]
2018-05-08 15:58:35.347  INFO 59820 --- [  restartedMain] o.s.b.d.a.OptionalLiveReloadServer       : LiveReload server is running on port 35729
2018-05-08 15:58:35.410  INFO 59820 --- [  restartedMain] o.s.d.h.c.a.c.SpringHadoopConfiguration  : Building configuration for bean 'hadoopConfiguration'
2018-05-08 15:58:35.667  WARN 59820 --- [  restartedMain] org.apache.hadoop.util.NativeCodeLoader  : Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-05-08 15:58:35.880  INFO 59820 --- [  restartedMain] org.apache.hadoop.fs.TrashPolicyDefault  : Namenode trash configuration: Deletion interval = 0 minutes, Emptier interval = 0 minutes.
2018-05-08 15:58:35.968  INFO 59820 --- [  restartedMain] o.s.j.e.a.AnnotationMBeanExporter        : Registering beans for JMX exposure on startup
2018-05-08 15:58:35.969  INFO 59820 --- [  restartedMain] o.s.j.e.a.AnnotationMBeanExporter        : Bean with name 'dataSource' has been autodetected for JMX exposure
2018-05-08 15:58:35.977  INFO 59820 --- [  restartedMain] o.s.j.e.a.AnnotationMBeanExporter        : Located MBean 'dataSource': registering with JMX server as MBean [com.zaxxer.hikari:name=dataSource,type=HikariDataSource]
2018-05-08 15:58:35.983  INFO 59820 --- [  restartedMain] o.s.c.support.DefaultLifecycleProcessor  : Starting beans in phase 0
2018-05-08 15:58:36.046  INFO 59820 --- [  restartedMain] o.s.b.w.embedded.tomcat.TomcatWebServer  : Tomcat started on port(s): 8080 (http) with context path ''
2018-05-08 15:58:36.064  INFO 59820 --- [  restartedMain] com.logan.Portfolio1Application          : Started Portfolio1Application in 7.61 seconds (JVM running for 8.502)
2018-05-08 15:58:45.737  INFO 59820 --- [nio-8080-exec-1] o.a.c.c.C.[Tomcat].[localhost].[/]       : Initializing Spring FrameworkServlet 'dispatcherServlet'
2018-05-08 15:58:45.737  INFO 59820 --- [nio-8080-exec-1] o.s.web.servlet.DispatcherServlet        : FrameworkServlet 'dispatcherServlet': initialization started
2018-05-08 15:58:45.755  INFO 59820 --- [nio-8080-exec-1] o.s.web.servlet.DispatcherServlet        : FrameworkServlet 'dispatcherServlet': initialization completed in 18 ms
2018-05-08 15:58:46.544  INFO 59820 --- [nio-8080-exec-1] o.a.h.conf.Configuration.deprecation     : session.id is deprecated. Instead, use dfs.metrics.session-id
2018-05-08 15:58:46.545  INFO 59820 --- [nio-8080-exec-1] o.apache.hadoop.metrics.jvm.JvmMetrics   : Initializing JVM Metrics with processName=JobTracker, sessionId=
2018-05-08 15:58:46.680  WARN 59820 --- [nio-8080-exec-1] o.a.h.mapreduce.JobResourceUploader      : Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2018-05-08 15:58:46.692  WARN 59820 --- [nio-8080-exec-1] o.a.h.mapreduce.JobResourceUploader      : No job jar file set.  User classes may not be found. See Job or Job#setJar(String).
2018-05-08 15:58:46.706  INFO 59820 --- [nio-8080-exec-1] o.a.h.m.lib.input.FileInputFormat        : Total input paths to process : 1
2018-05-08 15:58:46.835  INFO 59820 --- [nio-8080-exec-1] o.apache.hadoop.mapreduce.JobSubmitter   : number of splits:1
2018-05-08 15:58:46.935  INFO 59820 --- [nio-8080-exec-1] o.apache.hadoop.mapreduce.JobSubmitter   : Submitting tokens for job: job_local1276007486_0001
2018-05-08 15:58:47.082  INFO 59820 --- [nio-8080-exec-1] org.apache.hadoop.mapreduce.Job          : The url to track the job: http://localhost:8080/
2018-05-08 15:58:47.083  INFO 59820 --- [nio-8080-exec-1] org.apache.hadoop.mapreduce.Job          : Running job: job_local1276007486_0001
2018-05-08 15:58:47.085  INFO 59820 --- [      Thread-33] org.apache.hadoop.mapred.LocalJobRunner  : OutputCommitter set in config null
2018-05-08 15:58:47.089  INFO 59820 --- [      Thread-33] o.a.h.m.lib.output.FileOutputCommitter   : File Output Committer Algorithm version is 1
2018-05-08 15:58:47.091  INFO 59820 --- [      Thread-33] org.apache.hadoop.mapred.LocalJobRunner  : OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
2018-05-08 15:58:47.128  INFO 59820 --- [      Thread-33] org.apache.hadoop.mapred.LocalJobRunner  : Waiting for map tasks
2018-05-08 15:58:47.129  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.LocalJobRunner  : Starting task: attempt_local1276007486_0001_m_000000_0
2018-05-08 15:58:47.146  INFO 59820 --- [ask Executor #0] o.a.h.m.lib.output.FileOutputCommitter   : File Output Committer Algorithm version is 1
2018-05-08 15:58:47.150  INFO 59820 --- [ask Executor #0] o.a.h.yarn.util.ProcfsBasedProcessTree   : ProcfsBasedProcessTree currently is supported only on Linux.
2018-05-08 15:58:47.151  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.Task            :  Using ResourceCalculatorProcessTree : null
2018-05-08 15:58:47.155  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.MapTask         : Processing split: hdfs://localhost:9000/user/Logan/dc_in/dc_base.csv:0+2715710
2018-05-08 15:58:47.210  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.MapTask         : (EQUATOR) 0 kvi 26214396(104857584)
2018-05-08 15:58:47.210  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.MapTask         : mapreduce.task.io.sort.mb: 100
2018-05-08 15:58:47.210  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.MapTask         : soft limit at 83886080
2018-05-08 15:58:47.210  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.MapTask         : bufstart = 0; bufvoid = 104857600
2018-05-08 15:58:47.210  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.MapTask         : kvstart = 26214396; length = 6553600
2018-05-08 15:58:47.213  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.MapTask         : Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
2018-05-08 15:58:47.441  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.LocalJobRunner  : 
2018-05-08 15:58:47.443  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.MapTask         : Starting flush of map output
2018-05-08 15:58:47.443  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.MapTask         : Spilling map output
2018-05-08 15:58:47.443  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.MapTask         : bufstart = 0; bufend = 3249391; bufvoid = 104857600
2018-05-08 15:58:47.443  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.MapTask         : kvstart = 26214396(104857584); kvend = 25677028(102708112); length = 537369/6553600
2018-05-08 15:58:48.089  INFO 59820 --- [nio-8080-exec-1] org.apache.hadoop.mapreduce.Job          : Job job_local1276007486_0001 running in uber mode : false
2018-05-08 15:58:48.091  INFO 59820 --- [nio-8080-exec-1] org.apache.hadoop.mapreduce.Job          :  map 0% reduce 0%
2018-05-08 15:58:48.143  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.MapTask         : Finished spill 0
2018-05-08 15:58:48.148  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.Task            : Task:attempt_local1276007486_0001_m_000000_0 is done. And is in the process of committing
2018-05-08 15:58:48.160  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.LocalJobRunner  : map
2018-05-08 15:58:48.161  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.Task            : Task 'attempt_local1276007486_0001_m_000000_0' done.
2018-05-08 15:58:48.161  INFO 59820 --- [ask Executor #0] org.apache.hadoop.mapred.LocalJobRunner  : Finishing task: attempt_local1276007486_0001_m_000000_0
2018-05-08 15:58:48.161  INFO 59820 --- [      Thread-33] org.apache.hadoop.mapred.LocalJobRunner  : map task executor complete.
2018-05-08 15:58:48.162  INFO 59820 --- [      Thread-33] org.apache.hadoop.mapred.LocalJobRunner  : Waiting for reduce tasks
2018-05-08 15:58:48.163  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.LocalJobRunner  : Starting task: attempt_local1276007486_0001_r_000000_0
2018-05-08 15:58:48.169  INFO 59820 --- [pool-7-thread-1] o.a.h.m.lib.output.FileOutputCommitter   : File Output Committer Algorithm version is 1
2018-05-08 15:58:48.170  INFO 59820 --- [pool-7-thread-1] o.a.h.yarn.util.ProcfsBasedProcessTree   : ProcfsBasedProcessTree currently is supported only on Linux.
2018-05-08 15:58:48.170  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.Task            :  Using ResourceCalculatorProcessTree : null
2018-05-08 15:58:48.172  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.ReduceTask      : Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@2c0f65dc
2018-05-08 15:58:48.180  INFO 59820 --- [pool-7-thread-1] o.a.h.m.task.reduce.MergeManagerImpl     : MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2018-05-08 15:58:48.182  INFO 59820 --- [mpletion Events] o.a.h.m.task.reduce.EventFetcher         : attempt_local1276007486_0001_r_000000_0 Thread started: EventFetcher for fetching Map Completion Events
2018-05-08 15:58:48.215  INFO 59820 --- [ localfetcher#1] o.a.h.m.task.reduce.LocalFetcher         : localfetcher#1 about to shuffle output of map attempt_local1276007486_0001_m_000000_0 decomp: 2709391 len: 2709395 to MEMORY
2018-05-08 15:58:48.219  INFO 59820 --- [ localfetcher#1] o.a.h.m.task.reduce.InMemoryMapOutput    : Read 2709391 bytes from map-output for attempt_local1276007486_0001_m_000000_0
2018-05-08 15:58:48.223  INFO 59820 --- [ localfetcher#1] o.a.h.m.task.reduce.MergeManagerImpl     : closeInMemoryFile -> map-output of size: 2709391, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2709391
2018-05-08 15:58:48.225  INFO 59820 --- [mpletion Events] o.a.h.m.task.reduce.EventFetcher         : EventFetcher is interrupted.. Returning
2018-05-08 15:58:48.225  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.LocalJobRunner  : 1 / 1 copied.
2018-05-08 15:58:48.226  INFO 59820 --- [pool-7-thread-1] o.a.h.m.task.reduce.MergeManagerImpl     : finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2018-05-08 15:58:48.231  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.Merger          : Merging 1 sorted segments
2018-05-08 15:58:48.232  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.Merger          : Down to the last merge-pass, with 1 segments left of total size: 2709387 bytes
2018-05-08 15:58:48.304  INFO 59820 --- [pool-7-thread-1] o.a.h.m.task.reduce.MergeManagerImpl     : Merged 1 segments, 2709391 bytes to disk to satisfy reduce memory limit
2018-05-08 15:58:48.304  INFO 59820 --- [pool-7-thread-1] o.a.h.m.task.reduce.MergeManagerImpl     : Merging 1 files, 2709395 bytes from disk
2018-05-08 15:58:48.305  INFO 59820 --- [pool-7-thread-1] o.a.h.m.task.reduce.MergeManagerImpl     : Merging 0 segments, 0 bytes from memory into reduce
2018-05-08 15:58:48.305  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.Merger          : Merging 1 sorted segments
2018-05-08 15:58:48.306  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.Merger          : Down to the last merge-pass, with 1 segments left of total size: 2709387 bytes
2018-05-08 15:58:48.307  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.LocalJobRunner  : 1 / 1 copied.
2018-05-08 15:58:48.329  INFO 59820 --- [pool-7-thread-1] o.a.h.conf.Configuration.deprecation     : mapred.skip.on is deprecated. Instead, use mapreduce.job.skiprecords
2018-05-08 15:58:48.547  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.Task            : Task:attempt_local1276007486_0001_r_000000_0 is done. And is in the process of committing
2018-05-08 15:58:48.550  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.LocalJobRunner  : 1 / 1 copied.
2018-05-08 15:58:48.550  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.Task            : Task attempt_local1276007486_0001_r_000000_0 is allowed to commit now
2018-05-08 15:58:48.557  INFO 59820 --- [pool-7-thread-1] o.a.h.m.lib.output.FileOutputCommitter   : Saved output of task 'attempt_local1276007486_0001_r_000000_0' to hdfs://localhost:9000/user/Logan/dc_in/out/_temporary/0/task_local1276007486_0001_r_000000
2018-05-08 15:58:48.558  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.LocalJobRunner  : reduce > reduce
2018-05-08 15:58:48.558  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.Task            : Task 'attempt_local1276007486_0001_r_000000_0' done.
2018-05-08 15:58:48.558  INFO 59820 --- [pool-7-thread-1] org.apache.hadoop.mapred.LocalJobRunner  : Finishing task: attempt_local1276007486_0001_r_000000_0
2018-05-08 15:58:48.559  INFO 59820 --- [      Thread-33] org.apache.hadoop.mapred.LocalJobRunner  : reduce task executor complete.
2018-05-08 15:58:49.098  INFO 59820 --- [nio-8080-exec-1] org.apache.hadoop.mapreduce.Job          :  map 100% reduce 100%
2018-05-08 15:58:49.099  INFO 59820 --- [nio-8080-exec-1] org.apache.hadoop.mapreduce.Job          : Job job_local1276007486_0001 completed successfully
2018-05-08 15:58:49.108  INFO 59820 --- [nio-8080-exec-1] org.apache.hadoop.mapreduce.Job          : Counters: 35
    File System Counters
        FILE: Number of bytes read=5419178
        FILE: Number of bytes written=8693317
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=5431420
        HDFS: Number of bytes written=2341190
        HDFS: Number of read operations=15
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=6
    Map-Reduce Framework
        Map input records=19172
        Map output records=134343
        Map output bytes=3249391
        Map output materialized bytes=2709395
        Input split bytes=115
        Combine input records=134343
        Combine output records=92172
        Reduce input groups=92172
        Reduce shuffle bytes=2709395
        Reduce input records=92172
        Reduce output records=92172
        Spilled Records=184344
        Shuffled Maps =1
        Failed Shuffles=0
        Merged Map outputs=1
        GC time elapsed (ms)=0
        Total committed heap usage (bytes)=1330642944
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters 
        Bytes Read=2715710
    File Output Format Counters 
        Bytes Written=2341190
2018-05-08 15:58:49.109  INFO 59820 --- [      Thread-12] ConfigServletWebServerApplicationContext : Closing org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@2acb939f: startup date [Tue May 08 15:58:29 KST 2018]; root of context hierarchy
2018-05-08 15:58:49.112  INFO 59820 --- [      Thread-12] o.s.c.support.DefaultLifecycleProcessor  : Stopping beans in phase 0
2018-05-08 15:58:49.217  INFO 59820 --- [      Thread-12] o.s.j.e.a.AnnotationMBeanExporter        : Unregistering JMX-exposed beans on shutdown
2018-05-08 15:58:49.218  INFO 59820 --- [      Thread-12] o.s.j.e.a.AnnotationMBeanExporter        : Unregistering JMX-exposed beans
2018-05-08 15:58:49.219  INFO 59820 --- [      Thread-12] j.LocalContainerEntityManagerFactoryBean : Closing JPA EntityManagerFactory for persistence unit 'default'
2018-05-08 15:58:49.222  INFO 59820 --- [      Thread-12] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown initiated...
2018-05-08 15:58:49.229  INFO 59820 --- [      Thread-12] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown completed.
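
For context, the /dc/hadoop handler submits the job roughly as in the sketch below. This is only an illustrative reconstruction, not the actual code: DcMapper and DcReducer are word-count-style placeholders, and apart from the hadoopConfiguration bean and the HDFS input/output paths (both visible in the log above), every name in it is assumed.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.GetMapping;

@Controller
public class DcController {

    // spring-data-hadoop-boot autoconfigures this bean (see the
    // "Building configuration for bean 'hadoopConfiguration'" line in the log)
    @Autowired
    private Configuration hadoopConfiguration;

    @GetMapping("/dc/hadoop")
    public void hadoop() throws Exception {
        Job job = Job.getInstance(hadoopConfiguration, "dc count");
        // setJarByClass()/setJar() is not called here, which matches the
        // "No job jar file set" warning in the log; with the local runner
        // the user classes are already on the classpath.
        job.setMapperClass(DcMapper.class);
        job.setCombinerClass(DcReducer.class);
        job.setReducerClass(DcReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job,
                new Path("hdfs://localhost:9000/user/Logan/dc_in/dc_base.csv"));
        FileOutputFormat.setOutputPath(job,
                new Path("hdfs://localhost:9000/user/Logan/dc_in/out"));

        // Blocks the request thread (nio-8080-exec-1) until the job finishes;
        // all of the MapReduce output in the log above is produced during this call.
        job.waitForCompletion(true);
    }

    // Placeholder mapper: emits (token, 1) for each whitespace-separated token.
    public static class DcMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer tokens = new StringTokenizer(value.toString());
            while (tokens.hasMoreTokens()) {
                word.set(tokens.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Placeholder reducer (also used as combiner): sums the counts per key.
    public static class DcReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }
}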

Here is my pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.logan</groupId>
    <artifactId>portfolio-1</artifactId>
    <version>0.0.1</version>
    <packaging>war</packaging>

    <name>portfolio-1</name>
    <description>JSOUP + SPRINGBOOT + JPA + DB + HADOOP + R + HIVE + THYMELEAF</description>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.0.0.RELEASE</version>
        <relativePath /> <!-- lookup parent from repository -->
    </parent>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <java.version>1.8</java.version>
    </properties>

    <dependencies>
        <!-- https://mvnrepository.com/artifact/javax.servlet/javax.servlet-api -->
        <dependency>
            <groupId>javax.servlet</groupId>
            <artifactId>javax.servlet-api</artifactId>
            <scope>provided</scope>
        </dependency>

        <!-- spring hadoop 2.5 -->
        <!-- Thymeleaf -->
        <dependency>
            <groupId>nz.net.ultraq.thymeleaf</groupId>
            <artifactId>thymeleaf-layout-dialect</artifactId>
            <!-- <version>2.2.1</version> -->
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.querydsl/querydsl-jpa -->
        <dependency>
            <groupId>com.querydsl</groupId>
            <artifactId>querydsl-jpa</artifactId>
        </dependency><!-- https://mvnrepository.com/artifact/com.querydsl/querydsl-apt -->
        <dependency>
            <groupId>com.querydsl</groupId>
            <artifactId>querydsl-apt</artifactId>
        </dependency>
        <!-- https://mvnrepository.com/artifact/com.querydsl/querydsl-core -->
        <dependency>
            <groupId>com.querydsl</groupId>
            <artifactId>querydsl-core</artifactId>
            <!-- <version>4.1.4</version> -->
        </dependency><!-- https://mvnrepository.com/artifact/com.querydsl/querydsl-sql -->
        <dependency>
            <groupId>com.querydsl</groupId>
            <artifactId>querydsl-sql</artifactId>
            <version>4.1.4</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-jpa</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-thymeleaf</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-devtools</artifactId>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-tomcat</artifactId>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.jsoup</groupId>
            <artifactId>jsoup</artifactId>
            <version>1.11.2</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.data</groupId>
            <artifactId>spring-data-hadoop-config</artifactId>
            <version>2.5.0.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.data</groupId>
            <artifactId>spring-data-hadoop-core</artifactId>
            <version>2.5.0.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-log4j</artifactId>
            <version>1.2.3.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.data</groupId>
            <artifactId>spring-data-hadoop-boot</artifactId>
            <version>2.5.0.RELEASE</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
            <plugin>
                <groupId>com.mysema.maven</groupId>
                <artifactId>apt-maven-plugin</artifactId>
                <version>1.1.3</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>process</goal>
                        </goals>
                        <configuration>
                            <outputDirectory>target/generated-sources/java</outputDirectory>
                            <processor>com.querydsl.apt.jpa.JPAAnnotationProcessor</processor>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

</project>

No answers yet.
