zoeminghong / full-stack
A full-stack learning path; a troubleshooting and reference handbook.
Home Page: http://stack.zerostech.com
java.lang.IllegalArgumentException: Illegal principal name [email protected]: org.apache.hadoop.security.authentication.util.KerberosName$NoMatchingRule: No rules applied to [email protected]
The default_realm in the krb5.conf file does not match the realm of the current environment; switch it to the correct realm.
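For reference, the relevant krb5.conf sections look like the following sketch (EXAMPLE.COM and the KDC host are placeholders; use the realm of your cluster):

```ini
[libdefaults]
    # Must match the realm the principals belong to, e.g. [email protected]
    default_realm = EXAMPLE.COM

[realms]
    EXAMPLE.COM = {
        kdc = kdc.example.com
        admin_server = kdc.example.com
    }
```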
Jackson JsonMappingException – No serializer found for class
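This error typically means the target class exposes no public fields or getters, so Jackson finds nothing to serialize. Two common remedies, sketched below with a hypothetical Point class (assumes Jackson 2.x):

```java
import com.fasterxml.jackson.annotation.JsonAutoDetect;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;

public class JacksonVisibilityExample {
    // Hypothetical DTO with private fields and no getters:
    // by default Jackson detects no properties and throws JsonMappingException.
    static class Point {
        private int x = 1;
        private int y = 2;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // Option 1: let Jackson see private fields directly
        mapper.setVisibility(mapper.getSerializationConfig().getDefaultVisibilityChecker()
                .withFieldVisibility(JsonAutoDetect.Visibility.ANY));
        // Option 2 (alternative): suppress the exception for empty beans
        // mapper.disable(SerializationFeature.FAIL_ON_EMPTY_BEANS);
        System.out.println(mapper.writeValueAsString(new Point())); // {"x":1,"y":2}
    }
}
```

Adding ordinary public getters to the class achieves the same result without touching the ObjectMapper.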
Calling the built-in MyBatis-Plus methods throws an "invalid bound statement" error, and the MyBatis logo banner does not appear at startup — which suggests MyBatis-Plus was never enabled.
Many articles online suggest implementing your own sqlSessionFactory, but in the Spring Boot era that should not be necessary unless the boot starter package is missing. The project used an in-house company framework that bundles the MyBatis-Plus dependencies, so I assumed the starter was already on the classpath and found the error puzzling. Searching for MybatisPlusAutoConfiguration in IDEA turned up nothing, confirming that the boot starter package was not actually included.
Add the dependency:
<dependency>
<groupId>com.baomidou</groupId>
<artifactId>mybatis-plus-boot-starter</artifactId>
</dependency>
Failed to start bean 'documentationPluginsBootstrapper'; nested exception is java.lang.NullPointerException
Newer Spring Boot versions have a compatibility problem with Springfox: the documentationPluginsBootstrapper bean fails with a NullPointerException.
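One widely used workaround, assuming the cause is the path-matching strategy change introduced in Spring Boot 2.6, is to switch MVC back to Ant-style matching; verify this matches your setup before applying it:

```yaml
spring:
  mvc:
    pathmatch:
      matching-strategy: ant_path_matcher
```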
The mapper XML files were not being loaded.
mybatis-plus:
mapper-locations: ["classpath*:/mapper/*.xml"]
When the project contains multiple submodules, use classpath* (rather than classpath) so that mapper XML files from every module are picked up.
Redis HA Helm chart error - NOREPLICAS Not enough good replicas to write
helm install stable/redis-ha
kubectl edit cm redis-ha-configmap # change min-slaves-to-write from 1 to 0
kubectl delete pod redis-ha-0
When creating a table with Hive on HBase, the following error is reported:
ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Table default.t_student failed strict managed table checks due to the following reason: Table is marked as a managed table but is not transactional.)
INFO : Completed executing command(queryId=hive_20190509143748_e915cb45-4846-4927-b5fd-8983507526fe); Time taken: 12.084 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Table default.t_student failed strict managed table checks due to the following reason: Table is marked as a managed table but is not transactional.) (state=08S01,code=1)
The CREATE TABLE statement was wrong; it must include the EXTERNAL keyword:
CREATE EXTERNAL TABLE hbase_table_2(key int, value string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = "cf1:val")
TBLPROPERTIES("hbase.table.name" = "some_existing_table", "hbase.mapred.output.outputtable" = "some_existing_table");
A record of worthwhile websites.
I declared my own MetaObjectHandler implementation and registered it with @Component, but its methods were never invoked. After reading articles online and checking the official documentation, the code looked correct. Searching the MyBatis-Plus project on GitHub by keyword led me to the MybatisDefaultParameterHandler class, where I found that metaObjectHandler was null inside the populateKeys method — meaning my MetaObjectHandler had never been loaded. Registering it manually did not help either. When I went to debug the Configuration class, I noticed the boot starter package was not explicitly declared as a dependency. Declaring it explicitly solved the problem.
Explicit dependency:
<dependency>
<groupId>com.baomidou</groupId>
<artifactId>mybatis-plus-boot-starter</artifactId>
</dependency>
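For reference, a minimal MetaObjectHandler sketch; the field names createTime/updateTime are hypothetical, and the strict*Fill helpers assume MyBatis-Plus 3.3+:

```java
import java.time.LocalDateTime;

import org.apache.ibatis.reflection.MetaObject;
import org.springframework.stereotype.Component;

import com.baomidou.mybatisplus.core.handlers.MetaObjectHandler;

@Component
public class MyMetaObjectHandler implements MetaObjectHandler {

    @Override
    public void insertFill(MetaObject metaObject) {
        // Fill createTime on INSERT if the entity declares such a field
        this.strictInsertFill(metaObject, "createTime", LocalDateTime.class, LocalDateTime.now());
    }

    @Override
    public void updateFill(MetaObject metaObject) {
        // Fill updateTime on UPDATE
        this.strictUpdateFill(metaObject, "updateTime", LocalDateTime.class, LocalDateTime.now());
    }
}
```

Without the mybatis-plus-boot-starter on the classpath, this bean is never wired into MyBatis-Plus, which is exactly the symptom described above.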
The hosts file change does not take effect: the hosts file has been modified, but even after restarting the program the change is still not picked up.
Flush the local DNS cache.
The project uses Git submodules, and the root pom does not list the module path for mc-magic-value-counter, so running Maven from the project root with -pl fails with the error below (building directly inside mc-magic-value-counter works fine):
[ERROR] Could not find the selected project in the reactor: apps/ext/mc-magic-value-counter @
[ERROR] Could not find the selected project in the reactor: apps/ext/mc-magic-value-counter -> [Help 1]
Add the mc-magic-value-counter module to the modules section of the root pom.
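In the root pom, that means something like the following (the module path is taken from the error message above):

```xml
<modules>
    <!-- existing modules ... -->
    <module>apps/ext/mc-magic-value-counter</module>
</modules>
```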
Interesting projects and solutions that address real business pain points.
java.lang.IllegalArgumentException: When allowCredentials is true, allowedOrigins cannot contain the special value "*" since that cannot be set on the "Access-Control-Allow-Origin" response header. To allow credentials to a set of origins, list them explicitly or consider using "allowedOriginPatterns" instead.
at org.springframework.web.cors.CorsConfiguration.validateAllowCredentials(CorsConfiguration.java:457)
at org.springframework.web.servlet.handler.AbstractHandlerMapping.getHandler(AbstractHandlerMapping.java:520)
at org.springframework.web.servlet.DispatcherServlet.getHandler(DispatcherServlet.java:1255)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1037)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:961)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006)
at org.springframework.web.servlet.FrameworkServlet.doOptions(FrameworkServlet.java:945)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:661)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:733)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:103)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:103)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:712)
at org.apache.catalina.core.ApplicationDispatcher.processRequest(ApplicationDispatcher.java:461)
at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:384)
at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:312)
at org.apache.catalina.core.StandardHostValve.custom(StandardHostValve.java:398)
at org.apache.catalina.core.StandardHostValve.status(StandardHostValve.java:257)
at org.apache.catalina.core.StandardHostValve.throwable(StandardHostValve.java:352)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:177)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:343)
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:374)
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65)
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:868)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1590)
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:748)
Use allowedOriginPatterns instead of allowedOrigins.
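A sketch of the replacement, assuming a Spring 5.3+ WebMvcConfigurer (the origin pattern is a placeholder; list your real origins):

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.config.annotation.CorsRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;

@Configuration
public class CorsConfig implements WebMvcConfigurer {

    @Override
    public void addCorsMappings(CorsRegistry registry) {
        registry.addMapping("/**")
                // Patterns are permitted together with credentials; a bare "*" origin is not
                .allowedOriginPatterns("https://*.example.com")
                .allowCredentials(true);
    }
}
```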
The groupName on the Docket is duplicated; most likely two Docket beans are running at the same time.
In IDEA, using TypeScript reports: Cannot use JSX unless the '--jsx' flag is provided
The TypeScript version is too old, or the TypeScript configuration in IDEA points at the wrong path.
https://stackoverflow.com/questions/50432556/cannot-use-jsx-unless-the-jsx-flag-is-provided
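If the project-level configuration is the culprit, enabling JSX in tsconfig.json usually resolves it ("react" is one of several valid values for the option):

```json
{
  "compilerOptions": {
    "jsx": "react"
  }
}
```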
npm ERR! Unexpected token < in JSON at position 1 while parsing near '
npm ERR! <!doctype html>
npm ERR! <htm...'
Reset the registry (I had previously switched it to another npm mirror):
npm set registry https://registry.npmjs.org/
A Spring Boot project reported the following error:
java.lang.ArrayStoreException: sun.reflect.annotation.TypeNotPresentExceptionProxy
The AutoConfiguration class was initialized later than other classes, i.e. there was a bean initialization-order problem.
Moving some bean initialization into other classes resolved it.
2018-12-31 15:20:08.954 WARN [dmp-spring-boot-hbase-kerberos-example,,,] 59377 --- [hared--pool2-t1] o.a.h.security.UserGroupInformation : Not attempting to re-login since the last re-login was attempted less than 60 seconds before. Last Login=1546240807297
2018-12-31 15:20:13.087 WARN [dmp-spring-boot-hbase-kerberos-example,,,] 59377 --- [hared--pool2-t1] o.a.h.security.UserGroupInformation : Not attempting to re-login since the last re-login was attempted less than 60 seconds before. Last Login=1546240807297
2018-12-31 15:20:16.728 WARN [dmp-spring-boot-hbase-kerberos-example,,,] 59377 --- [hared--pool2-t1] o.a.h.security.UserGroupInformation : Not attempting to re-login since the last re-login was attempted less than 60 seconds before. Last Login=1546240807297
2018-12-31 15:20:20.919 WARN [dmp-spring-boot-hbase-kerberos-example,,,] 59377 --- [hared--pool2-t1] o.a.h.security.UserGroupInformation : Not attempting to re-login since the last re-login was attempted less than 60 seconds before. Last Login=1546240807297
2018-12-31 15:20:24.659 WARN [dmp-spring-boot-hbase-kerberos-example,,,] 59377 --- [hared--pool2-t1] o.apache.hadoop.hbase.ipc.RpcClientImpl : Couldn't setup connection for [email protected] to hbase/[email protected]
javax.security.sasl.SaslException: GSS initiate failed
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[na:1.8.0_121]
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179) ~[hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:619) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:164) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:745) ~[hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:742) ~[hbase-client-1.3.2.jar:1.3.2]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_121]
at javax.security.auth.Subject.doAs(Subject.java:422) ~[na:1.8.0_121]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1682) ~[hadoop-common-3.1.0.jar:na]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:742) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:911) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:875) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1249) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:35518) [hbase-protocol-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.client.ClientSmallReversedScanner$SmallReversedScannerCallable.call(ClientSmallReversedScanner.java:298) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.client.ClientSmallReversedScanner$SmallReversedScannerCallable.call(ClientSmallReversedScanner.java:276) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:212) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:364) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:338) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:137) [hbase-client-1.3.2.jar:1.3.2]
at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:65) [hbase-client-1.3.2.jar:1.3.2]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_121]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_121]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_121]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - LOOKING_UP_SERVER)
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:770) ~[na:1.8.0_121]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248) ~[na:1.8.0_121]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[na:1.8.0_121]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[na:1.8.0_121]
... 25 common frames omitted
Caused by: sun.security.krb5.KrbException: Server not found in Kerberos database (7) - LOOKING_UP_SERVER
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73) ~[na:1.8.0_121]
at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251) ~[na:1.8.0_121]
at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262) ~[na:1.8.0_121]
at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308) ~[na:1.8.0_121]
at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126) ~[na:1.8.0_121]
at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458) ~[na:1.8.0_121]
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693) ~[na:1.8.0_121]
... 28 common frames omitted
Caused by: sun.security.krb5.Asn1Exception: Identifier doesn't match expected value (906)
at sun.security.krb5.internal.KDCRep.init(KDCRep.java:140) ~[na:1.8.0_121]
at sun.security.krb5.internal.TGSRep.init(TGSRep.java:65) ~[na:1.8.0_121]
at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:60) ~[na:1.8.0_121]
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55) ~[na:1.8.0_121]
... 34 common frames omitted
2018-12-31 15:20:26.850 WARN [dmp-spring-boot-hbase-kerberos-example,,,] 59377 --- [hared--pool2-t1] o.a.h.security.UserGroupInformation : Not attempting to re-login since the last re-login was attempted less than 60 seconds before. Last Login=1546240807297
2018-12-31 15:20:29.057 WARN [dmp-spring-boot-hbase-kerberos-example,,,] 59377 --- [hared--pool2-t1] o.a.h.security.UserGroupInformation : Not attempting to re-login since the last re-login was attempted less than 60 seconds before. Last Login=1546240807297
Add the host entries for the target environment; if that still does not help, flush the local DNS cache:
Mac: sudo dscacheutil -flushcache
Windows: ipconfig /flushdns
[ERROR] error: missing or invalid dependency detected while loading class file 'BasicMissionStreamTemplate.class'.
[INFO] Could not access term templates in package com.tairanchina.csp.dmp,
[INFO] because it (or its dependencies) are missing. Check your build definition for
[INFO] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[INFO] A full rebuild may help if 'BasicMissionStreamTemplate.class' was compiled against an incompatible version of com.tairanchina.csp.dmp.
[ERROR] error: missing or invalid dependency detected while loading class file 'BasicMissionStreamTemplate.class'.
[INFO] Could not access term spark in value com.tairanchina.csp.dmp.templates,
[INFO] because it (or its dependencies) are missing. Check your build definition for
[INFO] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[INFO] A full rebuild may help if 'BasicMissionStreamTemplate.class' was compiled against an incompatible version of com.tairanchina.csp.dmp.templates.
[ERROR] two errors found
The package com.tairanchina.csp.dmp.templates could not be resolved; renaming it to com.tairanchina.csp.dmp.template fixed the build.
After upgrading to Spring Cloud 2020.0.3 and Spring Boot 2.5.4, the bootstrap file is no longer read.
Add the spring-cloud-starter-bootstrap dependency:
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-bootstrap</artifactId>
<version>3.0.3</version>
</dependency>
https://docs.spring.io/spring-cloud-config/docs/3.0.3/reference/html/#config-first-bootstrap
java.lang.UnsupportedOperationException: No Encoder found for com.fasterxml.jackson.databind.JsonNode
https://stackoverflow.com/questions/36648128/how-to-store-custom-objects-in-dataset
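Per the linked answer, one workaround is to fall back to a Kryo-based encoder for types Spark cannot derive an Encoder for. A sketch in Java (inputDataset and parseJson are hypothetical; assumes Spark and Jackson on the classpath):

```java
import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoder;
import org.apache.spark.sql.Encoders;

import com.fasterxml.jackson.databind.JsonNode;

// Spark has no built-in Encoder for JsonNode, so serialize it with Kryo instead.
Encoder<JsonNode> jsonNodeEncoder = Encoders.kryo(JsonNode.class);

Dataset<JsonNode> jsonDs = inputDataset.map(
        (MapFunction<String, JsonNode>) s -> parseJson(s), // hypothetical String -> JsonNode parser
        jsonNodeEncoder);
```

The trade-off is that a Kryo-encoded Dataset stores opaque binary blobs, so column-level operations on the JSON fields are no longer available; mapping to a plain bean or to String avoids that.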