@If you are facing the following error, change the commons-io dependency to version 2.1. The NoSuchMethodError means an older commons-io jar, which lacks FileUtils.isSymlink, is on the classpath.
------------------------------------------------------------------------
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.commons.io.FileUtils.isSymlink(Ljava/io/File;)Z
at org.apache.hadoop.fs.FileUtil.getDU(FileUtil.java:456)
at org.apache.hadoop.filecache.TrackerDistributedCacheManager.downloadCacheObject(TrackerDistributedCacheManager.java:463)
at org.apache.hadoop.filecache.TrackerDistributedCacheManager.localizePublicCacheObject(TrackerDistributedCacheManager.java:475)
at org.apache.hadoop.filecache.TrackerDistributedCacheManager.getLocalCache(TrackerDistributedCacheManager.java:191)
at org.apache.hadoop.filecache.TaskDistributedCacheManager.setupCache(TaskDistributedCacheManager.java:182)
at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:124)
at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:437)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
at jp.ameba.hadoop.main.FreqCounter1.main(FreqCounter1.java:92)
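If the project is built with Maven, the fix is to pin commons-io to 2.1. A minimal sketch of the dependency entry (assuming a Maven build; adjust to however your project manages versions):
<!-- Assumption: Maven build; pin commons-io to 2.1 so FileUtils.isSymlink exists -->
<dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>2.1</version>
</dependency>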
Wednesday, August 14, 2013
Monday, August 12, 2013
Link - Hadoop, Hbase, Zookeeper
@General information about Hadoop
https://www.ibm.com/developerworks/data/library/techarticle/dm-1209hadoopbigdata/
http://www.ne.jp/asahi/hishidama/home/tech/apache/hbase/Filter.html#h_class
@Good install process
http://knight76.tistory.com/entry/hbase-Hbase-%EC%84%A4%EC%B9%98%ED%95%98%EA%B8%B0-Fully-Distributed-mode
@Hbase and Zookeeper
http://blog.naver.com/PostView.nhn?blogId=albertx&logNo=100187419333
http://promaster.tistory.com/82
@Hbase
http://engineering.vcnc.co.kr/2013/04/hbase-configuration/
@Hadoop - Good Install Information
http://blog.beany.co.kr/archives/1373#hdfs-sitexml
http://crazia.tistory.com/entry/%ED%95%98%EB%91%A1-%ED%95%98%EB%91%A1Hadoop-%EC%B4%88-%EA%B0%84%EB%8B%A8-%EC%84%A4%EC%B9%98-%EC%99%84%EC%A0%84-%EB%B6%84%EC%82%B0-Full-Distributed-%EB%B0%A9%EC%8B%9D
Hbase - more filters for the Scan
// Hbase: exact row key match
BinaryComparator comparator = new BinaryComparator(Bytes.toBytes("key"));
Filter filter = new RowFilter(CompareOp.EQUAL, comparator);
// Mysql equivalent
[select * from aTable where ROW = 'key']

// Hbase: row keys starting with a prefix
byte[] prefix = Bytes.toBytes("key");
Filter filter = new PrefixFilter(prefix);
// Mysql equivalent
[select * from aTable where ROW like 'key%']

// Hbase: scan up to and including the stop row
byte[] stop = Bytes.toBytes("key");
Filter filter = new InclusiveStopFilter(stop);
// Mysql equivalent
[select * from aTable where ROW <= 'key']
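For context, here is a minimal sketch of running one of these filters end to end, written against the 0.94-era client API used elsewhere on this blog; the table name "aTable" and the row key "key" are placeholders:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.BinaryComparator;
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
import org.apache.hadoop.hbase.filter.Filter;
import org.apache.hadoop.hbase.filter.RowFilter;
import org.apache.hadoop.hbase.util.Bytes;

// Assumption: HBase 0.94-era API; "aTable" and "key" are placeholders
Configuration conf = HBaseConfiguration.create();
HTable table = new HTable(conf, "aTable");
try {
    Scan scan = new Scan();
    // Same as: select * from aTable where ROW = 'key'
    Filter filter = new RowFilter(CompareOp.EQUAL, new BinaryComparator(Bytes.toBytes("key")));
    scan.setFilter(filter);
    ResultScanner scanner = table.getScanner(scan);
    try {
        for (Result result : scanner) {
            System.out.println(Bytes.toString(result.getRow()));
        }
    } finally {
        scanner.close();
    }
} finally {
    table.close();
}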
Hbase - Page limit
@For paging, like [select * from aTable limit 10]
// HbaseDao.java
public ResultScanner resultScanner(String tableName, int intPages) throws Exception {
    // Get a table handle from the HTablePool
    HTableInterface hTable = htablePool.getTable(tableName);
    // Limit the number of rows each region server returns for this scan
    long pageSize = intPages;
    Filter filter = new PageFilter(pageSize);
    Scan s = new Scan();
    s.setFilter(filter);
    ResultScanner rs = hTable.getScanner(s);
    return rs;
}
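One caveat: PageFilter is evaluated independently on each region server, so a scan can still return more than pageSize rows in total; the client should stop reading once it has enough. A minimal usage sketch (assuming an HbaseDao instance named dao; "aTable" is a placeholder):

// Assumption: dao is the HbaseDao above; "aTable" is a placeholder
int pageSize = 10;
ResultScanner scanner = dao.resultScanner("aTable", pageSize);
try {
    int count = 0;
    for (Result result : scanner) {
        System.out.println(Bytes.toString(result.getRow()));
        // PageFilter limits rows per region server, so enforce the page size client-side too
        if (++count >= pageSize) {
            break;
        }
    }
} finally {
    scanner.close();
}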
Monday, August 5, 2013
Link - How-tos in Java
@How to configure a Netty 4 project using Spring 3.2+ and Maven
http://nerdronix.blogspot.jp/2013/06/netty-4-configuration-using-spring-maven.html
@How to convert an InputStream to a String
http://www.kodejava.org/how-do-i-convert-inputstream-to-string/
@Like the tail command in Linux
http://blog.naver.com/PostView.nhn?blogId=jchem95&logNo=60008769821&redirect=Dlog&widgetTypeCall=true
@Jetty Document
http://www.eclipse.org/jetty/documentation/current/jetty-maven-plugin.html#get-up-and-running
@Jetty WTP plugin for Eclipse
http://wiki.eclipse.org/Jetty_WTP_Plugin/Jetty_WTP_Install