Sunday, December 29, 2013

MyStory - When I argued with my wife.

Just after getting married, I often had arguments with my wife.
Actually, I cannot remember everything,
but I will try to describe the situation.
We were arguing about something.
At that time, I was very angry for some reason.
I felt that my wife was ignoring me.
"Who am I?" I asked loudly. "Who am I?" I asked again.
She didn't say anything.
So, I answered for her:
"You are my husband!"
As soon as I said it, I realized that I had misspoken --
I wasn't fluent in Japanese at that time.
It was silent for a while.
Then she burst out laughing until she shed tears.
In my mind, I also burst out laughing.
I tried to suppress my laughter, but she noticed that I was laughing.
I can't remember exactly why I was angry anymore.
Now we are lovey-dovey....

Monday, December 16, 2013

Troubleshooting - Hive

@When you see the following error.
----------------------------------------------------------------
FAILED: SemanticException [Error 10035]: Column repeated in partitioning columns
----------------------------------------------------------------
@Solution - do not repeat the partition column (dt) in the column list:
sudo -u hdfs hive -e "CREATE TABLE table_temp (time string, aaa string, bbb string) partitioned by(dt string) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' STORED AS SEQUENCEFILE;"

Friday, December 13, 2013

Troubleshooting - An error has occurred in Eclipse

To resolve this problem, run:
$ eclipse -clean
---------------------------------------------------------------------------------
!ENTRY org.eclipse.osgi 4 0 2013-12-13 18:44:55.618
!MESSAGE Startup error
!STACK 1
java.lang.RuntimeException: Exception in org.eclipse.osgi.framework.internal.core.SystemBundleActivator.start() of bundle org.eclipse.osgi.
    at org.eclipse.osgi.framework.internal.core.InternalSystemBundle.resume(InternalSystemBundle.java:233)
    at org.eclipse.osgi.framework.internal.core.Framework.launch(Framework.java:657)
    at org.eclipse.core.runtime.adaptor.EclipseStarter.startup(EclipseStarter.java:274)
    at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:176)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:629)
    at org.eclipse.equinox.launcher.Main.basicRun(Main.java:584)
    at org.eclipse.equinox.launcher.Main.run(Main.java:1438)
Caused by: org.osgi.framework.BundleException: Exception in org.eclipse.osgi.framework.internal.core.SystemBundleActivator.start() of bundle org.eclipse.osgi.
    at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:734)
    at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:683)
    at org.eclipse.osgi.framework.internal.core.InternalSystemBundle.resume(InternalSystemBundle.java:225)
    ... 10 more

Monday, December 9, 2013

Gradle - Add the jars to the dependencies in Eclipse


You need to install the Gradle Eclipse integration: https://github.com/spring-projects/eclipse-integration-gradle
I installed this version: http://dist.springsource.com/release/TOOLS/gradle

build.gradle

$ gradle cleanEclipse eclipse

You can then see the jar files in Eclipse.

DataArtists - What is a Data Artist?

Let's watch the TED talk.

Thursday, December 5, 2013

MyDesign - This is an office partition design.

15 years ago, I was a professional 3D rendering designer.
When I was learning the 3D Studio program, I used to work on renderings all night.
Of course, I read all of the 3D books, and I even thought about writing a book on 3D rendering.
The two pictures are my work from when I was a college student.

Thursday, November 28, 2013

Gradle - Release Script on Jenkins

You just put the following into the Execute shell of the Post Steps on Jenkins.

Also, you need to configure the following options on Jenkins.
Build > Invoke Gradle script > Gradle Version >> Default
Build > Invoke Gradle script > Tasks >> zip

Gradle - Sample build.gradle for Batch

This Gradle file is for a batch application.

Tuesday, November 19, 2013

Saturday, November 9, 2013

MyDesign - This is a desk clock design.

I created the following drawing when I worked at an industrial design company.


OS:Windows 97
Graphic Tool: Corel Draw

I had a reason for deciding to enter college for industrial design.
I will tell you why in a later post.


Friday, November 8, 2013

Java - Common Daemon in Java

@Download Commons Daemon
$ wget http://ftp.meisei-u.ac.jp/mirror/apache/dist//commons/daemon/source/commons-daemon-1.0.15-src.tar.gz
@Decompress
$ tar xvf ./commons-daemon-1.0.15-src.tar.gz
@Change Directory
$ cd /usr/local/src/commons-daemon-1.0.15-src/src/native/unix
@You need to build the "configure" program with:
$ ./support/buildconf.sh
@Set configuration and compile
$ ./configure --with-java=/usr/local/java
$ make
@Move jsvc to the app home directory
$ mv /usr/local/src/commons-daemon-1.0.15-src/src/native/unix/jsvc /usr/local/app/
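@ For reference, jsvc launches a class that implements the org.apache.commons.daemon.Daemon interface. This is only a minimal sketch of such a class; the class name and the worker thread are my own illustration, not part of the original steps.

import org.apache.commons.daemon.Daemon;
import org.apache.commons.daemon.DaemonContext;

public class MyDaemon implements Daemon {

    private Thread worker;
    private volatile boolean running;

    @Override
    public void init(DaemonContext context) {
        // Called by jsvc (as root, before privileges are dropped); arguments are in context.getArguments().
        worker = new Thread(new Runnable() {
            @Override
            public void run() {
                while (running) {
                    // do the actual work here
                    try {
                        Thread.sleep(1000L);
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            }
        });
    }

    @Override
    public void start() {
        running = true;
        worker.start();
    }

    @Override
    public void stop() throws Exception {
        running = false;
        worker.join(5000L);
    }

    @Override
    public void destroy() {
        worker = null;
    }
}

@ It would then be started with something like: /usr/local/app/jsvc -home /usr/local/java -cp myapp.jar MyDaemon (the jar name is a placeholder).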

Link - Gradle

@Plug-in in Eclipse
http://www.kaczanowscy.pl/tomek/2010-03/gradle-ide-integration-eclipse-plugin

@Multi-modules
http://blog.tamashumi.com/2012/11/muliti-module-gradle-project-with-ide.html

Thursday, November 7, 2013

Troubleshooting - Vert.x

@The following error occurred in Vert.x because vertx-core-1.3.1.final.jar was included in the lib directory.
The resolution is to use vertx-core-2.0.2.final.jar instead.
------------------------------------------------------------------------------
nested exception is java.lang.IncompatibleClassChangeError: Found interface org.vertx.java.core.VertxFactory, but class was expected
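@ The error above means the 1.3.1 jar's VertxFactory (an interface) was loaded where code compiled against 2.x expected a class. For reference, this is a minimal sketch of obtaining an embedded Vertx instance with vertx-core 2.0.x; the HTTP server, handler and port are just an illustration.

import org.vertx.java.core.Handler;
import org.vertx.java.core.Vertx;
import org.vertx.java.core.VertxFactory;
import org.vertx.java.core.http.HttpServerRequest;

public class EmbeddedVertx {
    public static void main(String[] args) throws Exception {
        // In vertx-core 2.x this factory call replaces the 1.x way of creating Vertx.
        Vertx vertx = VertxFactory.newVertx();

        vertx.createHttpServer().requestHandler(new Handler<HttpServerRequest>() {
            @Override
            public void handle(HttpServerRequest req) {
                req.response().end("ok");
            }
        }).listen(8080);

        // Keep the JVM alive so the embedded server keeps running.
        Thread.sleep(Long.MAX_VALUE);
    }
}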

Monday, October 28, 2013

HTML5&CSS3 - Link

@Xcode with Web
http://cordova.apache.org/#about

@CSS3 Maker
http://www.css3maker.com/index.html

@CSS3
http://www.hongkiat.com/blog/html5-web-applications/

@Fonts - You can use these for free.
http://crazypixels.net/50-precious-free-fonts-for-commercial-use/

Java - How to install java on CentOs

@ How to install java on CentOs.

@ Change to the root user.
$ sudo -s

@ Decompress jdk-7u75-linux-x64.tar.gz (or a newer version).
$ tar xvf /usr/local/src/jdk-7u75-linux-x64.tar.gz

@ Move the Java directory under /usr/local.
$ mv /usr/local/src/jdk1.7.0_75 /usr/local/java

@ Change the user and group ownership of the files to root.
$ chown -R root.root /usr/local/java

@ Add the following lines to /home/njoonk/.bash_profile
export JAVA_HOME=/usr/local/java
export PATH=$JAVA_HOME/bin:$PATH

Friday, October 18, 2013

Objective-C - How to remove Cocos2d

@When you can't update Cocos2d to a new version.

$ cd /Users/username/Library/Developer/Xcode/Templates/File Templates
$ rm -rf ./cocos2d

$ cd /Users/username/Library/Developer/Xcode/Templates
$ rm -rf ./cocos2d

$ cd /Users/username/Library/Application Support/Developer/Shared/Xcode/File Templates
$ rm -rf ./cocos2d\ 1.0.0/

$ cd /Users/username/Library/Application Support/Developer/Shared/Xcode/Project Templates/
$ rm -rf ./cocos2d\ 1.0.0/

Objective-C - iPhone to Server

I will finish writing this post later.
@You need to get the following library.
https://github.com/msgpack/msgpack-objectivec

@ On the iPhone
static void listenerCallback(CFSocketRef socket, CFSocketCallBackType type,
                             CFDataRef address, const void *data, void *info) {

    NSString* str = nil;
    switch (type) {
        case kCFSocketNoCallBack:
            str = @"kCFSocketNoCallBack";
            break;
        case kCFSocketReadCallBack:
            str = @"kCFSocketReadCallBack";
            break;
        case kCFSocketAcceptCallBack:
            str = @"kCFSocketAcceptCallBack";
            break;
        case kCFSocketDataCallBack:
            str = @"kCFSocketDataCallBack";
            break;
        case kCFSocketConnectCallBack:
            str = @"kCFSocketConnectCallBack";
            break;
        case kCFSocketWriteCallBack:
            str = @"kCFSocketWriteCallBack";
            break;
        default:
            break;
    }

    if(type == kCFSocketDataCallBack) {
        // Get a message from server
        NSData* receiveData = (NSData*)data;
        NSDictionary* parsed = [receiveData messagePackParse];
        NSNumber *numx = [parsed objectForKey:@"x"];
        NSNumber *numy = [parsed objectForKey:@"y"];
        NSLog(@"numx is %f", [numx floatValue]);
        NSLog(@"numy is %f", [numy floatValue]);

        /* another way to print
         UInt8 *gotData = CFDataGetBytePtr((CFDataRef)data);
         int len = CFDataGetLength((CFDataRef)data);
         for(int i=0; i < len; i++) {
             NSLog(@"%c",*(gotData+i));
         }
         */
    } else if(type == kCFSocketWriteCallBack) {
        // Send a message to server
        CGPoint translation = CGPointMake(5.0, 6.0);
        NSNumber *numx = [NSNumber numberWithFloat:translation.x];
        NSNumber *numy = [NSNumber numberWithFloat:translation.y];
        NSDictionary *someDictionary = [[NSDictionary alloc] initWithObjectsAndKeys:
                                        numx, @"x",
                                        numy, @"y",
                                        nil];
        NSData* packed = [someDictionary messagePack];
        CFSocketSendData(socket, NULL, (CFDataRef)packed, 10);
        // CFRelease((CFDataRef)packed);
    }

}

@ On the Server
....

Saturday, October 12, 2013

MyStory - An earthquake happened on 11 March 2011 in Japan

There was an earthquake in Japan.
It was a very sad thing.
When the earthquake happened,
I was very anxious about my wife and my daughter.
So I called my wife as soon as possible,
but I couldn't reach her.
I feared for both her and my daughter's safety.
After I was allowed to leave the company,
I left the office a little early, at 5 o'clock, to get to my wife's job.
I went to Shinagawa Station from Shibuya.
I think it took about 5 hours.
But we didn't meet there, because the communication network wasn't working.
After a little time had passed, we were able to reach each other by smartphone.
My wife said she was walking to the child-care center.
I had to walk to the same place.
I reached home at 5 AM.
So altogether I had walked for 12 hours.
I was happy that my family was safe.
But I was very worried about the accident at the nuclear power plant.
I worry about the radioactivity.

Tuesday, October 8, 2013

Java - Setting server.xml on tomcat 7

@ on Server.xml

Java - Basic Authentication on Tomcat 7

@$ cd /usr/local/tomcat/conf
@Add the following to web.xml
<security-constraint>
        <web-resource-collection>
                <web-resource-name>
                        My Protected WebSite
                </web-resource-name>
                <url-pattern> /* </url-pattern>
                <http-method> GET </http-method>
                <http-method> POST </http-method>
        </web-resource-collection>
        <auth-constraint>
                <!-- the same as in your tomcat-users.xml file -->
                <role-name> aname </role-name>
        </auth-constraint>
</security-constraint>
<login-config>
        <auth-method> BASIC </auth-method>
        <realm-name>  Basic Authentication </realm-name>
</login-config>
<security-role>
        <description> aname role </description>
        <role-name> aname </role-name>
</security-role>
---------------------------------------------------------------------------------------
 @tomcat-users.xml
  <role rolename="manager-gui"/>
  <role rolename="admin-gui"/>
  <role rolename="aname" />

  <user username="tomcat" password="pwd" roles="manager-gui,admin-gui"/>
  <user username="aname" password="pwd" roles="aname"/>

Mysql - Remove the bin log

@Remove the bin log
mysql -e "PURGE MASTER LOGS BEFORE DATE_SUB(CURRENT_DATE, INTERVAL 60 DAY)"

Friday, September 13, 2013

Mysql - One sequence table can manage all tables

This shows how a single sequence table can manage the IDs for all tables.

// These are the schema designs
CREATE TABLE zz_app
(
    id BIGINT UNSIGNED NOT NULL DEFAULT '0',
    app_id VARCHAR(45) NOT NULL,
    app_aaa VARCHAR(45) NULL,
    app_bbb VARCHAR(45) NULL,
    app_status_flag CHAR(1) NULL,
    insert_time TIMESTAMP NOT NULL,
    update_time TIMESTAMP NOT NULL,
    PRIMARY KEY (id)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

CREATE INDEX zz_app_idx1 ON zz_app(app_id);
CREATE INDEX zz_app_idx2 ON zz_app(insert_time);

CREATE TABLE zz_sequence
(
    seq_name VARCHAR(30) NOT NULL,
    id BIGINT UNSIGNED NOT NULL DEFAULT '0',
    PRIMARY KEY (seq_name)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

----------------------------------------------------------------------

// Update the sequence number in MySQL
    <insert id="updateSequece" parameterType="map">
        UPDATE
            zz_sequence
        SET
            id=LAST_INSERT_ID(id+1)
        WHERE
            seq_name = #{seqName}
        <selectKey resultType="Long" order="AFTER">
            SELECT
                LAST_INSERT_ID()
        </selectKey>
    </insert>

// Insert a row into MySQL
    <insert id="insertApp" parameterType="map">
        <selectKey keyProperty="id" resultType="Long" order="BEFORE">
            SELECT
                id
            FROM
                zz_sequence
            WHERE
                seq_name = #{seqName};
        </selectKey>
        INSERT INTO zz_app (
            id,
            app_id,
            app_aaa,
            app_bbb,
            app_status_flag,
            insert_time,
            update_time
        ) VALUES (
            #{id},
            #{appId},
            #{appAaa},
            #{appBbb},
            #{appStatusFlag},
            now(),
            now()
        )
    </insert>
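@ A sketch of calling the two statements above from plain MyBatis. The sqlSessionFactory, the map values, and the sequence name "zz_app" are assumptions; if the mapper XML declares a namespace, prefix the statement ids with it.

import java.util.HashMap;
import java.util.Map;

import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

public class AppInsertExample {

    private final SqlSessionFactory sqlSessionFactory; // built elsewhere from mybatis-config.xml

    public AppInsertExample(SqlSessionFactory sqlSessionFactory) {
        this.sqlSessionFactory = sqlSessionFactory;
    }

    public void insertApp(String appId) {
        SqlSession session = sqlSessionFactory.openSession();
        try {
            Map<String, Object> params = new HashMap<String, Object>();
            params.put("seqName", "zz_app");
            params.put("appId", appId);
            params.put("appAaa", "aaa");
            params.put("appBbb", "bbb");
            params.put("appStatusFlag", "Y");

            // Bump the sequence row first; the <selectKey order="BEFORE"> of "insertApp"
            // then reads the new value into the params map under the key "id".
            session.update("updateSequece", params);
            session.insert("insertApp", params);

            session.commit();
        } finally {
            session.close();
        }
    }
}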

Thursday, September 12, 2013

Mysql - How to remove mysql on the mac

  • sudo rm /usr/local/mysql
  • sudo rm -rf /usr/local/mysql*
  • sudo rm -rf /Library/StartupItems/MySQLCOM
  • sudo rm -rf /Library/PreferencePanes/My*
  • edit /etc/hostconfig and remove the line MYSQLCOM=-YES-
  • sudo rm -rf /Library/Receipts/mysql*
  • sudo rm -rf /Library/Receipts/MySQL*
  • sudo rm -rf /var/db/receipts/com.mysql.*

Objective-C - How to remove Xcode

@Remove the old Xcode
>sudo /Developer/Library/uninstall-devtools --mode=all

Java - Jetty to run in eclipse

・・Main
・Location
/usr/share/maven/bin/mvn

・Working Directory
1.Browser Workspace
2.Select the project name

・Arguments
-P staging
jetty:run

・Execute
$ cd /.../workspace
$ mvn jetty:run -P staging

・・Environment
@ For Debugging
MAVEN_OPTS = -Xdebug -Xnoagent -Djava.compiler=NONE -Xrunjdwp:transport=dt_socket,address=4000,server=y,suspend=y

・・Run/Debug Configure....
Then, pull up the "Run/Debug Configure...." menu item and select "Remote Java Application" and click the "New" button. Fill in the dialog by selecting your webapp project for the "Project:" field, and ensure you are using the same port number as you specified in the address= property above.
Now all you need to do is go to Run/External Tools and select the name of the Maven tool setup you created in step 1 to start the plugin, and then Run/Debug and select the name of the debug setup you created in step 2.

@pom.xml - Sample

Link - Good Information

@ Service for Serverside
https://baas.io/

@Install Visual Studio Express 2012
http://ariy.kr/71

@Manager tool
https://trello.com/

@Prediction system
http://www.iaeng.org/publication/WCECS2008/WCECS2008_pp804-809.pdf

Monday, September 9, 2013

Link - Html5 and JavaScript

@Sample Game
http://www.gamedevacademy.org/create-a-html5-mario-style-platformer-game/

@Open Wysiwyg editor
http://www.openwebware.com

@Plug-in for javascript in Eclipse
http://www.aptana.com/products/studio3/download

@Can test the JavaScript on WEB
http://jsfiddle.net/b9ndZ/1/

@Pick up color as HTML CODE
http://html-color-codes.info/Korean/

@Tutorial
http://www.cadvance.org/?leftmenu=doc/include/total_menu.asp&mainpage=doc/java/tutorial/js_function.asp

Wednesday, August 14, 2013

Hbase - Testing to MapReduce

@If you are facing the following error, you should change commons-io to version 2.1.
------------------------------------------------------------------------
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.commons.io.FileUtils.isSymlink(Ljava/io/File;)Z
        at org.apache.hadoop.fs.FileUtil.getDU(FileUtil.java:456)
        at org.apache.hadoop.filecache.TrackerDistributedCacheManager.downloadCacheObject(TrackerDistributedCacheManager.java:463)
        at org.apache.hadoop.filecache.TrackerDistributedCacheManager.localizePublicCacheObject(TrackerDistributedCacheManager.java:475)
        at org.apache.hadoop.filecache.TrackerDistributedCacheManager.getLocalCache(TrackerDistributedCacheManager.java:191)
        at org.apache.hadoop.filecache.TaskDistributedCacheManager.setupCache(TaskDistributedCacheManager.java:182)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:124)
        at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:437)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
        at jp.ameba.hadoop.main.FreqCounter1.main(FreqCounter1.java:92)
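@ If you are not sure which commons-io jar actually wins on the classpath, a quick diagnostic like this (my own sketch, not part of the original post) prints where FileUtils was loaded from; isSymlink(File) only exists from commons-io 2.0+, so it fails against an old 1.x jar in the same way the Hadoop code above does.

import java.io.File;

import org.apache.commons.io.FileUtils;

public class CommonsIoCheck {
    public static void main(String[] args) throws Exception {
        // Which jar did FileUtils come from?
        System.out.println(FileUtils.class.getProtectionDomain().getCodeSource().getLocation());

        // Links only against commons-io 2.0 or newer.
        System.out.println(FileUtils.isSymlink(new File("/tmp")));
    }
}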

Monday, August 12, 2013

Link - Hadoop, Hbase, Zookeeper

@All information about Hadoop
https://www.ibm.com/developerworks/data/library/techarticle/dm-1209hadoopbigdata/

http://www.ne.jp/asahi/hishidama/home/tech/apache/hbase/Filter.html#h_class

@Good install process
http://knight76.tistory.com/entry/hbase-Hbase-%EC%84%A4%EC%B9%98%ED%95%98%EA%B8%B0-Fully-Distributed-mode

@Hbase and Zookeeper
http://blog.naver.com/PostView.nhn?blogId=albertx&logNo=100187419333
http://promaster.tistory.com/82

@Hbase
http://engineering.vcnc.co.kr/2013/04/hbase-configuration/

@Hadoop - Good Install Information
http://blog.beany.co.kr/archives/1373#hdfs-sitexml
http://crazia.tistory.com/entry/%ED%95%98%EB%91%A1-%ED%95%98%EB%91%A1Hadoop-%EC%B4%88-%EA%B0%84%EB%8B%A8-%EC%84%A4%EC%B9%98-%EC%99%84%EC%A0%84-%EB%B6%84%EC%82%B0-Full-Distributed-%EB%B0%A9%EC%8B%9D

Hbase - Other filters for the Scan

// Hbase
BinaryComparator comparator = new BinaryComparator(Bytes.toBytes("key"));
Filter filter = new RowFilter(CompareOp.EQUAL, comparator);
// Mysql
[select * from aTable where ROW='key']

// Hbase
byte[] prefix = Bytes.toBytes("key");
Filter filter = new PrefixFilter(prefix);
// Mysql
[select * from aTable where ROW like 'key%']

// Hbase
byte[] stop = Bytes.toBytes("key");
Filter filter = new InclusiveStopFilter(stop);
// Mysql
[select * from aTable where ROW<='key']
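@ Any of the filters above is applied the same way: set it on a Scan and open a scanner. A minimal sketch, assuming an existing HTableInterface named table; the column family and qualifier names are placeholders.

// imports: org.apache.hadoop.hbase.client.{Scan, Result, ResultScanner},
//          org.apache.hadoop.hbase.util.Bytes
Scan scan = new Scan();
scan.setFilter(filter); // any of the RowFilter / PrefixFilter / InclusiveStopFilter above
ResultScanner scanner = table.getScanner(scan);
try {
    for (Result result : scanner) {
        byte[] value = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("qualifier"));
        System.out.println(Bytes.toString(result.getRow()) + " = " + Bytes.toString(value));
    }
} finally {
    scanner.close();
}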

Hbase - Page limit


@For a paging like [Select * from aTable limit 10]

// HbaseDao.java
    public ResultScanner resultScanner(String tableName, int intPages) throws Exception {

        // Get a table object from the pool
        HTableInterface hTable =  htablePool.getTable(tableName);

        long pageSize = intPages;
        Filter filter = new PageFilter(pageSize);
        Scan s = new Scan();
        s.setFilter(filter);

        ResultScanner rs = hTable.getScanner(s);

        return rs;
    }
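@ A usage sketch for the method above (hbaseDao is an instance of the class, and the table name is a placeholder; Result and Bytes are the usual HBase client classes). Note that the pooled HTableInterface obtained inside resultScanner() is never returned to the pool, so in real code it should also be closed once the scanner has been consumed.

ResultScanner rs = hbaseDao.resultScanner("aTable", 10);
try {
    // At most 10 rows come back because of the PageFilter.
    for (Result result : rs) {
        System.out.println(Bytes.toString(result.getRow()));
    }
} finally {
    rs.close();
}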

Monday, August 5, 2013

Link - How to do in Java

@How to configure a Netty 4 project using Spring 3.2+ and Maven
http://nerdronix.blogspot.jp/2013/06/netty-4-configuration-using-spring-maven.html

@HOW TO
http://www.kodejava.org/how-do-i-convert-inputstream-to-string/

@Like tail in Linux
http://blog.naver.com/PostView.nhn?blogId=jchem95&logNo=60008769821&redirect=Dlog&widgetTypeCall=true

@Jetty Document
http://www.eclipse.org/jetty/documentation/current/jetty-maven-plugin.html#get-up-and-running

@Jetty of Eclipse
http://wiki.eclipse.org/Jetty_WTP_Plugin/Jetty_WTP_Install

Spring - Quartz

Tuesday, July 23, 2013

Hbase - Important thing


# When you execute a client app on Tomcat, if you face the following error:
→Will not attempt to authenticate using SASL (unknown error)
# Add the servers' host names to the hosts file. This is a sample:
17x.2x.xxx.xx1   master01
17x.2x.xxx.xx2   slave02
17x.2x.xxx.xx3   slave03
17x.2x.xxx.xx4   slave04
# An HBase client app on Tomcat needs to resolve all of the HBase servers' host names.

Friday, July 19, 2013

Hadoop - Exclude a node on a running cluster

■dfs.hosts.exclude:
Names a file that contains a list of hosts that are not permitted to connect to the namenode. The full pathname of the file must be specified. If the value is empty, no hosts are excluded.

# Add the following to hdfs-site.xml
       <property>
              <name>dfs.hosts.exclude</name>
              <value>/home/hadoop/hadoop/conf/excludes</value>
      </property>

■mapred.hosts.exclude
Names a file that contains the list of hosts that should be excluded by the jobtracker. If the value is empty, no hosts are excluded.
# Add the following to mapred-site.xml
    <property>
        <name>mapred.hosts.exclude</name>
        <value>/home/hadoop/hadoop/conf/excludes</value>
    </property>

# Execute
$ bin/hadoop dfsadmin -refreshNodes

# Execute the balancer to rebalance the data
bin/hadoop balancer

Thursday, July 18, 2013

Hbase - Connection Pool


-------------------------------------------------------------------------
# You need to map these host names to IPs in the hosts file.
    <hbase>
        <master>server1:6000</master>
        <zookeeper>
            <quorum>server1</quorum>
            <property>
                <clientPort>2181</clientPort>
            </property>
        </zookeeper>
    </hbase>
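@ A sketch of building the HTablePool that the HbaseDao above uses, from the values in this XML. The pool size, class name and table name are my own example values.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTableInterface;
import org.apache.hadoop.hbase.client.HTablePool;

public class HbasePoolFactory {

    public static HTablePool createPool() {
        // Values taken from the XML above; "server1" must resolve via the hosts file.
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.master", "server1:6000");
        conf.set("hbase.zookeeper.quorum", "server1");
        conf.set("hbase.zookeeper.property.clientPort", "2181");
        return new HTablePool(conf, 10);
    }

    public static void main(String[] args) throws Exception {
        HTablePool pool = createPool();
        HTableInterface table = pool.getTable("aTable");
        try {
            // use the table ...
        } finally {
            table.close(); // returns the table to the pool (HBase 0.92+)
        }
    }
}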

Flume - flume-env.sh

JAVA_HOME=/usr/local/java

FLUME_CLASSPATH="/home/hadoop/flume/lib"

Wednesday, July 17, 2013

Git - Push to remote And Delete a branch

# Push origin branch, not upstream
$ git push origin service_branch

@ Delete a local branch (while on the master branch).
$ git branch -d branch_name

@ If you get an error like the one below, switch to the master branch and run
@ git branch -d branch_name again, or force the delete:
@---------------------------------------------------
@error: The branch 'branch_name' is not fully merged.
@If you are sure you want to delete it, run 'git branch -D branch_name'.
@---------------------------------------------------
$ git branch -D dev

@ Delete a remote branch
$ git push origin --delete dev

@ Make a branch in Local
$ git checkout -b branch_name

@ Make a branch in Remote
$ git push origin branch_name

@ List the branches that have not been merged (these cannot be deleted with -d)
$ git branch --no-merged


Thursday, July 11, 2013

Troubleshooting - hadoop

#   ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000.
# You'd better check your hosts file
http://gh0stsp1der.tistory.com/66

# Hbase
#Unable to find region for 99999999999999 after 10 tries
http://nosql.rishabhagrawal.com/2013/04/hbase-orgapachehadoophbaseclientnoserve.html

Monday, July 8, 2013

Spring - Connecting Spring 3 to MyBatis with mybatis-spring.

// A part of ServiceImpl.java
List<HadoopGameModel> hadoopGameList = slaveAdminDao.getMapper(SlaveDao.class).selectGameList(mapSelectGameList);
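@ A sketch of the pieces the line above relies on, assuming slaveAdminDao is a mybatis-spring SqlSessionTemplate and SlaveDao is a mapper interface. HadoopGameModel and the method/field names come from the snippet, but their definitions here are assumptions.

import java.util.List;
import java.util.Map;

import org.mybatis.spring.SqlSessionTemplate;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

// Result model mapped by the query; the real fields live in the actual project.
class HadoopGameModel { }

// Mapper interface; selectGameList must match a <select id="selectGameList"> in the mapper XML,
// and the interface must be registered with MyBatis.
interface SlaveDao {
    List<HadoopGameModel> selectGameList(Map<String, Object> params);
}

@Service
public class HadoopGameServiceImpl {

    // SqlSessionTemplate wired to the slave datasource in the Spring context.
    @Autowired
    private SqlSessionTemplate slaveAdminDao;

    public List<HadoopGameModel> getGameList(Map<String, Object> mapSelectGameList) {
        return slaveAdminDao.getMapper(SlaveDao.class).selectGameList(mapSelectGameList);
    }
}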


Git - Let's Fork the project

# Fork the "project-name" repository on GitHub, then:
$ git clone git@github.com:njoon/project-name.git
$ cd project-name/
$ git remote add upstream git@github.com:organizations/project-name.git
$ git fetch upstream

@If you want to remove the upstream
$ git remote remove upstream
----------------------------------------------------------------------------------------------

# If you want to merge from the original repository (not your forked master)
$ git fetch upstream

# To merge its changes into our local branch.
$ git branch -va
$ git checkout master
$ git merge upstream/master
# And you'd better use the Pull Request.


https://help.github.com/articles/syncing-a-fork


Friday, June 28, 2013

Flume - This is the flume.conf in a service server in Flume NG 1.3.1

# Tail the logs and send them to another agent's source

agent1.channels = ch1
# Define a memory channel called ch1 on agent1
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 100000
agent1.channels.ch1.transactionCapacity = 1000

# Define an Exec Source called exec1
agent1.sources = exec1
agent1.sources.exec1.type = exec
agent1.sources.exec1.command = tail -F /usr/local/tomcat/logs/api/api.log
agent1.sources.exec1.interceptors = ts
agent1.sources.exec1.interceptors.ts.type = timestamp
agent1.sources.exec1.channels = ch1

# properties of hdfs-Cluster1-sink
agent1.sinks = avro-sink1
agent1.sinks.avro-sink1.type = avro
agent1.sinks.avro-sink1.channel = ch1
agent1.sinks.avro-sink1.hostname = 1xx.xxx.111.01
agent1.sinks.avro-sink1.port = 41414

Flume - This is the flume.conf in Flume NG 1.3.1

# To save log data to HDFS.

agent1.channels = ch1
# Define a memory channel called ch1 on agent1
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 100000
agent1.channels.ch1.transactionCapacity = 1000

# Define an Avro source called avro-source1 on agent1 and tell it
# to bind to xxx.xxx.xxx.xxx:41414. Connect it to channel ch1.
agent1.sources = avro-source1
agent1.sources.avro-source1.channels = ch1
agent1.sources.avro-source1.type = avro
agent1.sources.avro-source1.bind = 1xx.xxx.111.01
agent1.sources.avro-source1.port = 41414

agent1.sinks = hdfs-sink1
agent1.sinks.hdfs-sink1.type = hdfs
agent1.sinks.hdfs-sink1.channel = ch1
agent1.sinks.hdfs-sink1.hdfs.path = hdfs://xxx.xxx.xxx.xxx:9000/home/hadoop/data/flume/%Y%m%d/%H
agent1.sinks.hdfs-sink1.hdfs.filePrefix = ch1
agent1.sinks.hdfs-sink1.hdfs.inUseSuffix = .txt
agent1.sinks.hdfs-sink1.hdfs.fileType = DataStream
agent1.sinks.hdfs-sink1.hdfs.rollCount = 0
agent1.sinks.hdfs-sink1.hdfs.rollInterval = 1200
agent1.sinks.hdfs-sink1.hdfs.writeFormat = text
agent1.sinks.hdfs-sink1.hdfs.rollSize = 0
agent1.sinks.hdfs-sink1.hdfs.rollCount=1000000
agent1.sinks.hdfs-sink1.hdfs.batchSize = 10
agent1.sinks.hdfs-sink1.hdfs.threadsPoolSize=10



----------------------------------------------------
http://www.nextree.co.kr/p2704/

Tuesday, June 25, 2013

Install - Memcached

1. libevent

wget http://www.monkey.org/~provos/libevent-1.3a.tar.gz

tar xvfz libevent-1.3a.tar.gz
cd libevent-1.3a
./configure --prefix=/usr/local/libevent1.4.4
make
make install



2. memcached

wget http://www.danga.com/memcached/dist/memcached-1.2.1.tar.gz

tar xvfz memcached-1.2.1.tar.gz

cd memcached-1.2.1

./configure --prefix=/usr/local/memcached-1.2.5 --with-libevent=/usr/local/libevent
make
make install

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/libevent/lib
echo "/usr/local/lib" >>/etc/ld.so.conf
echo "/usr/local/libevent/lib" >>/etc/ld.so.conf
/sbin/ldconfig

memcached -d -m 1024 -l 127.0.0.1 -p 11211 -u root


# To build with replication enabled
./configure --enable-replication --prefix=/usr/local/memcached-1.2.8 --with-libevent=/usr/local/libevent

MyStory - I went to Tokyo to work in Japan.

I worked as a programmer in Korea, but the IT bubble had burst.
I was 29 years old and my job was gone, so I was concerned about my future.
At that time, I thought, "Maybe I should move to another company?"
Or should I start my own business offering some kind of online service?
I hesitated to make a decision.
At home, I studied computer programming and English. (This was in 2002.)
But it was difficult to make steady progress,
because of my nephew.
While I was studying, he would interrupt me.
For example, he would cry in front of me and loudly knock on the door.
So my parents and I had some problems at home.
I wanted to move out of the house, but I couldn't afford to live on my own.
The reason was that I didn't have enough money to rent a house.
Meanwhile, I worked a part-time job (as a part-time lecturer at a college).
While I was working, I saw a web page from a government educational institution recruiting software engineers to work abroad, and I decided to apply.
I needed some money, and I had to take classes in Java and Japanese.
But I didn't care about any of that and applied anyway.
For 10 months, I studied advanced Java, Oracle, and Japanese.
It was particularly difficult to study Japanese.
Honestly, I wondered, "Can I do this?"
Sometimes I felt frustrated and despaired.
I just kept going, although I was very tired of studying.
One classroom had about 25 students.
I took eight exams, but I failed half of them.
I didn't get discouraged, though, and kept trying.
In the end, I didn't pass the exam.
However, I passed the interview for a job at a company,
so I would be going to Japan to work.
Since it was my first time going abroad,
I was very nervous.
I was afraid that an accident might happen while I was in Japan.
So I wasn't at all calm. I took an airplane in February 2003.
At last, I arrived at my destination, Narita Airport.

Monday, June 24, 2013

Mysql - Install mysql5.5

cmake install - manual
--------------
./bootstrap
make
make install
--------------

mysql5.5 install
-------------
$ yum groupinstall "Development Tools"
$ yum install ncurses-devel
$ yum install cmake

$ cmake . \
-DCMAKE_INSTALL_PREFIX=/usr/local/mysql \
-DMYSQL_DATADIR=/usr/local/mysql/data \
-DSYSCONFDIR=/etc \
-DWITH_ARCHIVE_STORAGE_ENGINE=1 \
-DWITH_BLACKHOLE_STORAGE_ENGINE=1 \
-DWITH_FEDERATED_STORAGE_ENGINE=1 \
-DWITH_PARTITION_STORAGE_ENGINE=1 \
-DDEFAULT_CHARSET=utf8 \
-DDEFAULT_COLLATION=utf8_general_ci \
-DENABLED_LOCAL_INFILE=1 \
-DENABLED_PROFILING=1 \
-DMYSQL_TCP_PORT=3306 \
-DMYSQL_UNIX_ADDR=/tmp/mysql.sock \
-DWITH_DEBUG=1 \
-DWITH_EMBEDDED_SERVER=1;
$ make
$ make install

$ useradd mysql

$ cd /usr/local/src/mysql-5.5.38

$ chmod 755 scripts/mysql_install_db
$ scripts/mysql_install_db --user=mysql --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data

@ Start and Stop
$ cp /usr/local/mysql/support-files/mysql.server /etc/init.d/
@ Configuration
$ cp /usr/local/mysql/support-files/my-medium.cnf /etc/my.cnf

mac
http://hoyanet.pe.kr/1942

Hadoop - Searching for something

Monday, June 17, 2013

Link - Collection

@Iphone Emoji
http://www.easyapns.com/category/just-for-fun

@Best Site(How to install)
http://xmodulo.com/

@UML
http://www.objectaid.com/home

@Data Compression
https://code.google.com/p/snappy/

@Monitor
https://github.com/Netflix/servo/

@Arrange Json
http://jsonformatter.curiousconcept.com/

@Using sequence in Mysql
http://bryan7.tistory.com/101

@C++ Tutorial
http://www.soen.kr/

@Mysql with cache
https://github.com/ahiguti/HandlerSocket-Plugin-for-MySQL/blob/master/docs-en/installation.en.txt

@Oracle Function List
http://jhbench.tistory.com/27

@Java Sample
http://kodejava.org/

Tuesday, June 4, 2013

Hadoop - Remove node

@1. Get the IP or host list by running the "report" command
$ $HADOOP_HOME/hadoop dfsadmin -report | grep Name

@2. Add IP:Port entries to the following file.
@$HADOOP_HOME/conf/excludes
00.xx.xxx.001:50010

@3. invoke command:
$ $HADOOP_HOME/bin/hadoop dfsadmin -refreshNodes

@4. Verification
$ $HADOOP_HOME/bin/hadoop dfsadmin -report | grep -Eiw 'Name|Decommission'

@5. This time for MapReduce
@If the exclude file exists, you can run this:
$ $HADOOP_HOME/bin/hadoop mradmin -refreshNodes

http://pearlin.info/2012/04/best-way-to-blacklist-node-from-live-hadoop-cluster/

Linux - command

@Register a user into a group
#/usr/sbin/usermod -g groupname username
$ /usr/sbin/usermod -g hadoop hadoop

@Find files to include strings
find . -exec grep -l "Contents of directory" {} \; 2>/dev/null

@Find files
find . -name "*Status*" -print

@Make ssh keys
ssh-keygen -t dsa -> Make a DSA key
ssh-keygen -t rsa -> Make an RSA key

Monday, June 3, 2013

Logback - Setting for Spring3.1.4

    <properties>
        <org.slf4j.version>1.7.5</org.slf4j.version>
        <org.logback.version>1.0.13</org.logback.version>
    </properties>

        <!-- Logging -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>${org.slf4j.version}</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>jcl-over-slf4j</artifactId>
            <version>${org.slf4j.version}</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-core</artifactId>
            <version>${org.logback.version}</version>
        </dependency>
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
            <version>${org.logback.version}</version>
        </dependency>


#logback.xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>

  <appender name="HADOOP_FLUME" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${user.dir}/logs/flumeAdmin.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <!-- daily rollover -->
        <fileNamePattern>${user.dir}/logs/flumeAdmin.log.%d{yyyy-MM-dd}.log.zip</fileNamePattern>
        <!-- keep 90 days' worth of history -->
        <maxHistory>90</maxHistory>
    </rollingPolicy>
    <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
        <charset>UTF-8</charset>
        <layout class="ch.qos.logback.classic.PatternLayout">
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{35} - %msg%n</pattern>
        </layout>
    </encoder>

  </appender>

  <root>
      <level value="info" />
    <appender-ref ref="HADOOP_FLUME" />
  </root>

</configuration>
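@ With the Maven dependencies and logback.xml above, application code logs through the SLF4J API. A short usage sketch; the class name is just an illustration.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class FlumeAdminJob {

    private static final Logger log = LoggerFactory.getLogger(FlumeAdminJob.class);

    public void run() {
        // Goes to ${user.dir}/logs/flumeAdmin.log via the HADOOP_FLUME appender.
        log.info("job started");
        // Filtered out, because the root level above is info.
        log.debug("debug details");
    }
}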

Wednesday, May 29, 2013

Hadoop - commands

@Find files you want to see.
hadoop dfs -lsr /hadoop/flume/ | grep [search_term]

@Hadoop error (leave safe mode)
$./bin/hadoop dfsadmin -safemode leave

Hadoop - Get the contents of a file from Hadoop (sample)

public class TestMain {
    /**
     * @param args
     * @throws Exception
     */
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://xxx.28.xxx.51:9000");
        FileSystem dfs = FileSystem.get(conf);

        Path filenamePath = new Path("/home/hadoop/data/flume/20130529/12/ch1.1369796403013");
        FSDataInputStream fsIn = dfs.open(filenamePath);

        // org.apache.commons.io.IOUtils reads the whole stream into a byte array
        byte[] fileBytes = IOUtils.toByteArray(fsIn);
        fsIn.close();

        //create string from byte array
        String strFileContent = new String(fileBytes);
        System.out.println(strFileContent);
    }
}

Tuesday, May 28, 2013

Hadoop Manger URL

I will organize the links below later.

http://confluence.openflamingo.org/pages/viewpage.action?pageId=5537913&focusedCommentId=7209528&#comment-7209528

Monday, May 27, 2013

Flume - Shell script for starting in Flume NG 1.3.1

# If you face this error, install the package below.
sudo yum install redhat-lsb.x86_64
 
# This script must be run as the hadoop user.
$ sudo su - hadoop


Shell - init-functions

@If you want to install /lib/lsb/init-functions on Linux,
@just install the package below.
$ yum install redhat-lsb

@=======================================
#!/bin/sh

# LSB initscript functions, as defined in the LSB Spec 1.1.0
#
# Lawrence Lim <llim@redhat.com> - Tue, 26 June 2007
# Updated to the latest LSB 3.1 spec
# http://refspecs.freestandards.org/LSB_3.1.0/LSB-Core-generic/LSB-Core-generic_lines.txt

start_daemon () {
        /etc/redhat-lsb/lsb_start_daemon "$@"
}

killproc () {
        /etc/redhat-lsb/lsb_killproc "$@"
}

pidofproc () {
        /etc/redhat-lsb/lsb_pidofproc "$@"
}

log_success_msg () {
        /etc/redhat-lsb/lsb_log_message success "$@"
}

log_failure_msg () {
        /etc/redhat-lsb/lsb_log_message failure "$@"
}

log_warning_msg () {
        /etc/redhat-lsb/lsb_log_message warning "$@"
}

Friday, May 24, 2013

Shell - init.d Script

Let's edit later.

http://werxltd.com/wp/2012/01/05/simple-init-d-script-template/

Thursday, May 23, 2013

Flume - Ganglia Install in Flume NG 1.3.1

$ yum install apr apr-devel
$ yum install rrdtool rrdtool-devel
$ yum install libconfuse libconfuse-devel
$ yum install pcre pcre-devel
$ yum install expat expat-devel
$ yum install zlib zlib-devel

@ Install libconfuse
$ ./configure --with-pic
$ make
$ make install

$ mkdir -p /home/hadoop/ganglia/rrd/
$ chown nobody.nobody /home/hadoop/ganglia/rrd/
$ cd ./ganglia-3.6.0
$ ./configure --with-librrd=/home/hadoop/ganglia/rrd/ --with-gmetad --prefix=/usr/local/
$ make
$ make install

@You can confirm
$ ls /usr/local/bin/gstat
$ ls /usr/local/bin/gmetric
$ ls /usr/local/sbin/gmond
$ ls /usr/local/sbin/gmetad

@[.] is the Ganglia source/build directory
@ Register to service
$ cp ./gmond/gmond.init /etc/rc.d/init.d/gmond
$ chkconfig --add gmond
$ chkconfig --list gmond
$ vi /etc/rc.d/init.d/gmond
--> Edit ->GMOND=/usr/local/sbin/gmond

$ cp ./gmetad/gmetad.init /etc/rc.d/init.d/gmetad
$ chkconfig --add gmetad
$ chkconfig --list gmetad
$ vi /etc/rc.d/init.d/gmetad
--> Edit -> GMOND=/usr/local/sbin/gmetad

@ Copy conf
$ /usr/local/sbin/gmond --default_config > /usr/local/etc/gmond.conf

@ Set rrd tool
$ vi /usr/local/etc/gmetad.conf
 -> rrd_rootdir "/home/hadoop/ganglia/rrd"


@ Start
# /etc/rc.d/init.d/gmond start
# /etc/rc.d/init.d/gmetad start

@ Confirm process
# telnet localhost 8649
--> Output XML

http://blog.daum.net/_blog/BlogTypeView.do?blogid=0N9yp&articleno=25&_bloghome_menu=recenttext#ajax_history_home

http://apexserver.iptime.org/users/yk.choi/weblog/7eca7/

http://ahmadchaudary.wordpress.com/tag/ganglia-monitoring/

Git - Add a tag in order

1.Edit pom.xml
 - e.g. change 1.0-SNAPSHOT to 1.1

2.Add Index
 $ git add *

3.Commit
 $ git commit -m "Tag v1.1"

4.Add a tag
 $ git tag v1.1

5.Push to the remote server
  @ When I ran the following command against GitHub,
       the system didn't ask me to input an ID and password
       (master = tag version)

 $ git push origin v1.1

6.Return to development
 $ git fetch origin
 $ git reset --hard origin/master

Tuesday, May 21, 2013

Flume - How to use Flume NG

bin/flume-ng agent --conf-file conf/flume.conf --name agent1 -Dflume.monitoring.type=GANGLIA -Dflume.monitoring.hosts=172.xx.xxx.xx:5455

Friday, May 17, 2013

Flume - Install Flume NG 1.3.1

■What's Changed?
  • There's no more logical or physical nodes. We call all physical nodes agents and agents can run zero or more sources and sinks.
  • There's no master and no ZooKeeper dependency anymore. At this time, Flume runs with a simple file-based configuration system.

■Install Flume NG 1.3.1
$ git clone https://git-wip-us.apache.org/repos/asf/flume.git flume
$ cd flume
$ git checkout trunk
OR
$ wget http://ftp.kddilabs.jp/infosystems/apache/flume/1.3.1/apache-flume-1.3.1-bin.tar.gz

■Configuration
$ cp conf/flume-conf.properties.template conf/flume.conf
$ cp conf/flume-env.sh.template conf/flume-env.sh

■Change the file (conf/flume.conf)
#=====================================================
# Define a memory channel called ch1 on agent1
agent1.channels.ch1.type = memory

# Define an Avro source called avro-source1 on agent1 and tell it
# to bind to 0.0.0.0:41414. Connect it to channel ch1.
agent1.sources.avro-source1.channels = ch1
agent1.sources.avro-source1.type = avro
agent1.sources.avro-source1.bind = 0.0.0.0
agent1.sources.avro-source1.port = 41414

# Define a logger sink that simply logs all events it receives
# and connect it to the other end of the same channel.
agent1.sinks.log-sink1.channel = ch1
agent1.sinks.log-sink1.type = logger

# Finally, now that we've defined all of our components, tell
# agent1 which ones we want to activate.
agent1.channels = ch1
agent1.sources = avro-source1
agent1.sinks = log-sink1
#=====================================================

■Execute
$ bin/flume-ng agent --conf ./conf/ -f conf/flume.conf -Dflume.root.logger=DEBUG,console -n agent1

■reference
https://cwiki.apache.org/FLUME/getting-started.html

Thursday, May 16, 2013

Linux - user commands

# Add user
$/usr/sbin/useradd -d /home/njoonk -m njoonk -g njoonk

# Set password
$/usr/bin/passwd njoonk
# Delete user
$userdel testuser #Only the user.
$userdel -r testuser #The user together with the user's home directory.

# Create a public key in Linux
$ ssh-keygen -t rsa
# Input the public-key into the authorized_keys
$ vi authorized_keys
$ chmod 644 .ssh/authorized_keys

Wednesday, May 15, 2013

Java - GC

JAVA_OPTS="-server"
JAVA_OPTS="${JAVA_OPTS} -Xms1024m -Xmx1024m -Xmn768m -XX:SurvivorRatio=2 -XX:PermSize=64m -XX:MaxPermSize=256m"
JAVA_OPTS="${JAVA_OPTS} -XX:+PrintGCDetails -Xloggc:/usr/local/tomcat/logs/gc.log"

Reference URL
http://fly32.net/438

http://www.javaservice.com/~java/bbs/read.cgi?m=&b=weblogic&c=r_p&n=1221718848&p=6&s=t

Monday, May 13, 2013

Git - About git information

@Sync the local master with the remote git server (discards local changes)
git fetch origin
git reset --hard origin/master
 
@Check out the new version from the remote (you need to run the fetch first)
git checkout HEAD

@Delete tags in remote.
git push origin :tags/{tag name}

@Upload remote
git push origin v1.5
git push origin --tags

@Delete tags in local
git tag -d {tag name}

@Create a tag
git tag v1.0
@Upload a tag to remote
git push --tags

@Show the commit id
git rev-parse [Tag Name]

@Delete Tag
git tag -d [Tag Name]

@Make a commit
git add *
git commit -m "This is the first commit"
git push

@Download all branches from remote
git fetch origin

@
1.$ vim ./.git/config
2.[branch "master"]
        remote = origin
        merge = refs/heads/master
@It is a good idea to put the following into the config file
[alias]
        hist = log --pretty=format:'%h %ad | %s%d [%an]' --graph --date=short
[color]
        ui = true

Java - How to get a thread dump in Linux.

If you need to get a thread dump in Linux, use jstack:

./jstack -l -F 22431 > /home/share/kim_joon/thread3.txt

Friday, May 10, 2013

Ruby

http://dimdim.tistory.com/56
http://www.jacopretorius.net/2012/01/ruby-map-collect-and-select.html
http://ruby-doc.org/core-2.0/Array.html

Thursday, May 9, 2013

Eclipse - c, c++

http://chanroid.tistory.com/6
http://paralaxer.com/cpp-vs-objective-c/