Tuesday 7 September 2021

Create Private Docker Repository in Nexus & Connect with Docker

 1. Create Docker Repository ( hosted )

Create a Docker repository of type hosted and enable an HTTPS connector. Note: the remaining steps use port 18079 for this connector, so the port chosen here must match.


2. Log in to the client system that will run docker login against this repository. The Nexus UI is reachable at:

https://www.tamilcloud.com:8443/repository/tamilcloud/


3. Check Docker info [ Using root user account ]


4. Add the repository URL details to the client system's hosts file so that the domain-based docker login resolves. Note that 'sudo echo ... >> /etc/hosts' fails because the redirection runs as the unprivileged user; pipe through tee instead:

# echo "192.168.1.3      www.tamilcloud.com repo.tamilcloud.com clm.tamilcloud.com nexus.tamilcloud.com" | sudo tee -a /etc/hosts

5. Create a folder named after the repository host:port under /etc/docker/certs.d to hold the Docker root certificate

# sudo mkdir -p /etc/docker/certs.d/www.tamilcloud.com:18079

6. Copy the Nexus root certificate ca.crt to the client system's /etc/docker/certs.d/www.tamilcloud.com:18079 folder

# sudo cp /cert/ca.crt /etc/docker/certs.d/www.tamilcloud.com:18079/

7. Docker Login to Repository
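The original notes show no command for this step; a minimal sketch, assuming the registry host:port used by the tag/push commands in step 8 and the Nexus user credentials:

```shell
# Registry host:port as used in the later tag/push commands (assumption: 18079).
REGISTRY="www.tamilcloud.com:18079"
# The actual login (prompts for the Nexus username/password and stores an
# auth entry for this registry in ~/.docker/config.json):
#   sudo docker login "${REGISTRY}"
echo "docker login target: ${REGISTRY}"
```

The registry value must match the folder name created under /etc/docker/certs.d, otherwise Docker will not trust the self-signed certificate.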



8. Push Local image to Docker Private repository

# docker pull mysql:5.7
# docker images
# docker tag 1d7aba917169 www.tamilcloud.com:18079/mysql:5.7
# docker push  www.tamilcloud.com:18079/mysql:5.7


9. Verify image in private docker repository
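Besides browsing the Nexus UI, the push can be checked from the command line via the Nexus 3 components REST endpoint. A sketch; the host, UI port 8443, and the repository name 'tamilcloud' are taken from the earlier steps and may differ in your setup:

```shell
NEXUS_URL="https://www.tamilcloud.com:8443"   # Nexus base URL (assumed)
REPO="tamilcloud"                             # hosted Docker repo name (assumed)
QUERY="${NEXUS_URL}/service/rest/v1/components?repository=${REPO}"
# Run with the Nexus credentials; the JSON response should list mysql:5.7:
#   curl -k -u admin "${QUERY}"
echo "${QUERY}"
```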


Completed :)

Install On-Premise Nexus Sonatype Artefact repository [ SSL Enabled ]

 1.  Download the open-source Nexus Repository Manager from the Sonatype website

https://help.sonatype.com/repomanager3/download/download-archives---repository-manager-3

2. Unzip it and place the files into the /opt/nexus folder

3. To manage Nexus start/stop as a service, create a nexus.service file with the following content

[Unit]
Description=nexus service
After=network.target
  
[Service]
Type=forking
LimitNOFILE=65536
ExecStart=/opt/nexus/nexus-3.34.0-01/bin/nexus start
ExecStop=/opt/nexus/nexus-3.34.0-01/bin/nexus stop
User=tamilarasan
Restart=on-abort
TimeoutSec=600
  
[Install]
WantedBy=multi-user.target


4. Copy the file into /etc/systemd/system/ folder & enable the service


$ sudo cp nexus.service /etc/systemd/system

$ sudo systemctl daemon-reload

$ sudo systemctl enable nexus.service

$ sudo systemctl start nexus.service

5. Verify the Nexus status 

$ sudo systemctl status nexus.service


6. To check the nexus log file 
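The notes don't show a command here; a sketch, assuming the default data directory of this install (sonatype-work next to the distribution unpacked under /opt/nexus):

```shell
DATA_DIR="/opt/nexus/sonatype-work/nexus3"   # assumed $data-dir for this install
LOG_FILE="${DATA_DIR}/log/nexus.log"
# Follow the log while the service starts:
#   tail -f "${LOG_FILE}"
echo "${LOG_FILE}"
```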


7. Set up the initial admin password

By default the initial password is stored in the 'admin.password' file under $data-dir
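A sketch for reading it, assuming the default $data-dir of this install:

```shell
DATA_DIR="/opt/nexus/sonatype-work/nexus3"   # assumed $data-dir for this install
PASS_FILE="${DATA_DIR}/admin.password"
# Print the one-time password and paste it into the first-login wizard:
#   sudo cat "${PASS_FILE}"
echo "${PASS_FILE}"
```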


8. Create a self-signed certificate to enable HTTPS for Nexus

# 1. Generate a public/private key pair using keytool:

keytool -genkeypair -keystore keystore.jks -storepass password -alias tamilcloud.com \
 -keyalg RSA -keysize 2048 -validity 5000 -keypass password \
 -dname 'CN=*.tamilcloud.com, OU=Sonatype, O=Sonatype, L=Unspecified, ST=Unspecified, C=US' \
 -ext 'SAN=DNS:nexus.tamilcloud.com,DNS:clm.tamilcloud.com,DNS:repo.tamilcloud.com,DNS:www.tamilcloud.com'


 # 2. Generate PEM encoded public certificate file using keytool:

keytool -exportcert -keystore keystore.jks -alias tamilcloud.com -rfc > tamilcloud.cert


# 3. Convert the Java-specific keystore binary ".jks" file to a widely compatible PKCS12 keystore ".p12" file:

keytool -importkeystore -srckeystore keystore.jks -destkeystore tamilcloud.p12 -deststoretype PKCS12


# 4. Extract the PEM certificate from the ".p12" keystore file (same as step 2, but openssl emits more verbose contents):

openssl pkcs12 -nokeys -in tamilcloud.p12 -out tamilcloud.pem


# 5. Extract unencrypted private key file from ".p12" keystore file:

openssl pkcs12 -nocerts -nodes -in tamilcloud.p12 -out tamilcloud.key


# 6. List and verify new keystore file contents:

keytool -list -keystore tamilcloud.p12 -storetype PKCS12


# 7. Rename tamilcloud.cert to ca.crt, the root certificate file name used on the Docker clients:

mv tamilcloud.cert ca.crt

# 8. Add the host names to the /etc/hosts file

1*.*.*.*      www.tamilcloud.com repo.tamilcloud.com clm.tamilcloud.com nexus.tamilcloud.com


9. Copy the Java keystore file to $data-dir/etc/ssl/keystore.jks


$ mkdir -p /opt/nexus/sonatype-work/nexus3/etc/ssl

$ cp keystore.jks /opt/nexus/sonatype-work/nexus3/etc/ssl/


10. Edit $data-dir/etc/nexus-default.properties and add a new line containing:

application-port-ssl=8443


11. Update nexus-args in the nexus-default.properties file:

nexus-args=${jetty.etc}/jetty.xml,${jetty.etc}/jetty-http.xml,${jetty.etc}/jetty-https.xml,${jetty.etc}/jetty-requestlog.xml

12. Update the application-host value to the server's specific IP address:

application-host=1*.*.*.*

13. Restart Nexus and start using it :)
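As a quick smoke test after the restart, the HTTPS port configured above (8443) can be probed; a sketch, assuming the hostnames added to /etc/hosts earlier:

```shell
URL="https://www.tamilcloud.com:8443/"
# -k skips CA verification (self-signed certificate); expect an HTTP 200:
#   curl -k -I "${URL}"
echo "${URL}"
```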


Reference: 

1. SSL Configuration :  https://help.sonatype.com/repomanager3/system-configuration/configuring-ssl#ConfiguringSSL-ServingSSLDirectly

2. Prepare Self sign certificate : https://support.sonatype.com/hc/en-us/articles/213465768-SSL-Certificate-Guide?_ga=2.104409540.550049495.1630897268-1578994290.1629198342

3. Configure Runtime environment : https://help.sonatype.com/repomanager3/installation/configuring-the-runtime-environment


Gogs - Create Git Repository

 1. Create New git repository for java application 


2. Created Repository for java Application


3. Git Project clone into Local folder

$ git clone http://192.168.1.110:8300/STR/pacspruapi.git


4. Move Local Files into the folder to start commit

$ git status


5. Git add all local files
$ git add .
$ git status


6. Git Commit 

$ git commit -m 'comment for the commit'


7. Push the committed code to Gogs Repository

$ git status
$ git remote -v # to display the available remote repository 
$ git push origin master 


8. If the remote repository has commits that are not yet synced to the local copy, check with:
$ git remote show origin


# Use Git pull to sync with the remote
$ git pull



9. If a local file has changed and is not yet synced with the remote
For example, change the README file


Add only the updated file to git

$ git add README.md


Git Commit the code 
$ git commit -m 'Readme file updated - Tamilarasan'



Git Push the changes to Repository 
$ git push origin master



10. Verify the Code in Gogs repository
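The push can also be confirmed from the client without opening the Gogs UI; a sketch using the clone URL from step 3:

```shell
REMOTE="http://192.168.1.110:8300/STR/pacspruapi.git"
# Lists refs/heads/master with the commit id that was just pushed:
#   git ls-remote "${REMOTE}"
echo "${REMOTE}"
```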


                                     Completed :)

Nexus Maven Repository & Setup

Steps to Configure Nexus Maven & Application integration

PART - A Nexus Maven Creation


1. Create User Account in Nexus

2. Create a Maven Group repository

3. Add the existing Maven hosted & proxy repositories as members of the new group repository

4. Make sure HTTP user authentication is enabled for the Maven Central proxy repository that is part of the newly created group repository.

5. Verify the Repository  Access using the URL
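A sketch for checking that the new repository answers and enforces authentication; the URL and user name are the ones used later in Part B and may differ in your setup:

```shell
REPO_URL="https://www.tamilcloud.com:8443/repository/maven-central/"
# Expect 401 without credentials and 200 with them:
#   curl -k -I "${REPO_URL}"
#   curl -k -u tamilarasan -I "${REPO_URL}"
echo "${REPO_URL}"
```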



PART - B  Maven Config files creation

1. Create a Maven master password

$ mvn --encrypt-master-password <PASSWORD>

2. Save the result into a file "settings-security.xml":
<settingsSecurity>
  <master>{XOZTXCnPjDsHo1jxbPOEdjSCkMamoy4fgdfYej7588I=}</master>
</settingsSecurity>
Notes:
If you prefer to keep the password on a USB drive (plug in to build),
store the file in a drive-specific folder
like '/Volumes/mySecureUsb/secure/settings-security.xml' and reference it:

<settingsSecurity>
  <relocation>/Volumes/mySecureUsb/secure/settings-security.xml</relocation>
</settingsSecurity>

3. Using the master password, encrypt the repository password to be used in the build

For example:

Repository URL : https://www.tamilcloud.com:8443/repository/maven-central/
Repository User Name : tamilarasan
Repository Password  : abcdefgh => needs to be encrypted for use in the CI/CD pipeline

$ mvn --encrypt-password "abcdefgh" -s settings.xml -Dsettings.security=settings-security.xml

4. Update the encrypted password in the servers section of settings.xml

<servers>
<server>
<id>nexus</id>
<username>tamilarasan</username>
<password>{zVwvg21CdkIHM9hA5GsKv+9rzIZlslT3qAmkxcvh+xA=}</password> <!-- Encrypted Repo Password -->
</server>
</servers>

5. Update the Application pom.xml with nexus repository details.

<distributionManagement>
  <repository>
    <id>nexus</id>
    <name>Releases</name>
    <url>http://www.tamilcloud.com:8081/repository/maven-releases/</url>
  </repository>
  <snapshotRepository>
    <id>nexus</id>
    <name>Snapshot</name>
    <url>http://www.tamilcloud.com:8081/repository/maven-snapshots/</url>
  </snapshotRepository>
</distributionManagement>

6. Compile and build with settings.xml & settings-security.xml

mvn -s settings.xml -Dsettings.security=settings-security.xml clean compile package deploy

7. Verify build result.

8. Verify the Nexus repository


Done :)

Friday 14 November 2014

Hadoop Map Reduce with MongoDB Database

Objective : 

  •  Reading MongoDB data from a Hadoop MapReduce job for the data-mining process.
  •  Developing the MapReduce program on a Windows system with Maven to prepare an executable jar file.
     
                   In this example Hadoop reads all the rows from MongoDB and counts the number of documents in the collection.
The same approach also supports text processing over custom-searched MongoDB documents,
and storing the search results into another MongoDB collection.

Windows Environment


  •  Create a Maven project named MongoHadoop.
  •  Add the Maven dependencies to the pom.xml file.

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.tamil</groupId>
    <artifactId>MongoHadoop</artifactId>
    <version>0.1</version>
    <dependencies>
        <dependency>
            <groupId>org.mongodb</groupId>
            <artifactId>mongo-hadoop-core</artifactId>
            <version>1.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.mongodb</groupId>
            <artifactId>mongo-hadoop-streaming</artifactId>
            <version>1.3.0</version>
        </dependency>

        <dependency>
            <groupId>jdk.tools</groupId>
            <artifactId>jdk.tools</artifactId>
            <version>1.7.0_05</version>
            <scope>system</scope>
            <systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <archive>
                        <manifest>
                            <mainClass>com.tamil.MongoDBDriver</mainClass>
                        </manifest>
                    </archive>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id> <!-- this is used for inheritance merges -->
                        <phase>package</phase> <!-- bind to the packaging phase -->
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

  •  Create a MongoDBMapper.java class under com.tamil package.


package com.tamil;
import java.io.IOException;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.Mapper;
import org.bson.BSONObject;

public class MongoDBMapper extends Mapper<Object, BSONObject, Text, LongWritable> {
    @Override
    public void map(Object key, BSONObject value, Context context)
            throws IOException, InterruptedException {
        // Each MongoDB document contributes 1 under the single key "Count";
        // value.get("Text") is available here for custom text processing.
        context.write(new Text("Count"), new LongWritable(1));
    }
}
  •   Create a MongoDBReducer.java class under com.tamil package.

package com.tamil;

import java.io.IOException;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.Reducer;
public class MongoDBReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
    @Override
    public void reduce(Text key, Iterable<LongWritable> values,
        Reducer<Text, LongWritable, Text, LongWritable>.Context context) throws IOException, InterruptedException {
        long sum = 0;
        for (LongWritable value : values) { sum += value.get();}
        context.write(key, new LongWritable(sum));
    }
}
  •   Create a MongoDBDriver.java class under com.tamil package.  


package com.tamil;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;
import com.mongodb.hadoop.MongoInputFormat;
import com.mongodb.hadoop.util.MongoConfigUtil;

public class MongoDBDriver {
    public static void main(String[] args) {
        try {
            final Configuration config = new Configuration();
            MongoConfigUtil.setInputURI(config, "mongodb://localhost:27017/MytDB.MyTable");
            String[] otherArgs = new GenericOptionsParser(config, args).getRemainingArgs();
            if (otherArgs.length != 1) {
                System.err.println("Usage: MongoDBDriver <out>");
                System.exit(2);
            }
            Job job = new Job(config, "MongoTitle");
            job.setJarByClass(MongoDBDriver.class);
            job.setMapperClass(MongoDBMapper.class);
            job.setCombinerClass(MongoDBReducer.class);
            job.setReducerClass(MongoDBReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(LongWritable.class);
            job.setInputFormatClass(MongoInputFormat.class);
            // The single remaining argument is the HDFS output path.
            System.out.println("Output path: " + otherArgs[0]);
            FileOutputFormat.setOutputPath(job, new Path(otherArgs[0]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        } catch (Exception e) { e.printStackTrace(); }
    }
}
  • To create an executable jar file with dependencies, run the Maven assembly command:
>  mvn clean compile package assembly:assembly

 Linux Environment 

  • Copy the jar file to the Linux system and run the hadoop command.


$ hadoop jar MongoDBHadoop.jar com.tamil.MongoDBDriver hdfs://localhost.localdomain:8020/user/cloudera/output
  • The Hadoop MapReduce job runs and the results are stored in the hdfs://localhost.localdomain:8020/user/cloudera/output/part-r-00000 file.
  • Using the hadoop fs -cat command we can see the content of the part-r-00000 file:

$ hadoop fs -cat  hdfs://localhost.localdomain:8020/user/cloudera/output/part-r-00000
Count    111793

So the number of documents in the MongoDB collection is 111793.
Now it's easy to develop a Hadoop MapReduce program in a Windows environment using Maven.
Great Job :-)

Thursday 13 November 2014

Google Spread Sheet - To Store Hadoop MapReduce Result

Objective :

                  A simple service to store a Hadoop MapReduce result into a Google spreadsheet.

 Purpose :

                 The Hadoop MapReduce result should dynamically reflect in a web-application chart that refers to the Google spreadsheet.
At run time my Hadoop MapReduce program's result is uploaded into the Google spreadsheet using this service.
My blog's current chart uses the spreadsheet data and displays the result in my blog without any changes to the blog.

Flow:


Hadoop MapReduce Result -> Google Spread Sheet -> Google Chart -> Blog

Google Service:


                       Using the Maven repository, add the Google GData core jar file. I used com.google.gdata.core.1.47.1.jar for this service class.

GSpreadSheetService.java

import java.io.IOException;
import java.net.*;
import java.util.*;

import com.google.gdata.client.spreadsheet.SpreadsheetService;
import com.google.gdata.data.Link;
import com.google.gdata.data.PlainTextConstruct;
import com.google.gdata.data.batch.*;
import com.google.gdata.data.spreadsheet.*;
import com.google.gdata.util.*;

public class GSpreadSheetService {
    private String user;
    private String password;
    private String application;
    private SpreadsheetService spreadsheetService;
    private SpreadsheetFeed spreadsheetFeed;
    private static final String SHEET_URL = "https://spreadsheets.google.com/feeds/spreadsheets/private/full";

    public GSpreadSheetService(String app, String us, String pwd,
            String proxyHost, String proxyPort) {
        this(app, us, pwd);
        System.setProperty("https.proxyHost", proxyHost);
        System.setProperty("https.proxyPort", proxyPort);

    }

    public GSpreadSheetService(String app, String us, String pwd) {
        this.application = app;
        this.user = us;
        this.password = pwd;
    }

    private void initiate() throws AuthenticationException,
            MalformedURLException, IOException, ServiceException {
        spreadsheetService = new SpreadsheetService(application);
        spreadsheetService.setProtocolVersion(SpreadsheetService.Versions.V3);
        spreadsheetService.setUserCredentials(user, password);
        URL feedURL = new URL(SHEET_URL);
        spreadsheetFeed = spreadsheetService.getFeed(feedURL,
                SpreadsheetFeed.class);
    }

    public List<String> getAllWorkSheetsNames() {
        List<String> names = new ArrayList<String>();
        try {
            if (spreadsheetService == null || spreadsheetFeed == null)
                initiate();
            List<SpreadsheetEntry> spreadsheets = spreadsheetFeed.getEntries();
            for (SpreadsheetEntry spreadsheet : spreadsheets) {
                names.add(spreadsheet.getTitle().getPlainText());
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return names;
    }

    public boolean deleteWorkSheetInSpreadSheet(String spreadSheetName,
            String workSheetName) {
        try {
            if (spreadsheetService == null || spreadsheetFeed == null)
                initiate();
            WorksheetEntry worksheet = getProperWorkSheet(spreadSheetName,
                    workSheetName);
            if (worksheet != null) {
                worksheet.delete();
            }
            return true;
        } catch (Exception e) {
            e.printStackTrace();
        }
        return false;
    }

    public boolean createWorkSheetWithDataInSpreadSheet(String spreadSheetName,
            String workSheetName, String[] headers, String[][] rows) {
        int rowCount = 5;
        int columnCount = 4;
        try {
            if (spreadsheetService == null || spreadsheetFeed == null)
                initiate();
            System.out.println("Object initialized");
            SpreadsheetEntry spreadSheet = getProperSpreadsheet(spreadSheetName);
            rowCount = rows.length;
            columnCount = headers.length;
            WorksheetEntry worksheet = new WorksheetEntry();
            worksheet.setTitle(new PlainTextConstruct(workSheetName));
            worksheet.setColCount(columnCount);
            worksheet.setRowCount(rowCount);
            WorksheetEntry createdWorkSheet = getProperWorkSheet(
                    spreadSheetName, workSheetName);
            if (createdWorkSheet == null) {
                URL worksheetFeedUrl = spreadSheet.getWorksheetFeedUrl();
                createdWorkSheet = spreadsheetService.insert(worksheetFeedUrl,
                        worksheet);
                System.out.println("Work Sheet created");
            }
            if (createdWorkSheet != null) {
                WorksheetEntry searchedWorksheet = getProperWorkSheet(
                        spreadSheetName, workSheetName);

                URL cellFeedUrl = searchedWorksheet.getCellFeedUrl();
                CellFeed cellFeed = spreadsheetService.getFeed(cellFeedUrl,
                        CellFeed.class);

                List<CellAddress> cellAddrs = new ArrayList<CellAddress>();
                for (int col = 0; col < headers.length; ++col) {
                    cellAddrs.add(new CellAddress(1, (1 + col), headers[col]));
                }

                Map<String, CellEntry> cellEntries = getCellEntryMap(
                        spreadsheetService, cellFeedUrl, cellAddrs);
                System.out.println("Map constructed");

                CellFeed batchRequest = new CellFeed();
                for (CellAddress cellAddr : cellAddrs) {
                    CellEntry batchEntry = new CellEntry(
                            cellEntries.get(cellAddr.idString));
                    batchEntry.changeInputValueLocal(cellAddr.value);
                    batchEntry.setImmutable(true);
                    BatchUtils.setBatchId(batchEntry, cellAddr.idString);
                    BatchUtils.setBatchOperationType(batchEntry,
                            BatchOperationType.UPDATE);
                    batchRequest.getEntries().add(batchEntry);
                }

                // Submit the update
                Link batchLink = cellFeed.getLink(Link.Rel.FEED_BATCH,
                        Link.Type.ATOM);
                CellFeed batchResponse = spreadsheetService.batch(new URL(
                        batchLink.getHref()), batchRequest);
                System.out.println("batch Submitted");
                // Check the results
                boolean isSuccess = true;
                for (CellEntry entry : batchResponse.getEntries()) {
                    String batchId = BatchUtils.getBatchId(entry);
                    if (!BatchUtils.isSuccess(entry)) {
                        isSuccess = false;
                        BatchStatus status = BatchUtils.getBatchStatus(entry);
                        System.out.printf("%s failed (%s) %s", batchId,
                                status.getReason(), status.getContent());
                    }
                }

                System.out.println("Header Cell Insertion Completed");
                URL listFeedUrl = searchedWorksheet.getListFeedUrl();
                ListFeed listFeed = spreadsheetService.getFeed(listFeedUrl,
                        ListFeed.class);
                for (int i = 0; i < rows.length; i++) {
                    ListEntry row = new ListEntry();
                    for (int j = 0; j < rows[i].length; j++) {
                        row.getCustomElements().setValueLocal(headers[j],
                                rows[i][j]);
                    }
                    row = spreadsheetService.insert(listFeedUrl, row);
                    System.out.println("Row Inserted");
                }
            }
            return true;
        } catch (Exception e) {
            e.printStackTrace();
        }
        return false;
    }

    public static Map<String, CellEntry> getCellEntryMap(
            SpreadsheetService ssSvc, URL cellFeedUrl,
            List<CellAddress> cellAddrs) throws IOException, ServiceException {
        CellFeed batchRequest = new CellFeed();
        for (CellAddress cellId : cellAddrs) {
            CellEntry batchEntry = new CellEntry(cellId.row, cellId.col,
                    cellId.idString);
            batchEntry.setId(String.format("%s/%s", cellFeedUrl.toString(),
                    cellId.idString));
            BatchUtils.setBatchId(batchEntry, cellId.idString);
            BatchUtils.setBatchOperationType(batchEntry,
                    BatchOperationType.QUERY);
            batchRequest.getEntries().add(batchEntry);
        }

        CellFeed cellFeed = ssSvc.getFeed(cellFeedUrl, CellFeed.class);
        CellFeed queryBatchResponse = ssSvc.batch(
                new URL(cellFeed.getLink(Link.Rel.FEED_BATCH, Link.Type.ATOM)
                        .getHref()), batchRequest);

        Map<String, CellEntry> cellEntryMap = new HashMap<String, CellEntry>(
                cellAddrs.size());
        for (CellEntry entry : queryBatchResponse.getEntries()) {
            cellEntryMap.put(BatchUtils.getBatchId(entry), entry);
            // System.out.printf( "batch %s {CellEntry: id=%s editLink=%s inputValue=%s\n",
            // BatchUtils.getBatchId(entry), entry.getId(), entry.getEditLink().getHref(), entry.getCell()
            // .getInputValue());
        }

        return cellEntryMap;
    }

    private WorksheetEntry getProperWorkSheet(String spreadSheetName,
            String workSheetName) {
        try {
            if (spreadsheetService == null || spreadsheetFeed == null)
                initiate();
            SpreadsheetEntry spreadSheet = getProperSpreadsheet(spreadSheetName);
            WorksheetFeed worksheetFeed = spreadsheetService.getFeed(
                    spreadSheet.getWorksheetFeedUrl(), WorksheetFeed.class);
            List<WorksheetEntry> worksheets = worksheetFeed.getEntries();
            for (WorksheetEntry workSheet : worksheets) {
                if (workSheetName.trim().equalsIgnoreCase(
                        workSheet.getTitle().getPlainText())) {
                    return workSheet;
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return null;
    }

    private static class CellAddress {
        public final int row;
        public final int col;
        public final String idString;
        public final String value;

        public CellAddress(int row, int col, String v) {
            this.row = row;
            this.col = col;
            this.idString = String.format("R%sC%s", row, col);
            this.value = v;
        }
    }

    private SpreadsheetEntry getProperSpreadsheet(String spreadSheetName) {
        try {
            if (spreadsheetService == null || spreadsheetFeed == null)
                initiate();
            List<SpreadsheetEntry> spreadsheets = spreadsheetFeed.getEntries();
            for (SpreadsheetEntry spreadSheet : spreadsheets) {
                if (spreadSheetName.trim().equalsIgnoreCase(
                        spreadSheet.getTitle().getPlainText())) {
                    return spreadSheet;
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return null;
    }
}


Google Chart [Using GoogleSpreadSheet]