Blogger Posts Searcher using Google Data .NET/Java Client APIs

It just happened today that I wanted to know whether I had already published a post with a given title on one of the blogs I maintain: http://jes4us.blogspot.com. During translation (I translate the posts from English to Portuguese) I had a feeling that I had already worked on a similar text… well, it turns out I was mistaken!

Instead of going through the extensive list of posts one by one, I thought: why not leverage the power of the Google Data API? You may say: why not do a simple Google search instead? Good point. But as I like to play with code, I couldn’t resist.

So here it is: a simpler and faster way of knowing if I have a post with a given title. Below you’ll find the codez for both the .NET client API and the Java one.

Blogger Data API for .NET
1 - Download the client library here: http://code.google.com/p/google-gdata/downloads/list

2 - Install the .msi package Google_Data_API_Setup_1.9.0.0.msi.

3 - Create a new Console project and reference the DLL Google.GData.Client that’s in this folder: C:\Google Data API SDK\Redist

using System;
using System.Linq;
using Google.GData.Client;

namespace BlogPostsSearcher
{
    class Program
    {
        static void Main(string[] args)
        {
            Service bloggerService = AcquireService();

            AtomFeed feed = AcquireAndSetupFeed(bloggerService);

            // Search for posts that contain "StringToSearchFor" in their titles
            var query = feed.Entries.Where(p => p.Title.Text.Contains("StringToSearchFor"));

            // Writes the Blog's Title
            Console.WriteLine(feed.Title.Text);

            // Prints each post found...
            foreach (AtomEntry entry in query)
            {
                Console.WriteLine(string.Format("Post Title: {0} - Date Published: {1}", entry.Title.Text, entry.Published.ToShortDateString()));
            }

        }

        private static AtomFeed AcquireAndSetupFeed(Service service)
        {
            // Replace YourBlogID with your blog's numeric ID
            string yourBlogId = "YourBlogID";

            FeedQuery blogFeedUri = new FeedQuery("http://www.blogger.com/feeds/" + yourBlogId + "/posts/default");

            // Setting the number of posts to retrieve
            blogFeedUri.NumberToRetrieve = 1000;

            AtomFeed feed = service.Query(blogFeedUri);
            
            return feed;
        }

        private static Service AcquireService()
        {
            Service service = new Service("blogger", "YourCompanyName-BloggerPostsSearcher");

            service.Credentials = new GDataCredentials("YourEmailAddress@gmail.com", "YourPassword");

            GDataGAuthRequestFactory factory = (GDataGAuthRequestFactory)service.RequestFactory;
            
            return service;
        }
    }
}

Blogger Data API for Java
1 - Download the client library here: http://code.google.com/p/gdata-java-client/downloads/list

2 - Unzip the source package gdata-src.java-1.46.0.zip: http://code.google.com/p/gdata-java-client/downloads/detail?name=gdata-src.java-1.46.0.zip

3 - Create a new Java Project and add references to:
- gdata-client-1.0.jar, located in gdata/java/lib/
- google-collect-1.0-rc1.jar, located in gdata/java/deps/

import java.io.IOException;
import java.net.URL;
import java.util.List;

import com.google.gdata.client.GoogleService;
import com.google.gdata.client.Query;
import com.google.gdata.data.Entry;
import com.google.gdata.data.Feed;
import com.google.gdata.util.AuthenticationException;
import com.google.gdata.util.ServiceException;

/**
 * @author Leniel Macaferi
 * @date 11-21-2011
 */
public class BloggerClient
{
    public static void main(String[] args) throws IOException, ServiceException
    {
        try
        {
            GoogleService bloggerService = new GoogleService("blogger", "YourCompanyName-BloggerPostsSearcher");

            bloggerService.setUserCredentials("YourEmailAddress@gmail.com", "YourPassword");

            searchPosts(bloggerService, "YourBlogID", "StringToSearchFor");
        }
        catch (AuthenticationException e)
        {
            e.printStackTrace();
        }
    }

    public static void searchPosts(GoogleService myService, String blogId, String search) throws ServiceException, IOException
    {
        // Request the feed
        URL feedUrl = new URL("http://www.blogger.com/feeds/" + blogId + "/posts/default");

        // Setting the number of posts to retrieve before issuing the request
        Query query = new Query(feedUrl);
        query.setMaxResults(1000);

        Feed resultFeed = myService.getFeed(query, Feed.class);

        List<Entry> posts = resultFeed.getEntries();

        // Print the blog's title followed by each post whose title contains the search string
        System.out.println(resultFeed.getTitle().getPlainText());

        for (Entry post : posts)
        {
            if (post.getTitle().getPlainText().contains(search))
            {
                System.out.println("\t" + post.getTitle().getPlainText());
            }
        }

        System.out.println();
    }
}

In the code above you need to replace the following placeholders accordingly:

- YourEmailAddress
- YourPassword
- YourBlogID

References
Blogger Client Libraries and Sample Code

Blogger Developer's Guide: .NET

Blogger Developer's Guide: Java

RavenDB Embedded with Management Studio UI

Go directly to solution with no bla bla bla…

I’ve been playing with RavenDB (a NoSQL document-oriented database) in an ASP.NET MVC 4 project for the past week. One thing I tried to do was to access the RavenDB Management Studio UI so that I could see what’s actually present within the document store. This is important because one needs to check whether docs are really being inserted, related docs are being deleted, etc…

Given that I’m running the embedded version of RavenDB (the RavenDB-Embedded.1.0.499 package installed via NuGet in Visual Studio 2010), I was stuck trying to access the management studio, since there isn’t much documentation on this subject when it comes to the EmbeddableDocumentStore. After struggling with it for about an hour of Googling and trial and error, I decided to post a question at StackOverflow: Running RavenDB as an EmbeddableDocumentStore and accessing RavenDB Management Studio. Then I took a break to have lunch and a nap. After that I got back here to try a different approach and it really does work. Of course this is only one way to achieve what I want. It may not be the best approach but it’s enough. Just follow these steps:

1 - Grab RavenDB latest build here:
http://builds.hibernatingrhinos.com/downloadlatest/ravendb

2 - Extract the files to C:\RavenDB-Build-499

3 - Edit the config file C:\RavenDB-Build-499\Server\Raven.Server.exe.config so that it points to your embedded database:

<appSettings>
  <add key="Raven/Port" value="8088"/>
  <add key="Raven/DataDir" value="C:\MyProject\trunk\MyProject\App_Data\Database"/>
  <add key="Raven/AnonymousAccess" value="Get"/>
</appSettings>

4 - Run Start.cmd, located in the root folder: C:\RavenDB-Build-499\Start.cmd

The server status output window should appear while it starts:

Figure 1 - RavenDB server status window

When the server finishes its starting process, the Silverlight Management UI should be automatically opened in your preferred browser.

Figure 2 - RavenDB Management Studio UI (Web UI)

Now I can see my docs, indexes, etc… and I hope you can too! :D

Note to self
According to John Allers, one should be able to access the Management Studio without having to start the server manually. That’s fine, and I had already tried that, but I could not get it working at first (some days ago). That led me to try everything else today, and my last resort was posting a question at StackOverflow. After trying the same procedure once more, that is, accessing the management studio at http://localhost:8080, I finally got it working! Go figure. One possibility is that I had another service running on port 8080 when I first attempted to access the UI. Since Windows has restarted since then, the service (probably Hudson) that was occupying port 8080 is stopped and now everything just works as expected.

Things to do:

1 - Instantiate your EmbeddableDocumentStore this way (see the fuller initialization sketch after these steps):

_documentStore = new EmbeddableDocumentStore
{
    ConnectionStringName = "YourDbName",
    UseEmbeddedHttpServer = true
};

2 - Copy Raven.Studio.xap from the C:\RavenDB-Build-499\Server\ folder to the root folder of your web project

3 - Run your web app

4 - Access http://localhost:8080 and voila… everything SHOULD work out of the box.

5 - Select Default Database:

Figure 3 - RavenDB Management Studio accessed without running the server manually
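
Putting steps 1 to 4 together, this is roughly how the embedded store could be wired up in the web app. Note that the class and member names below (DocumentStoreHolder, YourDbName, the DataDir value) are placeholders of my own for illustration, not code taken from my actual project:

using Raven.Client.Embedded;

public static class DocumentStoreHolder
{
    public static EmbeddableDocumentStore Store { get; private set; }

    // Call this once from Global.asax's Application_Start
    public static void Initialize()
    {
        Store = new EmbeddableDocumentStore
        {
            // Must match a connection string in Web.config, e.g.
            // <add name="YourDbName" connectionString="DataDir=~\App_Data\Database"/>
            ConnectionStringName = "YourDbName",

            // Exposes the embedded database over HTTP so the
            // Silverlight studio (Raven.Studio.xap) can reach it
            UseEmbeddedHttpServer = true
        };

        Store.Initialize();
    }
}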

Resources
Embedding RavenDB into an ASP.NET MVC 3 Application

Tree Graph Ordered Traversal Level by Level in C#

Recently as part of a job interview process, I was asked to solve some programming problems. This post shows the solution for one of such problems.

Problem
The problem ( or could we call it an algorithm exercise? ) is this:

Consider a tree of integers. Knowing that its root node is 0, and given its adjacency list as a two dimensional array of integers, write a function that prints out the elements/nodes in order/level by level starting from the root. That is, the root is printed in the first line, elements that can be reached from the root by a path of distance 1 in the second line, elements reached by a path of distance 2 in the third line, and so forth. For example, given the following adjacency list (draw the tree for a better view):

0 => 1, 2, 3
1 => 0, 4
2 => 0
3 => 0, 5
4 => 1, 6
5 => 3
6 => 4

The program should print:

0
1 2 3
4 5
6

Little bit of theory
If you read about trees in graph theory, you’ll see that we can represent a tree using a graph, because a tree is an undirected graph in which any two vertices are connected by exactly one simple path. In other words, any connected graph without cycles is a tree.

The tree in this problem isn’t a binary tree, it’s an n-ary tree.

Solution
With theory in mind, here goes my proposed solution…

I’m reusing some code from past posts, in particular the Graph, AdjacencyList, Node, NodeList and EdgeToNeighbor classes.
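
Those classes come from earlier posts and aren’t reproduced in full here. For readers who don’t want to dig through the archives, here’s a minimal sketch of the shape the traversal code below relies on; it’s a simplified assumption, not the original implementation (which also supports weighted edges, an AdjacencyList, a NodeList and so on):

using System.Collections.Generic;

public class Node
{
    public string Key { get; private set; }
    public object Data { get; private set; }

    // Outgoing edges to this node's children
    public List<EdgeToNeighbor> Neighbors { get; private set; }

    public Node(string key, object data)
    {
        Key = key;
        Data = data;
        Neighbors = new List<EdgeToNeighbor>();
    }
}

public class EdgeToNeighbor
{
    public Node Neighbor { get; private set; }

    public EdgeToNeighbor(Node neighbor)
    {
        Neighbor = neighbor;
    }
}

public class Graph
{
    // Nodes indexed by key, e.g. graph.Nodes["0"]
    public Dictionary<string, Node> Nodes { get; private set; }

    public Graph()
    {
        Nodes = new Dictionary<string, Node>();
    }

    public void AddNode(string key, object data)
    {
        Nodes[key] = new Node(key, data);
    }

    public void AddDirectedEdge(string fromKey, string toKey)
    {
        Nodes[fromKey].Neighbors.Add(new EdgeToNeighbor(Nodes[toKey]));
    }
}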

I use this method to fill a Graph with the Tree structure:

/// <summary>
/// Fills a graph with a given tree structure.
/// </summary>
/// <param name="graph"></param>
private static void FillGraphWithTreeStructure(Graph graph)
{
    // Vertexes
    graph.AddNode("0", null);
    graph.AddNode("1", null);
    graph.AddNode("2", null);
    graph.AddNode("3", null);
    graph.AddNode("4", null);
    graph.AddNode("5", null);
    graph.AddNode("6", null);

    // Edges
    graph.AddDirectedEdge("0", "1");
    graph.AddDirectedEdge("0", "2");
    graph.AddDirectedEdge("0", "3");

    graph.AddDirectedEdge("1", "4");

    graph.AddDirectedEdge("4", "6");

    graph.AddDirectedEdge("3", "5");

    /* This is the tree:
               
            0
          / | \
         1  2  3
        /       \
       4         5
      /
     6
             
        This is the expected output:
             
        Level 1 = 0
        Level 2 = 1 2 3
        Level 3 = 4 5
        Level 4 = 6

    */
}

This is the method that does the hard work:

/// <summary>
/// Performs an ordered level-by-level traversal in a n-ary tree from top-to-bottom and left-to-right.
/// Each tree level is written in a new line.
/// </summary> 
/// <param name="root">Tree's root node</param>
public static void LevelByLevelTraversal(Node root)
{
    // At any given time each queue will only have nodes that
    // belong to a level
    Queue<Node> queue1 = new Queue<Node>();
    Queue<Node> queue2 = new Queue<Node>();

    queue1.Enqueue(root);

    while (queue1.Count != 0 || queue2.Count != 0)
    {
        while (queue1.Count != 0)
        {
            Node u = queue1.Dequeue();

            Console.Write(u.Key + " ");

            // Expanding u's neighbors in the queue
            foreach (EdgeToNeighbor edge in u.Neighbors)
            {
                queue2.Enqueue(edge.Neighbor);
            }
        }

        Console.WriteLine();

        while (queue2.Count != 0)
        {
            Node v = queue2.Dequeue();

            Console.Write(v.Key + " ");

            // Expanding v's neighbors in the queue
            foreach (EdgeToNeighbor edge in v.Neighbors)
            {
                queue1.Enqueue(edge.Neighbor);
            }
        }

        Console.WriteLine();
    }
}

To spice things up I have implemented a Parallel version of the above method using a ConcurrentQueue:

/// <summary>
/// Performs an ordered level-by-level traversal in a n-ary tree from top-to-bottom and left-to-right in Parallel using a ConcurrentQueue.
/// Each tree level is written in a new line.
/// </summary> 
/// <param name="root">Tree's root node</param>
public static void LevelByLevelTraversalInParallel(Node root)
{
    // At any given time each queue will only have nodes that
    // belong to a level
    ConcurrentQueue<Node> queue1 = new ConcurrentQueue<Node>();
    ConcurrentQueue<Node> queue2 = new ConcurrentQueue<Node>();

    queue1.Enqueue(root);

    while (queue1.Count != 0 || queue2.Count != 0)
    {
        while (queue1.Count != 0)
        {
            Node u;
                    
            queue1.TryDequeue(out u);

            Console.Write(u.Key + " ");

            // Expanding u's neighbors in the queue
            foreach (EdgeToNeighbor edge in u.Neighbors)
            {
                queue2.Enqueue(edge.Neighbor);
            }
        }

        Console.WriteLine();

        while (queue2.Count != 0)
        {
            Node v;
                    
            queue2.TryDequeue(out v);

            Console.Write(v.Key + " ");

            // Expanding v's neighbors in the queue
            foreach (EdgeToNeighbor edge in v.Neighbors)
            {
                queue1.Enqueue(edge.Neighbor);
            }
        }

        Console.WriteLine();
    }
}

Now it’s time to measure the execution time using a Stopwatch:

private static void Main(string[] args)
{
    Graph graph = new Graph();

    FillGraphWithTreeStructure(graph);

    Stopwatch stopWatch = new Stopwatch();

    stopWatch.Start();

    LevelByLevelTraversal(graph.Nodes["0"]);

    stopWatch.Stop();

    // Write time elapsed
    Console.WriteLine("Time elapsed: {0}", stopWatch.Elapsed);

    //Resetting the watch...
    stopWatch.Reset();

    stopWatch.Start();

    LevelByLevelTraversalInParallel(graph.Nodes["0"]);

    stopWatch.Stop();

    // Write time elapsed
    Console.WriteLine("Time elapsed: {0}", stopWatch.Elapsed);

    Console.ReadKey();
}

Now the results:

Sequential
0
1 2 3
4 5
6
Time elapsed: 00:00:00.0040340

Parallel
0
1 2 3
4 5
6
Time elapsed: 00:00:00.0020186

As you see, time is cut by a factor of 2. I currently have a Core 2 Duo processor in my Mac mini.

Hope you enjoyed it and feel free to add your 2 cents to improve this code! Of course there are other ways of solving this very problem and I would like to see them. Do you have a better idea?

Download
You can get the Microsoft Visual Studio Console Application Project at:

https://sites.google.com/site/leniel/blog/TreeLevelTraversal.rar

To try out the code you can use the free Microsoft Visual C# 2010 Express Edition that you can get at: http://www.microsoft.com/visualstudio/en-us/products/2010-editions/visual-csharp-express

SVN, Hudson & MSBuild - Building code on post commit

SVN, Hudson and MSBuild - Revision control repository
SVN, Hudson and MSBuild - Continuous Integration

This is the third and last installment in the series I’m writing about SVN, Hudson and MSBuild.

Today I’m going to show you the last piece that actually makes the whole thing work. We could call this the plumbing. The piece lies within a specific SVN folder related to your project. It’s called hooks. The path to the hooks folder is this:

Figure 1 - Project’s hooks folder before the setup

As you can see there are some template files ( .tmpl ). The one we’re going to use to inform Hudson that it’s time to build the code just committed to the repository is the file post-commit.tmpl. Make a copy of this file and change its extension to .bat since it’ll be used by SVN to execute some commands. The file should be named post-commit.bat.

Open the .bat file and add this code at the end:

SET REPOS=%1
SET REV=%2
SET CSCRIPT=C:\WINDOWS\system32\cscript.exe
SET VBSCRIPT=C:\svn\post-commit-hook-hudson.vbs
SET SVNLOOK=C:\Program Files\VisualSVN Server\bin\svnlook.exe
SET HUDSON=http://leniel-pc:8080/
"%CSCRIPT%" "%VBSCRIPT%" "%REPOS%" %REV% "%SVNLOOK%" %HUDSON%

Note above that we’re setting some vars and pointing to some specific files:

- CSCRIPT points to the cscript.exe file that should be present in your Windows system32 folder.

- VBSCRIPT points to the post-commit-hook-hudson.vbs file, whose code is as follows:

repos   = WScript.Arguments.Item(0)
rev     = WScript.Arguments.Item(1)
svnlook = WScript.Arguments.Item(2)
hudson  = WScript.Arguments.Item(3)

Set shell = WScript.CreateObject("WScript.Shell")

Set uuidExec = shell.Exec(svnlook & " uuid " & repos)
Do Until uuidExec.StdOut.AtEndOfStream
  uuid = uuidExec.StdOut.ReadLine()
Loop
Wscript.Echo "uuid=" & uuid

Set changedExec = shell.Exec(svnlook & " changed --revision " & rev & " " & repos)
Do Until changedExec.StdOut.AtEndOfStream
  changed = changed + changedExec.StdOut.ReadLine() + Chr(10)
Loop
Wscript.Echo "changed=" & changed

url = hudson + "subversion/" + uuid + "/notifyCommit?rev=" + rev
Wscript.Echo url

Set http = CreateObject("Microsoft.XMLHTTP")
http.open "POST", url, False
http.setRequestHeader "Content-Type", "text/plain;charset=UTF-8"
http.send changed

- SVNLOOK points to the svnlook.exe file that comes with VisualSVN Server (see part 1 of this series for more details).

- HUDSON points to your Hudson server address. Change it accordingly.

With it all configured we should be ready to get an automatic build when code is committed to the repository.

To test your environment, change any file already versioned and commit it. Open Hudson in your browser and watch a new build start automatically.

If you look in Hudson’s build Console Output you’ll see that the build was initiated by an SCM change.

That’s all!

This is how your SVN project hooks folder should look now:

Figure 2 - Project’s hooks folder after the setup

Can you spot another .bat file in the folder? It’s the pre-revprop-change.bat. I’ve been using it so that I can modify the commit’s log message/comment when I forget to mention something or to correct spelling. More info about this file can be seen in this StackOverflow question: What is a pre-revprop-change hook in SVN and how do I create it?
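
For reference, a typical pre-revprop-change.bat simply allows changes to the svn:log property and rejects everything else. This is the generic pattern (SVN passes REPOS, REV, USER, PROPNAME and ACTION as arguments), not necessarily the exact file I use:

@ECHO OFF
:: %4 is the name of the property being changed; only allow commit messages (svn:log)
IF "%4" == "svn:log" EXIT 0

ECHO Changing the "%4" property is not allowed 1>&2
EXIT 1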

Tracking Last.fm profile visits with Google Analytics

I'm a big fan of analytics data; therefore I try to collect stats about anything – like visitors that check my web pages/profiles. It’s no different with my Last.fm profile.

Last.fm profile pages use BBCode, a lightweight markup language used to format posts on many message boards, and unfortunately it has no JavaScript support.
Google Analytics (GA), on the other hand, relies on JavaScript for the tracking code one must include in a website. This makes for a frustrating experience for anyone trying to integrate the GA tracking code in places where JavaScript isn’t accepted.

While searching for a solution some time ago I didn’t find a way around this impedance, but yesterday, after putting a little more care into my Google search query, I finally found a way of filling the gap so that both services can talk to each other: NoJSStats, short for No JavaScript Stats, is the missing piece. It’s a free web service. Basically, NoJSStats uses Google App Engine to track visitors using just an image, or anything else that allows external resource requests, in the website you want to track.

Here’s what you need to do in order to get analytics data popping up in your Google Analytics account with respect to your Last.fm profile:

First, let’s configure GA side…

1 - Go to Google Analytics and click the Gear icon at the top right of any page and create a new Web Property:

Figure 1 - Creating a new Last.fm Google Analytics Web Property

2 - Name the Web Property and in the Web Site URL field enter your Last.fm profile URL and click Create property:

Figure 2 - Filling the Last.fm Web Property with the correct data

3 - Take note of the Web Property ID assigned to your new Web Property:

Figure 3 - Checking the Web Property ID that’ll be used when configuring the Last.fm profile

Now let’s configure Last.fm side:

4 - Edit your Last.fm profile by clicking the Edit link located on the right side of your profile page:

Figure 4 - Edit button used to get access to Last.fm profile data

5 - Head to the “About You” field and add this line at the end:

[img]http://nojsstats.appspot.com/UA-1234567-89/www.last.fm/user/leniel[/img]

Do not forget to click the Save details button.

Two things related to that img URL deserve a note here:

- You’re using the Last.fm GA Web Property ID (for example UA-1234567-89) taken from step 3 above.
- The last part of the URL points to your Last.fm profile.

DO CHANGE these two parts accordingly.

Interestingly enough, if you copy & paste the URL from step 5 into your browser, you get this result:

http://www.google-analytics.com/__utm.gif?utmwv=1&utmn=37172618&utmsr=-&utmsc=-&utmul=-&utmje=0&utmfl=-&utmdt=-&utmhn=www.last.fm/user/leniel&utmr=&utmp=&utmac=UA-1234567-89&utmcc=__utma%3D167996071.415430387.1319322154.1319322154.1319322154.1%3B%2B__utmb%3D167996071%3B%2B__utmc%3D167996071%3B%2B__utmz%3D167996071.1319322154.2.2.utmccn%3D%28direct%29%7Cutmcsr%3D%28direct%29%7Cutmcmd%3D%28none%29%3B%2B__utmv%3D167996071.189.24.47.122%3B

Awesome! NoJSStats does its magic and all the GA data, every single item, is assembled and packed into the request URL's query string (everything after the '?'). For more technical details, see this StackOverflow question: Why does Google Analytics use __utm.gif?

This is everything you need to do. Easy, no?

If you have problems
I have tested this workflow and it’s indeed working as expected. In case you can’t get it working:

If you check the Tracking Code status in GA it says: "Tracking Not Installed".

Figure 5 - Last.fm Tracking Code status

It doesn't matter... really. That's because we aren't using the standard JavaScript tracking code provided by GA, which is what GA waits to detect. Since we're not using that approach, it's OK if it reports "Tracking Not Installed".

Analytics data appears after some delay; it's not real-time at the moment.

If you still do not see any data after a day or two, please leave a comment in this post and I’ll try to help.

Final notes
You should also see data on GA about your own visits to your profile! You can block this from being counted. Check this: How do I exclude my internal traffic from reports?

One of the advantages of using this approach instead of others, like the Flag Counter widgets I usually see all over Last.fm profiles, is that you keep your data private and get way more stats about your visitors.

Last.fm is missing visitor Analytics features in its current incarnation. I sure hope it implements something interesting in this area.

For now, I hope you get a better insight into how your profile is performing on the internetz.

Xcode iPhone beginner projects with GitHub integration

I decided to follow a different path to learn software development for the iPhone - instead of online tutorials and Apple docs I got a book. I had postponed my desire to learn, but it’s time to revive it. I grabbed a beginner’s book on the subject: A Beginner's Guide to iOS SDK Programming by James A. Brannan & Blake Ward. This book covers iOS 4.2 + Xcode 4, and iOS 5 is on the verge of being released…

I had to download the latest Xcode and its accompanying SDK bits again (3.17 GB), as the ones I had installed were out of date (from September 2010). It was just a matter of hitting the Mac App Store and looking for Xcode. The download has everything you need to install to be able to follow the book samples.

As I started creating the sample projects in Xcode, I thought it’d be an excellent opportunity to store these samples in an online repository (repo) under source control, both for further reference and to allow other beginner developers to download and study all the samples. It’s also a good chance to play with Git, since I’ve been using Subversion over the last few years.

This post covers the basics to integrate Xcode with GitHub for the Mac OS user. GitHub is a web-based hosting service for software development projects that use the Git revision control system.

I try to synthesize the lengthy and scattered docs you’ll find on the subject and provide links to key docs and posts…

I learned how to use Xcode Organizer to integrate my online GitHub repository with Xcode’s built-in support for source control management, and started sending the projects to GitHub right after the second book sample. It’s better to start early or you’ll never do it!

This article in the Mac OS X Developer Library, Managing Versions of Your Project, has everything you need to configure your project to use Git.

When you create an online/remote repository on GitHub you get instructions on how to set it up, such as getting a copy of the repo to work locally on your computer or sending an existing project to the remote repo. This help article from GitHub clarifies some things: Set up Git in Mac OS.

This post has the steps you have to follow to get a GitHub repo to work with Xcode projects: Version Control System with XCode 4 and Git Tutorial.

You’ll have to use Mac OS Terminal to type some commands. Nothing difficult.
As a matter of fact you should familiarize yourself with Terminal if you haven’t yet. The real fun is when you play with git commands in Terminal (take for example the powerful rebase command). Later in this post I’m going to use the support offered by Xcode, which has a UI for basic git commands such as commit, push, pull, merge, etc. - but Xcode doesn’t give you the full power of the git command set, which is only available through the command line.

These are the Terminal commands I typed to send the book’s initial sample projects (QuickStart and C Main Project) to my remote repository located at GitHub: https://github.com/leniel/iPhone-Beginner-Guide
Take a special look at the git remote, push and pull commands:

Last login: Fri Aug 19 19:29:32 on ttys001
Leniel-Macaferis-Mac-mini:~ leniel$ cd /
Leniel-Macaferis-Mac-mini:/ leniel$ cd iPhone
Leniel-Macaferis-Mac-mini:iPhone leniel$ cd Local
Leniel-Macaferis-Mac-mini:Local leniel$ ls
C Main Project                iPhone Beginner's Guide.xcworkspace
QuickStart
Leniel-Macaferis-Mac-mini:Local leniel$ cd QuickStart
Leniel-Macaferis-Mac-mini:QuickStart leniel$ ls
QuickStart        QuickStart.xcodeproj
Leniel-Macaferis-Mac-mini:QuickStart leniel$ git remote add origin git@github.com:leniel/iPhone-Beginner-Guide.git
Leniel-Macaferis-Mac-mini:QuickStart leniel$ git push -u origin master
Counting objects: 16, done.
Delta compression using up to 2 threads.
Compressing objects: 100% (14/14), done.
Writing objects: 100% (16/16), 8.27 KiB, done.
Total 16 (delta 2), reused 0 (delta 0)
To git@github.com:leniel/iPhone-Beginner-Guide.git
* [new branch]      master -> master
Branch master set up to track remote branch master from origin.

Leniel-Macaferis-Mac-mini:Local leniel$ cd "C Main Project"
Leniel-Macaferis-Mac-mini:C Main Project leniel$ ls
C Main Project            C Main Project.xcodeproj
Leniel-Macaferis-Mac-mini:C Main Project leniel$ git push -u origin master
To git@github.com:leniel/iPhone-Beginner-Guide.git
! [rejected]        master -> master (non-fast-forward)
error: failed to push some refs to 'git@github.com:leniel/iPhone-Beginner-Guide.git'
To prevent you from losing history, non-fast-forward updates were rejected.
Merge the remote changes (e.g. 'git pull') before pushing again.
  See the 'Note about fast-forwards' section of 'git push --help' for details.
Leniel-Macaferis-Mac-mini:C Main Project leniel$ git pull origin master
warning: no common commits
remote: Counting objects: 16, done.
remote: Compressing objects: 100% (12/12), done.
remote: Total 16 (delta 2), reused 16 (delta 2)
Unpacking objects: 100% (16/16), done.
From github.com:leniel/iPhone-Beginner-Guide
* branch            master     -> FETCH_HEAD
Merge made by recursive.
QuickStart.xcodeproj/project.pbxproj             |  288 ++++++++++++++
QuickStart/QuickStart-Info.plist                 |   38 ++
QuickStart/QuickStart-Prefix.pch                 |   14 +
QuickStart/QuickStartAppDelegate.h               |   19 +
QuickStart/QuickStartAppDelegate.m               |   73 ++++
QuickStart/QuickStartViewController.h            |   13 +
QuickStart/QuickStartViewController.m            |   44 +++
QuickStart/en.lproj/InfoPlist.strings            |    2 +
QuickStart/en.lproj/MainWindow.xib               |  444 ++++++++++++++++++++++
QuickStart/en.lproj/QuickStartViewController.xib |  156 ++++++++
QuickStart/main.m                                |   17 +
11 files changed, 1108 insertions(+), 0 deletions(-)
create mode 100644 QuickStart.xcodeproj/project.pbxproj
create mode 100644 QuickStart/QuickStart-Info.plist
create mode 100644 QuickStart/QuickStart-Prefix.pch
create mode 100644 QuickStart/QuickStartAppDelegate.h
create mode 100644 QuickStart/QuickStartAppDelegate.m
create mode 100644 QuickStart/QuickStartViewController.h
create mode 100644 QuickStart/QuickStartViewController.m
create mode 100644 QuickStart/en.lproj/InfoPlist.strings
create mode 100644 QuickStart/en.lproj/MainWindow.xib
create mode 100644 QuickStart/en.lproj/QuickStartViewController.xib
create mode 100644 QuickStart/main.m
Leniel-Macaferis-Mac-mini:C Main Project leniel$ git push -u origin master
Counting objects: 14, done.
Delta compression using up to 2 threads.
Compressing objects: 100% (12/12), done.
Writing objects: 100% (13/13), 3.95 KiB, done.
Total 13 (delta 3), reused 0 (delta 0)
To git@github.com:leniel/iPhone-Beginner-Guide.git
   05ec270..fe84a7a  master -> master
Branch master set up to track remote branch master from origin.
Leniel-Macaferis-Mac-mini:C Main Project leniel$

Great! With the above commands I have sent both projects to my repo located at GitHub.

It’s important to note that for each project I created in Xcode I selected the option to create a local git repository as shown in Figure 1:

Figure 1 - Xcode - Selecting Create local git repository for this project

With this in place I can now safely delete my local copy of both projects (folder Local I used above in Terminal commands) and work directly with the code of my remote repository. Let’s do it:

Open Xcode Organizer selecting the menu Window => Organizer:

Figure 2 - Organizer window accessible through Xcode’s Window menu

I suppose you have already configured and added (+ button in the bottom left of Figure 2) your GitHub repo (green circle in Figure 2) to the Organizer following the docs I linked above.

To get a local working copy of your remote repository you must click the Clone button (see the bottom part of Figure 2) and choose a location to place the repo files. After doing this you’ll get a new repo (mine is located in the folder /iPhone/iPhone-Beginner-Guide as you see in Figure 2). When I click my local copy of the repo I get this beautiful screen where I can see the commit comments and the changes I made to each file along the way:

Figure 3 - Local ( Clone ) copy of my online GitHub repository seen in Xcode Organizer

Now it’s just a matter of working on and modifying the project files, or adding new projects, and committing them to the repository through the menu File => Source Control => Commit…

One more important note: when you commit something, it’s only committed to your local copy. You need one additional step: pushing the changes to GitHub. In Xcode you can select the file(s) or project(s) you want and go to File => Source Control => Push… For more on this, read: Commit Files to Add Them to a Repository.
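
The same commit-then-push flow can be done straight from Terminal if you prefer; assuming you are inside the project folder and the origin remote is already set up (as done above), it looks roughly like this:

# Stage and commit the changes locally
git add .
git commit -m "Describe your change here"

# Send the committed changes to the GitHub repository
git push origin master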

In my case, when I select Push I get this Xcode dialog where I can select the Remote endpoint (GitHub repository) to which my committed files will go:

Figure 4 - Xcode Push user interface and GitHub remote location

As a bonus I created a Workspace, as seen in Figure 5, to have all the sample projects at hand in a single Xcode window. The workspace has references to the projects and works somewhat like a Microsoft Visual Studio solution file, if you’re used to Microsoft developer tools. The workspace helps a lot during the commit and push tasks!

Figure 5 - Xcode workspace with projects in the left side pane

Well, I’m new to this Xcode world and I think I’ll learn a lot from these simple beginner sample projects.

The next thing I'm gonna do is learn what file types I can ignore when committing… Thanks to StackOverflow there’s already a question addressing this very topic: Git ignore file for Xcode projects

Edit: following the advice of the above StackOverflow question, I added a .gitignore file to the repo.
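
For reference, a typical .gitignore for an Xcode 4 project looks something like the sketch below; the exact entries vary by setup, so treat it as a starting point rather than the precise file I committed:

# OS X Finder metadata
.DS_Store

# Xcode user-specific state
xcuserdata/
*.xcuserstate
*.pbxuser
*.mode1v3
*.perspectivev3

# Build output
build/
DerivedData/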

Hope this helps.

SQL UPDATE statement with SELECT and SQL Server Image data type column

In a SQL Server database I have a table called users which has a column named signature. This column is of type Image.

My production SQL Server is located in a shared hosting environment.

One problem I’ve been facing lately is that I lack the permission needed to insert a signature image for a given user, because doing so requires executing an UPDATE with a BULK statement like this one:

-- Update existing user
UPDATE users
SET [signature] = (SELECT MyImage.* FROM OPENROWSET(
    BULK 'C:\MyProject\trunk\MyCompany.Product\MyCompany.Product.Web\Images\Signature.jpg',
    SINGLE_BLOB) MyImage)
WHERE Id = '1111aaaa-1111-11aa-a111-111111a1a1a1'

The query above works fine on my local machine, but when I tried to execute it on the remote/production server, I got this beautiful message: You do not have permission to use the bulk load statement.

In the shared hosting environment, execution of the BULK command is disabled by default for security reasons. This is annoying but totally understandable!

Bulk load allows the user to populate a database from a file. It’s not available in a shared environment because it requires placing client files locally on the SQL Server machine (the production server).

So… great! I need a way to bypass this limitation, because I won’t spend tubes of money paying for a dedicated server… that would only make sense if I actually needed one.

1st try: import a specific user row from my local SQL Server to the remote instance using the SQL Server Management Studio Import task. I got a constraint key violation error because I already had the same row (for that user) in both databases. All I need is to update the signature column in the production database, so this seemed to be a painful path.

2nd try: consider a dedicated server? No thanks… hehehe

3rd try and solution: a few days later I found myself thinking about this problem again (this signature column updating thing is a recurring task), and so I decided to find another way, and it came to light: link the remote server to my local SQL Server Express instance and write a beautiful SQL query that does the job.

First I stopped at this excellent blog post with a step-by-step guide written by jen: Create Linked Server SQL Server 2008.
This post provided everything I needed to link both SQL Server instances.
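
In essence, linking the remote server boils down to two system stored procedures executed on the local instance. The server address and credentials below are placeholders (follow jen's post for the full walkthrough); LOCAWEB is simply the name I gave the linked server, which matches the name used in the UPDATE query further down:

-- Register the remote SQL Server under the name LOCAWEB
EXEC sp_addlinkedserver
    @server = N'LOCAWEB',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'your-remote-server-address';

-- Map local logins to the remote SQL Server credentials
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'LOCAWEB',
    @useself = N'False',
    @locallogin = NULL,
    @rmtuser = N'YourRemoteUser',
    @rmtpassword = N'YourRemotePassword';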

To make sure you have linked your server correctly, you can execute this query in your local server:

select server_id, name, product, provider, data_source, [catalog], is_linked
from sys.servers

The above query gave me this result:

Figure 1 - Linked servers linked to my local SQL Server Express instance

Then I Googled for "Update with Select SQL Server" or something like that and found this StackOverflow question: Updating a table with multiple values from a select statement where the date matches. Lieven’s answer helped; I just had to adapt it to my case. This is the SQL code that does the dirty work:

UPDATE  U
SET     U.[signature] = users.[signature]
FROM    [LOCAWEB].[laudotech].[dbo].[users] U INNER JOIN users
ON users.id = U.id AND U.id = '1234aaaa-5678-90aa-b123-456789a0a1a2'

The above query must be executed within the context of the local SQL Server instance, of course - where the linked server is defined.

To give you a picture… this is how all this is configured inside SQL Server Management Studio (SSMS):

Figure 2 - SSMS Object Explorer and the Linked Server LOCAWEB in my local SQL Server Express instance

There are so many things one can do with SSMS that I feel really happy to have learned one more of them. Last week I blogged about Import/Export SQL Server database data with SSMS. Take a look at it.

Man! Have I said that I like working with databases!?

Hope it helps.