Saturday, August 13, 2016

How to network at conferences

I came across a post by an entrepreneur on Office Chai about why he stopped attending startup conferences.
He says:
"And it sure was fun. In the beginning, the a-ha moment used to be spotting a celebrity. “I’m in the same room as Sachin Bansal!”....
And I wasn’t my usual self at these conferences. I’m an engineer, and a bit of an introvert, but a switch used to flick on in my head as soon as I had a conference badge around my neck. I’d be approaching people, exchanging business cards, and talking to anyone who’d listen about my startup.
Except that it never went anywhere."

and then...

"It took me two years (and lots of money in conference fees) to figure out what the problem was. Conferences are horribly artificial ways to meet people. Everyone is trying to meet as many people in as little time as possible, and the result is that interactions are superficial and hackneyed. Real relationship-building time and effort. By trying to meet everyone, you end up meeting no one."


What this entrepreneur describes is one of the classic misconceptions people have when they attend conferences. What I'm going to write below is not my own wisdom, but something I read from a source I don't recollect (I think it was Rajesh Shetty's blog). The words of advice have stayed in my mind, and I've been practicing them too.

Here's what you need to remember:

Everyone who attends conferences and networking events has just one thing in mind: "What's in it for me?". So when you attend with the same question in mind, how do you expect others to help you, when they are busy trying to figure out how others can help them?
When you go to such events, go with this thought in mind "How can I bring value to someone else?" or "How can I truly make myself or my business useful to someone else?".

When you have this mindset, you'll automatically eliminate the conversations that only help you. You'll start thinking from the other person's perspective and truly evaluating whether you will actually be able to help them. When you see that you can help, and you explain how, the person will obviously see a point in staying in touch with you.

The relationship is retained and it grows.

Sometimes you may not see any opportunity to add value to anyone, but you will meet people with whom you feel a natural connection. When you meet these people repeatedly, a familiarity and friendship does evolve.

Networking and conferences are indeed useful when you stop approaching them from the perspective of "What's in it for me?".

Saturday, August 6, 2016

Aha!

Continued from the previous Aha!


Internet




To be continued in the next Aha!


Monday, July 11, 2016

The better way to get date and time on your photos

On one of the latest Sony cameras, I switched on a feature that imprints the date the photograph was taken, onto the bottom right corner of the photo.

I was shocked at the result.

Sony's software puts a fat orange date stamp on the photo without even antialiasing it. It looks like we've been transported to the 1970s.

Searching for a better alternative led me to a script written by Terdon that uses exiv2 to read the exif information from the picture and imagemagick to stamp it onto the image.

  • You can run it from the Linux command line
  • It's fast
  • It doesn't mess up your existing images
  • It's configurable

See the difference between Sony's orange date stamp and imagemagick's white one.



Here's what to do at your bash terminal:

sudo apt-get install imagemagick
sudo apt-get install exiv2
  
Then put this script in a file named watermark.sh:

#!/usr/bin/env bash

## This command will find all image files, if you are using other
## extensions, you can add them: -o "*.foo"

find . -iname "*.jpg" -o -iname "*.jpeg" -o -iname "*.tif" -o \
 -iname "*.tiff" -o -iname "*.png" |

## Go through the results, saving each as $img
while IFS= read -r img; do
 ## Find will return full paths, so an image in the current
 ## directory will be ./foo.jpg and the first dot screws up
 ## bash's pattern matching. Use basename and dirname to extract
 ## the needed information.

 name=$(basename "$img")
 path=$(dirname "$img")
 ext="${name/#*./}";

 ## Check whether this file has exif data
 if exiv2 "$img" 2>&1 | grep timestamp >/dev/null
 ## If it does, read it and add the water mark
 then
 echo "Processing $img...";
 convert "$img" -gravity SouthEast -pointsize 22 -fill white \
 -annotate +30+30 %[exif:DateTimeOriginal] \
 "$path"/"${name/%.*/.time.$ext}";
 ## If the image has no exif data, use the creation date of the
 ## file. CAREFUL: this is the date on which this particular file
 ## was created and it will often not be the same as the date the
 ## photo was taken. This is probably not the desired behaviour so
 ## I have commented it out. To activate, just remove the # from
 ## the beginning of each line.


 # else
 # date=$(stat "$img" | grep Modify | cut -d ' ' -f 2,3 | cut -d ':' -f1,2)
 # convert "$img" -gravity SouthEast -pointsize 22 -fill white \
 # -annotate +30+30 "$date" \
 # "$path"/"${name/%.*/.time.$ext}";

 fi
done


Make the file executable and run it:

chmod +x watermark.sh
./watermark.sh

Run it in the folder where your images are placed.

Works like a charm!

Sunday, July 3, 2016

The ingenuity of the humble zipper

"Clasp locker" and "hookless fastener". That's what the zip was called, once-upon-a-time.

See how the clasps get locked. A rather ingenious invention by Whitcomb L. Judson.




The reason I'm writing about it today is that the zippers of my bag stopped working, and I thought I'd have to replace the zipper teeth and the slider.

The person at the shop, though, told me I'd only have to change the slider. It turns out that with time, the insides of the slider get worn out and can no longer make the zipper teeth clasp together. The teeth themselves are still fine, and you don't have to replace them.

Replacing involves removing some of the stitches that hold the teeth...


 ...removing the old slider completely, replacing it with a new slider and stitching it back up with a sewing machine.


That's it!
This entire process happened in less than 10 minutes, and cost me Rs.30 and Rs.20 for a large and small slider (stitching charges included).


The next time you worry about throwing away your bag or jacket because of a defective zipper, remember this post. You can get it fixed in a jiffy. If possible, try to get a new slider that matches your bag/jacket. The person at the shop might not have a variety of colours.

Friday, July 1, 2016

Discovering the unknown

X-rays were discovered by accident. We have never seen electrons. We can't say whether light is a particle or a wave, because it shows properties of both depending on how we observe it. We don't see or hear many wavelengths that insects and other animals can.

In order for our senses of sight, smell, touch, taste and hearing to receive this data, and for the brain to process it, we needed to convert X-rays into something we could see, and electron paths into something we could experiment upon and prove.

We really are very limited by our senses.

But what if we created a machine and an AI capable of manufacturing sensors which would detect properties of the Earth and Universe that we have never discovered, and then translate them into data that we can understand?

Not only would we discover so many more dimensions; we might even be able to re-program our DNA to take advantage of that data.

How, is the question.

Tuesday, June 14, 2016

Aha!

Continued from the previous Aha!


Shoe manufacturing



Continued in the next Aha...


Saturday, May 14, 2016

Aha!

Continued from the previous Aha!


Egotistical vehicles!


Continued in the next Aha
 

Wednesday, May 11, 2016

The 3D hologram on the smartphone actually works!

A colleague of my classmate showed me an article & video of a hologram created on a smartphone, and I didn't quite believe it.



It did make me curious though. Holograms & VR are something I'd love to work on, so I actually tried it out.

Find a CD case or even transparent plastic or glass. Perhaps even a plastic bottle as one person did.


Cut four trapezium-shaped pieces out of it. Drawing a diamond shape as shown below is by far the most convenient and space-efficient way of doing it. Once you've cut it out on paper, stick it to the plastic with cello-tape and carefully cut the plastic.
 
You end up with this

Super-glue would be a better option for sticking the pieces together, but I just used cello-tape. Make sure the 1 cm edge is even, because that's the part you'll be placing on your smartphone.

Now take your pick from the various videos available.


and watch it come alive!






Turns out there are far more impressive holographic illusions than the smartphone illusion. Play the one below and see.



And none of this is modern technology. Such illusions have existed since the year 1584, and there are plenty of examples scattered across history. More recently, holograms were used in Mr. Narendra Modi's election campaign!

It all began when an Italian scientist named Giambattista della Porta created the illusion of what later came to be known as Pepper's ghost.

This is how the illusion works:


Modern technology comes into play when you want to actually touch the hologram. As of today, that's made possible in a small way with lasers.




Beautiful!



How the smartphone hologram illusion works

In reality, the smartphone hologram is not a hologram at all. It's simply a reflection that can be viewed from all four sides. I remember when a person first showed me the video, my first question was, "What happens if we remove one of the plastic panes?"

While the video is playing, try putting your finger into the middle of the plastic pyramid you created. You'll still be able to see the 'hologram', because your finger isn't blocking anything. In fact, instead of creating four trapeziums, if you held just one pane of plastic or glass at that angle, facing you, you'd still be able to see the 'hologram'.

When you stand 2 feet in front of a mirror, your mirror image appears to be 2 feet behind the glass. That's the exact same illusion the plastic trapezium creates. At that angle, it just reflects the video, making you think the image is somewhere behind the plastic, in the air. The reflections from the other three panes aren't visible to you because they are at angles that don't reflect or refract the light in your direction. The only way having four trapeziums helps is that you get to see the reflection suspended in the air from each of the four directions you view it from. It also creates the illusion that a three-dimensional-looking object is contained within the pyramid-shaped container you created. That's all there is to it.
Still...impressive!

Makes me wonder....could a rainbow be called a real hologram?


Saturday, May 7, 2016

More concepts of Apache Storm you need to know

As my mentor tells me -  
"To be able to program in Storm, you first need to think in Storm".


That's very true. Storm is designed to let you structure your application in a way that allows it to scale. But before you begin that journey, there are a few concepts you need to know which aren't explained in the official documentation (or sometimes not explained clearly enough).




The Constructor concept
[Thanks to Matthias J. Sax on the Storm mailing list for explaining this]

When you create a Spout or a Bolt, Storm calls that class's constructor only once. After that, the instance gets serialized, and from then on, whenever Storm needs a new instance of the Spout or Bolt, it creates one from the serialized instance.

But for every new instance Storm creates, Storm will call the open() function for Spouts and the prepare() function for Bolts.
So open() and prepare() are like constructors.
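The effect is easy to demonstrate with plain JDK serialization, outside Storm entirely. MySpout below is a stand-in class (not a real spout) that mimics the lifecycle: the constructor runs once, copies are created by deserialization (which skips the constructor), and open() is what actually initializes each copy.

```java
import java.io.*;

// Plain-JDK sketch (not Storm code) of the lifecycle described above.
class MySpout implements Serializable {
    static int constructorCalls = 0;
    transient StringBuilder buffer;   // non-serializable state: must be created in open()

    MySpout() { constructorCalls++; } // called once, where the topology is built

    void open() { buffer = new StringBuilder(); } // called once per deserialized copy

    // Serialize and deserialize, the way Storm ships a Spout/Bolt to its workers
    static MySpout roundTrip(MySpout s) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            new ObjectOutputStream(bytes).writeObject(s);
            return (MySpout) new ObjectInputStream(
                    new ByteArrayInputStream(bytes.toByteArray())).readObject();
        } catch (Exception e) { throw new RuntimeException(e); }
    }
}

class LifecycleDemo {
    public static void main(String[] args) {
        MySpout original = new MySpout();            // constructor: exactly once
        MySpout task1 = MySpout.roundTrip(original); // no constructor call here
        MySpout task2 = MySpout.roundTrip(original); // ...or here
        task1.open();                                // open() is the real "constructor"
        task2.open();
        System.out.println("Constructor calls: " + MySpout.constructorCalls); // prints 1
    }
}
```

This is also why any non-serializable field (a connection, a logger, a buffer) should be created in open() or prepare(), not in the constructor.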



Making your Spouts or Bolts do things at specified intervals

Use tick tuples.
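A tick tuple is requested through a component's configuration. Here's a minimal sketch of the standard pattern (TickingBolt is a hypothetical bolt, and the 5-second interval is arbitrary): the bolt asks Storm for a tick tuple via getComponentConfiguration() and recognizes it in execute() by its source component and stream.

```java
package com.sdint.basicstorm;

import java.util.Map;
import org.apache.storm.Config;
import org.apache.storm.Constants;
import org.apache.storm.task.OutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseRichBolt;
import org.apache.storm.tuple.Tuple;

public class TickingBolt extends BaseRichBolt {
    private OutputCollector collector;

    //Ask Storm to send this bolt a tick tuple every 5 seconds
    @Override
    public Map<String, Object> getComponentConfiguration() {
        Config conf = new Config();
        conf.put(Config.TOPOLOGY_TICK_TUPLE_FREQ_SECS, 5);
        return conf;
    }

    @Override
    public void prepare(Map map, TopologyContext tc, OutputCollector oc) {
        collector = oc;
    }

    @Override
    public void execute(Tuple tuple) {
        //Tick tuples arrive from Storm's system component on the system tick stream
        if (Constants.SYSTEM_COMPONENT_ID.equals(tuple.getSourceComponent())
                && Constants.SYSTEM_TICK_STREAM_ID.equals(tuple.getSourceStreamId())) {
            //---do your periodic work here
        } else {
            //---do your usual processing here
        }
        collector.ack(tuple);
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer ofd) {}
}
```

Tick tuples arrive on a separate system stream, so they don't interfere with your normal tuples; just remember to ack them too.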



The reason you should exit the nextTuple() function ASAP
[Thanks to Spico Florin on the Storm mailing list for explaining this]

I had created a while loop in nextTuple() of my Spout to emit multiple tuples, but I didn't receive any ack's at all.
It turns out that nextTuple() and the ack() method are called in the same thread by the framework. So if you have heavy computation in nextTuple(), your ack() method will never be called, and the buffers responsible for receiving the ack messages will not be emptied. nextTuple() acts as a producer for these buffers, while ack() acts as a consumer.

So remember to emit a tuple and exit the nextTuple() function immediately.
For those who don't know about the ack() method, you can override it (and the fail() method) in your Spout like this:

    @Override
    public void ack(Object msgId) {
        System.out.println("Ack received for Spout"+msgId);
        tupleAck = true;
    }   
   
    @Override
    public void fail(Object msgId) {
        System.out.println("Failed tuple msgID: "+msgId);
        //---tuple replay logic should be here
    }


This helps you know whether your tuple was received and processed by the Bolt or whether the transmission or processing failed.



More creative topology structures
[Thanks to Matthias J. Sax on the Storm mailing list for the idea]

When learning Storm, we come across simple examples and are conditioned into thinking that way.



It doesn't have to be that way though. When you use streams and the various techniques of grouping, you'll find a whole new world open up.

Example:
If you want to create a topology where the spout notifies the end bolts that it has no more input, you can do it this way:

Just specify a separate stream in the spout and emit the notification tuples. When creating the topology, specify an allGrouping for the receiving bolts. What happens is that no matter how many instances of the bolt are created, the spout will send the tuple to all of them. It's like a broadcast.

So the topology would be created like this:

TopologyBuilder b = new TopologyBuilder();

b.setSpout("SpoutA_name", new SpoutA(), 1)
.setNumTasks(1);      
     
b.setBolt("boltA_name", new boltA(), 2)
.shuffleGrouping("SpoutA_name");

b.setBolt("boltB_name", new boltB(), 5)
.fieldsGrouping("boltA_name", new Fields(someID))
.allGrouping("SpoutA_name", "StreamName");



This is how the spout sends a stream to the bolts at the end:

@Override
public void declareOutputFields(OutputFieldsDeclarer ofd) {
    ofd.declare(new Fields("theNormalTuple"));
    ofd.declareStream("StreamName", new Fields("someID"));//this specifies the stream that reaches the end bolt B
}
   
@Override
public void nextTuple() {       

    if (nothingMoreToProcess) {
        collector.emit("StreamName", new Values(someID));//this emits the stream to bolts B
    }       
    else {
        collector.emit(new Values(someTuples), someTuples);//this emits to bolts A
    }
}     



...and this is how the bolt receives it:

@Override
public void execute(Tuple tuple) {
    
    if (("StreamName").equals(tuple.getSourceStreamId())) {//recognizes the stream from the spout
        //do whatever you do when there is nothing more to process
    }
    else {  
        //do your usual processing
    }
}


Don't stop here. There are plenty more ways to think of emitting streams and directing the flow. Remember that Storm is designed to be a Directed Acyclic Graph. You can design your topology as such.




In the code, which are the tasks and executors?

There's often confusion about tasks and executors, because in this case:

builder.setBolt("bolt1", new BoltA(), 3)
                .setNumTasks(2)

Storm creates 3 executors and 2 tasks.

but

In this case (if you don't specify setNumTasks)

builder.setBolt("bolt1", new BoltA(), 2) 

Storm creates 2 executors and 2 tasks.

Remember that a task is an instantiation of a serialized instance of the BoltA class (see the constructor concept at the top of this page). An executor is just a thread which processes a task. If an executor has to process two tasks, it processes them one after the other, not in parallel.



Additional links:

If you are looking for a simple tutorial or code sample for Storm, they are here:

Sunday, May 1, 2016

Mass brainwashing

Do you know that diamonds are not a woman's best friend?
Well, allow me to introduce you to the diamond myth:

"...diamonds only came into popularity in the 20th century...But in 1870, a huge cache of diamonds was unearthed in South Africa...With a voluble increase in available diamonds, how could they be priced for their scarcity and rareness? Diamond mine financiers realized that they needed to protect their interests...And that’s how De Beers — which held a monopoly on diamond mining and pricing — was created...

In the 1930s...to promote diamond story lines, place engagement rings in films and get diamonds into the hands of burgeoning celebrities...in the 1940s, a series of lectures was promoted in high schools...

All the advertising, film and television placement and mass psychology worked. After 20 years on the campaign...the younger generation had successfully been implanted with the idea that diamonds were a de rigeur part of courtship. To this new generation a diamond ring is considered a necessity to engagements...

...De Beers sold the idea that a diamond was an expensive but necessary token of affection...Conversely, a woman who didn’t have an engagement ring –who wasn’t showered with diamonds throughout her relationship — was somehow “less loved” than her diamond-swathed counterparts....It’s a lie that started less than a 100 years ago, and it’s a lie the diamond industry has been banking on ever since."

I found this report very intriguing. Haven't we all been brainwashed in similar ways? Made to feel that doing certain things was an absolute necessity to be accepted in society?

Even before mass advertising took over the planet, there were Shamans, Rainmakers, Soothsayers, Witch doctors, Oracles and Astrologers. After that came the need to drink Complan to gain height. To drink Bournvita to gain capability. To remember only Maggi for a quick snack. To drink only Coca Cola or beer when thirsty. To drink Boost or Glucon D for energy.
As though there were no other cheaper, healthier and much much better alternatives available!


Religion

Take religion for example. What exactly does your religion actually want you to do?

To help people. To respect and be kind to others. To live peacefully. To recognize and appreciate that a beautiful universe may have been created by a much wiser, powerful being.

Is this what religious people actually do? What I see religious people do, is robotically following a set of pre-defined rituals, organizing cash-flow, killing in the name of religion, spreading fear, superstition, greed, ignorance, anger, groupism, hatred and selfishness.

It's very surprising that even grown adults don't realize that they would have been following the rituals of some other religion if they had been born into a family of that religion. It isn't surprising, though, that most people think they are being religious by simply following rituals. They forget what their religion actually wants them to do. Such is the power of mass brainwashing, fear and hysteria.


Social Customs

The same applies to other social customs. Going out for a movie and dinner is somehow considered cool. Having a party at a club, going out for a company-sponsored lunch, volunteering grandly for one hour...

These may be enjoyable sometimes. These may be enjoyable to some people. But do you really enjoy it?
Would you find it more enjoyable to read an interesting book? To go on a long drive? To explore places?


A comic by Zen Pencils captures this nicely: http://zenpencils.com/comic/nerdist/


We live in a society that sees someone doing something they love, sees the happiness on their face, and somehow believes that if we did the same thing, we would be happy too. What would really make you happy is the removal of the pressure to imitate others. To realize your interests and to do what makes you happy, no matter what the brainwashed masses think of it. It is in that moment that we find true peace and joy.
- Navin Ipe


That's also when you realize the true meaning of "Be yourself".

Of course, it's also important to keep in mind the laws of the land, the practicalities of finance, your dependents and responsibilities of life.

Think for yourself, people. You own your mind, you have the right to know the truth and to live life the way you wish.

Wednesday, April 27, 2016

Aha!

Continued from the previous Aha!


Abracadabra traffic!




Continued in the next Aha!
 

Monday, April 18, 2016

A simple Apache Storm tutorial [Part 2: Implementing failsafes]


Continued from Part 1


If you really want to understand what the Values class is, what the Tuple class is and so on, the best place to look is not the tutorials on the internet. Look at the actual Storm source code.
It's available here: https://github.com/apache/storm
Go into the "storm-core/src/jvm/org/apache/storm" folder and have a look at those Java files. The code is very simple to understand and I promise you, it will be an enlightening experience.

Now, onto the ack and fail aspects of Storm.

Given below is the exact same program as in Part 1 of this tutorial. The added sections, and the sections that need your attention, are highlighted.


BasicStorm.java:

package com.sdint.basicstorm;

import org.apache.storm.Config;

import java.util.concurrent.TimeUnit;
import org.apache.storm.LocalCluster;
import org.apache.storm.topology.TopologyBuilder;

public class BasicStorm {

    public static void main(String[] cmdArgs) {
       
        Config config = new Config();
        //config.put(Config.TOPOLOGY_DEBUG, false);
        config.put(Config.TOPOLOGY_MAX_SPOUT_PENDING, 1);
        config.put(Config.TOPOLOGY_MESSAGE_TIMEOUT_SECS, 10);//alters the default 30-second tuple timeout to 10 seconds
       
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("myDataSpout", new DataSpout());
       
        builder.setBolt("proBolt", new ProcessingBolt()).shuffleGrouping("myDataSpout");
       
        LocalCluster localCluster = new LocalCluster();
        localCluster.submitTopology("BasicStorm", config, builder.createTopology());
       
        System.out.println("\n\n\nTopology submitted\n\n\n");
        pause(120);//pause for 120 seconds during which the emitting of tuples will happen
       
        //localCluster.killTopology("BasicStorm");
        localCluster.shutdown();
    }//main


    public static void pause(int timeToPause_InSeconds) {
        try {TimeUnit.SECONDS.sleep(timeToPause_InSeconds);} 
        catch (InterruptedException e) {System.out.println(e.getCause());}
    }
 }//class


DataSpout.java:

package com.sdint.basicstorm;

import java.util.Map;
import org.apache.storm.spout.SpoutOutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseRichSpout;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Values;

import java.util.concurrent.TimeUnit;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class DataSpout extends BaseRichSpout {
    private TopologyContext context;
    private SpoutOutputCollector collector;
   
    //---logger
    private final Logger logger = LoggerFactory.getLogger(DataSpout.class);
   
    private boolean tupleAck = true;
    private Long oldTupleValue;
   
   
    @Override
    public void open(Map map, TopologyContext tc, SpoutOutputCollector soc) {
        this.context = tc;
        this.collector = soc;
       
        System.out.println("\n\n\nopen of DataSpout\n\n\n");      
    }
   
    public DataSpout() {
        System.out.println("\n\n\nDataSpout ctor called\n\n\n");
    }//ctor

    @Override
    public void declareOutputFields(OutputFieldsDeclarer ofd) {
        System.out.println("\n\n\ndeclareoutputfields of DataSpout\n\n\n");
       
        ofd.declare(new Fields("line"));
    }

    @Override
    public void nextTuple() {
        System.out.println("\n\n\nnexttuple of DataSpout\n\n\n");
       
        Long newTupleValue;
        if (tupleAck) {
            newTupleValue = System.currentTimeMillis() % 1000;
            oldTupleValue = newTupleValue;
        }
        else {newTupleValue = oldTupleValue;}

       
        this.collector.emit(new Values(newTupleValue), newTupleValue);
        System.out.println("\n\n\nEmitting "+newTupleValue+"\n\n\n");
        pause(1);
    }
   
    @Override
    public void ack(Object msgId) {
        System.out.println("\n\n\nAck received for DataSpout"+msgId+"\n\n\n");
        tupleAck = true;
    }   
   
    @Override
    public void fail(Object msgId) {
        System.out.println("\n\n\nFailed tuple msgID: "+msgId+"\n\n\n");
        //replay logic should be here
        tupleAck = false;
    }

 

    public void pause(int timeToPause_InSeconds) {
        try {TimeUnit.SECONDS.sleep(timeToPause_InSeconds);} 
        catch (InterruptedException e) {System.out.println(e.getCause());}
    }
    
}//class



ProcessingBolt.java:

package com.sdint.basicstorm;

import java.util.Map;
import org.apache.storm.task.OutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseRichBolt;
import org.apache.storm.tuple.Tuple;

public class ProcessingBolt extends BaseRichBolt {
    private OutputCollector collector;

    @Override
    public void declareOutputFields(OutputFieldsDeclarer ofd) {
        System.out.println("\n\n\ndeclareOutputFields of ProcessingBolt called\n\n\n");
    }

    @Override
    public void prepare(Map map, TopologyContext tc, OutputCollector oc) {
        System.out.println("\n\n\nprepare of ProcessingBolt called\n\n\n");
        collector = oc;
    }

    @Override
    public void execute(Tuple tuple) {
        System.out.println("\n\n\nTuple received in ProcessingBolt:"+tuple+" \n\n\n");
        collector.ack(tuple);
    }

   
}



Notice that this time when you run the program, the ack function in the Spout will get called whenever the Bolt executes the collector.ack(tuple); statement.

But suppose you comment out collector.ack(tuple);. Then, after a certain time period (normally 30 seconds, but in our program we made it 10 seconds), the fail function will get called.

This is how the Spout (and we) know whether a tuple has been received and acknowledged by the Bolt or not. The above program basically uses the system time as a tuple, and if the Bolt does not acknowledge receiving it, the Spout sends the same old tuple to the Bolt again.


Before getting into hardcore Storm programming, there is this important thing:

Apache Storm concepts you really need to know.



A simple Apache Storm tutorial [Part1]

Apache Storm is actually well documented. Problem is, you won't understand any of it until you actually try out some code (even if it's in the form of a nice explanation by Chandan Prakash), and there's a dearth of simple code available on the internet. NRecursions comes to your rescue :-)

To run this program, you can either do it with Gradle (which I've used mainly so that the jar dependency management would automatically be handled by Gradle) or you could simply create a normal Java project and manually add the necessary jars. The jars you'll need are:
  • asm-5.0.3.jar
  • bson4jackson-2.7.0.jar
  • clojure-1.7.0.jar
  • disruptor-3.3.2.jar
  • kryo-3.0.3.jar
  • log4j-api-2.1.jar
  • log4j-core-2.1.jar
  • log4j-over-slf4j-1.6.6.jar
  • log4j-slf4j-impl-2.1.jar
  • logback-classic-1.1.3.jar
  • logback-core-1.1.3.jar
  • minlog-1.3.0.jar
  • objenesis-2.1.jar
  • reflectasm-1.10.1.jar
  • servlet-api-2.5.jar
  • slf4j-api-1.7.12.jar
  • storm-core-1.0.0.jar

Create a Gradle project named "BasicStorm" and create a source package named "com.sdint.basicstorm".

Within that package, create BasicStorm.java

package com.sdint.basicstorm;

import org.apache.storm.Config;

import java.util.concurrent.TimeUnit;
import org.apache.storm.LocalCluster;
import org.apache.storm.topology.TopologyBuilder;

public class BasicStorm {

 public static void main(String[] cmdArgs) {

 Config config = new Config();
 //config.put(Config.TOPOLOGY_DEBUG, false);
 config.put(Config.TOPOLOGY_MAX_SPOUT_PENDING, 1);
 config.put(Config.TOPOLOGY_MESSAGE_TIMEOUT_SECS, 10);//alters the default 30-second tuple timeout to 10 seconds

 TopologyBuilder builder = new TopologyBuilder();
 builder.setSpout("myDataSpout", new DataSpout());

 builder.setBolt("proBolt", new ProcessingBolt()).shuffleGrouping("myDataSpout");

 LocalCluster localCluster = new LocalCluster();
 localCluster.submitTopology("BasicStorm", config, builder.createTopology());

 System.out.println("\n\n\nTopology submitted\n\n\n");
 pause(120);//pause for 120 seconds during which the emitting of tuples will happen

 //localCluster.killTopology("BasicStorm");
 localCluster.shutdown();
 }//main


 public static void pause(int timeToPause_InSeconds) {
    try {TimeUnit.SECONDS.sleep(timeToPause_InSeconds);} 
    catch (InterruptedException e) {System.out.println(e.getCause());}
}

}//class



and DataSpout.java

package com.sdint.basicstorm;

import java.util.Map;
import org.apache.storm.spout.SpoutOutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseRichSpout;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Values;

import java.util.concurrent.TimeUnit;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class DataSpout extends BaseRichSpout {
 private TopologyContext context;
 private SpoutOutputCollector collector;

 //---logger
 private final Logger logger = LoggerFactory.getLogger(DataSpout.class);

 @Override
 public void open(Map map, TopologyContext tc, SpoutOutputCollector soc) {
 this.context = tc;
 this.collector = soc;

 System.out.println("\n\n\nopen of DataSpout\n\n\n");
 }

 public DataSpout() {
 System.out.println("\n\n\nDataSpout constructor called\n\n\n");
 }//ctor

 @Override
 public void declareOutputFields(OutputFieldsDeclarer ofd) {
 System.out.println("\n\n\ndeclareoutputfields of DataSpout\n\n\n");

 ofd.declare(new Fields("line"));
 }

 @Override
 public void nextTuple() {
 System.out.println("\n\n\nnexttuple of DataSpout\n\n\n");

 Long newTupleValue =System.currentTimeMillis() % 1000;

 this.collector.emit(new Values(newTupleValue), newTupleValue);
 System.out.println("\n\n\nEmitting "+newTupleValue+"\n\n\n");
 pause(1);
 }


 public void pause(int timeToPause_InSeconds) {
    try {TimeUnit.SECONDS.sleep(timeToPause_InSeconds);} 
    catch (InterruptedException e) {System.out.println(e.getCause());}
 }
}//class




and ProcessingBolt.java

package com.sdint.basicstorm;

import java.util.Map;
import org.apache.storm.task.OutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseRichBolt;
import org.apache.storm.tuple.Tuple;

public class ProcessingBolt extends BaseRichBolt {
 private OutputCollector collector;

 @Override
 public void declareOutputFields(OutputFieldsDeclarer ofd) {
 System.out.println("\n\n\ndeclareOutputFields of ProcessingBolt called\n\n\n");
 }

 @Override
 public void prepare(Map map, TopologyContext tc, OutputCollector oc) {
 System.out.println("\n\n\nprepare of ProcessingBolt called\n\n\n");
 collector = oc;
 }

 @Override
 public void execute(Tuple tuple) {
 System.out.println("\n\n\nTuple received in ProcessingBolt:"+tuple+" \n\n\n");
 collector.ack(tuple);
 }

}




Your build.gradle file would look like this (if you chose to create a Gradle project; you won't need this if you're creating a plain Java project):

apply plugin: 'java'
apply plugin: 'eclipse'

defaultTasks 'jar'

jar {
 from {
        (configurations.runtime).collect {
            it.isDirectory() ? it : zipTree(it)
        }
    }   
    manifest {
        attributes 'Main-Class': 'com.sdint.basicstorm.BasicStorm'
    }
}

sourceCompatibility = '1.8'
[compileJava, compileTestJava]*.options*.encoding = 'UTF-8'

if (!hasProperty('mainClass')) {
    ext.mainClass = 'com.sdint.basicstorm.BasicStorm'
}

repositories {
    mavenCentral()
}

dependencies { 
    //---apache storm
    compile 'org.apache.storm:storm-core:1.0.0'  //compile 'org.apache.storm:storm-core:0.10.0'
    //---logging
    compile "org.slf4j:slf4j-api:1.7.12"
    compile 'ch.qos.logback:logback-classic:1.1.3'
    compile 'ch.qos.logback:logback-core:1.1.3'   
       
    testCompile group: 'junit', name: 'junit', version: '4.10'

}



Hope you've already read a little about how Spouts and Bolts work in Storm.

This is what happens in BasicStorm:

  • The moment a Spout or Bolt is instantiated, its declareOutputFields function gets called. Our simple program doesn't need to know what it does, so let's ignore it for now.
  • When the topology is submitted to Storm, the open function of each Spout gets called, and it gets called only once per Spout instance.
  • For Bolts, the equivalent of open is the prepare function. You can use open and prepare to do initializations, declarations etc. for your program.
  • After submitting the topology, we pause for a while in main() (pause(120);) to give Storm time to run the topology. During this time Storm calls nextTuple() of the Spout, and when the Spout emits a Tuple, the Tuple is sent to the Bolt, because in main() we configured the Bolt to receive Tuples from the Spout (see the line builder.setBolt("proBolt", new ProcessingBolt()).shuffleGrouping("myDataSpout");).
  • When the Bolt receives the value, the execute() function of the Bolt is called.
  • BasicStorm is designed for a simple task: DataSpout emits a Long value as a Tuple, ProcessingBolt receives it, and ProcessingBolt acknowledges (the collector.ack(tuple); line) that it has received the Long value and that the data processing for the tuple is complete.
  • When DataSpout receives the acknowledgement, its nextTuple() is called again and another tuple gets emitted.
  • This process keeps going for the 120 seconds we've paused the main() thread for. After that, the topology shuts down.

Try tweaking the values here and there to find out how the program works. Try substituting some other value in place of the Long in the Tuple. Try substituting it with a class object. Remember that if you do, that class will have to implement Serializable.
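To see what implementing Serializable actually buys you, here's a minimal sketch of a custom tuple payload being serialized and deserialized with plain Java serialization. The class and file names (SensorReading, SerializationDemo) are my own inventions for illustration; they're not part of BasicStorm:

```java
import java.io.*;

// Hypothetical tuple payload: a custom class you emit must be serializable
// so Storm can ship it between workers.
class SensorReading implements Serializable {
    final long timestamp;
    final double value;

    SensorReading(long timestamp, double value) {
        this.timestamp = timestamp;
        this.value = value;
    }
}

public class SerializationDemo {
    public static void main(String[] args) throws Exception {
        SensorReading original = new SensorReading(1234L, 42.5);

        // Serialize the object to bytes (the "sending" side)
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(original);
        }

        // Deserialize the bytes back into an object (the "receiving" side)
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            SensorReading copy = (SensorReading) in.readObject();
            System.out.println(copy.timestamp + " " + copy.value); // prints: 1234 42.5
        }
    }
}
```

If you leave out `implements Serializable`, the writeObject call throws a NotSerializableException, which is roughly the failure you'd see when emitting a non-serializable object in a tuple.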

One more thing to try out:
Try commenting out collector.ack(tuple); in ProcessingBolt.java. You'll then not be telling the Spout that the Bolt received the tuple, so after some time the Spout will emit the tuple again. The interval the Spout waits for an acknowledgement (ack) is normally 30 seconds, but if you scroll up, you'll see that in main() we set the timeout to 10 seconds (config.put(Config.TOPOLOGY_MESSAGE_TIMEOUT_SECS, 10);), so the Spout will wait just 10s before emitting the tuple again.
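To get a feel for why the un-acked tuple reappears, here's a toy simulation of the idea. This is my own illustrative code, not Storm's internals (ReplayDemo and all its names are invented): a spout-like object tracks pending message IDs and re-emits any that aren't acked within the timeout:

```java
import java.util.*;

// Toy illustration of at-least-once delivery: a pending tuple that isn't
// acked within the timeout gets emitted again. NOT Storm code, just the
// idea behind TOPOLOGY_MESSAGE_TIMEOUT_SECS.
public class ReplayDemo {
    final Map<Long, Long> pending = new HashMap<>(); // msgId -> time of emit
    final List<Long> emitted = new ArrayList<>();    // every emit, incl. replays
    final long timeoutMillis;

    ReplayDemo(long timeoutMillis) { this.timeoutMillis = timeoutMillis; }

    void emit(long msgId, long now) {
        emitted.add(msgId);
        pending.put(msgId, now);
    }

    void ack(long msgId) { pending.remove(msgId); }

    // Called periodically: re-emit anything that timed out without an ack
    void checkTimeouts(long now) {
        for (Map.Entry<Long, Long> e : new ArrayList<>(pending.entrySet())) {
            if (now - e.getValue() >= timeoutMillis) {
                emit(e.getKey(), now); // replay the tuple
            }
        }
    }

    public static void main(String[] args) {
        ReplayDemo spout = new ReplayDemo(10_000); // 10s timeout, like the config
        spout.emit(1L, 0);           // tuple 1 emitted at t=0, never acked
        spout.emit(2L, 0);           // tuple 2 emitted at t=0
        spout.ack(2L);               // the "bolt" acks tuple 2
        spout.checkTimeouts(10_000); // at t=10s, tuple 1 is replayed
        System.out.println(spout.emitted); // prints: [1, 2, 1]
    }
}
```

The acked tuple (2) is emitted once; the un-acked tuple (1) shows up again after the timeout, which is exactly the behaviour you'll observe in the console when you comment out the ack.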



Continued in Part 2


Wednesday, April 6, 2016

A comfortable Git branching model

Git has a very flexible approach toward branching and workflows; the best I've seen yet.
Novices would probably be happy using the popular branching model Vincent Driessen has shown on his webpage: http://nvie.com/posts/a-successful-git-branching-model/



I've learnt from and liked Vincent's work, but over the course of working with his branching model, I've found it more comfortable to use a slightly different one, especially when collaborating with other developers.



What I propose to be different...

...is simply that instead of using the master branch to maintain the "stable" version of the code, it is more comfortable to have master be the branch the entire team collaborates on.

A separate branch named "stable" (see, even the name is more appropriate than 'master') is created and also pushed to the remote/blessed repository, and that's the branch that holds the stable version of your code that is ready for release.

It's not only about the comfort the master branch affords you for collaborative development, i.e., simply typing "git pull" instead of "git pull origin development" and so on.

It's also the fact that you'll have to do fewer merges between branches, which could otherwise end up looking like this:


The above screenshot is from GitUp, one of the best Git clients I've seen. Once you do a merge, you can actually undo it too. (It's only available for macOS though. When I wanted to port it to Linux myself, I saw the developer's post saying he'd used too much Objective-C in the graphics code for it to be portable.)

Monday, March 28, 2016

Aha!

Continued from the previous Aha!



Creation



Continued in the next Aha...
 

Friday, March 4, 2016

Generate a gitignore for Netbeans, Eclipse, IntelliJ Idea, Gradle or anything else!

I've written about gitignore for Netbeans before, but recently came across something much better. There's actually a gitignore generator which can generate the list of files and folders Git should ignore. It's available for 255 IDEs, operating systems, languages and more.

Either just type the name of the IDE, OS or programming language and click "Generate"...

OR

... install the commandline version of it and at the terminal, simply type:

gi <IDE or language or OS name>

and the custom gitignore gets generated.

or if you want to directly generate the file, just add the redirection operator:

gi <IDE or language or OS name>  >>  .gitignore

or if you don't want to generate specific gitignores for every project, simply generate a global one.

Eg:
gi linux,netbeans >> ~/.gitignore_global



As simple as that. It's quite amazing that someone took the time to actually create a gitignore generator. Thoughtful, and very much a gem for programmers around the world!



A little video if you still need help:

Sunday, February 28, 2016

National science day: The cultivation of scientific temper

Today was Breakthrough Society's celebration of National Science Day and its observance of the "Cultivation of scientific temper". Many school children and adults were invited to the Raman Research Institute at Bangalore for a seminar and an exhibition.





T'was my first visit to the campus, and I just loved the beauty of it. Reminded me of Bangalore, the way it was before concrete, asphalt and smoke took over the city!




The conference touched upon topics of scientific importance: the need for research in India, the realization people should have about science and the way it impacts humanity, and an appeal that people accept some trouble in life, even some amount of struggle, so that their research and discoveries can bring greater good to humanity.

The seminar was followed by a documentary on Dr.C.V.Raman's life.




and a visit to the science museum, where there were various exhibits and publications of Dr.C.V.Raman.

We were shown how Dr.Raman (at the age of 16 or 17) had questioned why the Veena sounded like a human voice, and pursued the question. Even though Helmholtz had said that a plucked string could oscillate only in an odd number of modes (1, 3, 5 and so on), Dr.Raman went on to prove that the Veena's strings could oscillate in even-numbered modes too, hence producing sounds similar to the human voice. This was because Helmholtz had experimented with a violin, whereas the strings of a Veena are strung very differently, and when plucked, they oscillate differently.

Dr.Raman also dropped a metal bearing on glass and noticed the circular crack it made on impact. He tried the same on quartz crystals and noticed that here, the cracks were triangular in shape. Typical scientist :-)

The science museum has a large collection of objects he studied and collected; said to be the largest collection any scientist has. Photos were forbidden, the reasoning being that if the photos were shared on social media, that would dampen the curiosity of the children. I disagree. The collection is so unique and amazing that people would actually want to bring their kids to see it. Believe me, no photograph can capture the glimmer and wonder those stones present to the naked eye.

Also on display were some interesting quotes:






There was also an interesting quote from Einstein, which I followed up on:
I have now reached the point where I may indicate briefly what to me constitutes the essence of the crisis of our time. It concerns the relationship of the individual to society. The individual has become more conscious than ever of his dependence upon society. But he does not experience this dependence as a positive asset, as an organic tie, as a protective force, but rather as a threat to his natural rights, or even to his economic existence. Moreover, his position in society is such that the egotistical drives of his make-up are constantly being accentuated, while his social drives, which are by nature weaker, progressively deteriorate. All human beings, whatever their position in society, are suffering from this process of deterioration. Unknowingly prisoners of their own egotism, they feel insecure, lonely, and deprived of the naive, simple, and unsophisticated enjoyment of life. Man can find meaning in life, short and perilous as it is, only through devoting himself to society.
The economic anarchy of capitalist society as it exists today is, in my opinion, the real source of the evil. We see before us a huge community of producers the members of which are unceasingly striving to deprive each other of the fruits of their collective labor—not by force, but on the whole in faithful compliance with legally established rules. In this respect, it is important to realize that the means of production—that is to say, the entire productive capacity that is needed for producing consumer goods as well as additional capital goods—may legally be, and for the most part are, the private property of individuals.

But perhaps one of the most important quotes the community was missing out on, was this one:


It's not really about the survival of the strongest or the most intelligent. It is about the one that is most responsive to change.

My view

We see many protests in our country about growing intolerance and the lack of promotion of science. Makes me wonder why this happens. Many decades ago, it was the will of certain curious people which helped them make discoveries, helped the world to advance and brought pride to the nation. Any government would be happy to encourage such people and I believe this is why science was given importance. This is why we had impressive research institutes and dedicated people.

It created value for the nation.

As the years passed by, did something else create more value for the nation? Did research cease to add as much value as something else? Did our nation become self-reliant enough that it no longer needed science? Is it more economical to purchase a technology from a neighbouring nation than to invest in our own research? Does the work of private enterprises add more value to our nation?

As Darwin's quote in the above picture says, it is the species most responsive to change that survives. Has our scientific community adapted to the changes? We live in an age that has seen more peace on Earth than historical times did. Is this an age where the entire world works as one (except North Korea)? Where research in the basic sciences can be done in select countries so that other countries can focus on other things? Does a country really need to worry that it is not doing enough research?

What exactly is the bigger picture here? I'm quite sure that if a nation needed more research, the government would have actively pursued it. If a government isn't doing so, there's a reason. What is that reason?

Is our advanced, industrialized society really deteriorating (as Einstein said) and becoming focussed on the trivialities of life?

I believe things will change. Not by dissent; not by protest. The change will be brought about by a mind that conceives a thought so powerful that it alters the very way we live our lives. Just as, a hundred years ago, nobody who predicted the future even remotely conceived of the advent of software.

There is hope.

Saturday, February 27, 2016

Descriptive names

We've all been through our initial programming lessons, writing for loops like for(int i = 0; i<10; ++i) and variables like float s = d / t.

Variable names didn't matter much then.

The confusion begins when you enter the professional world, write large programs, forget about them, and months later look at the code wondering what in the world you had written. Class names, variable names and function names don't make any sense at all!
If you took care to write comments, then they are a saving grace, but only for a while.

Which of these would make more sense to you?

//check if customer age is equal to threshold
void check(double v)
{
   if (val == v) {print("yes");} else {print("no");}
}

or

void checkIfCustomerAgeIsEqualToAgeThreshold(double customerAge)
{
   if (ageThreshold == customerAge) {print("yes");} else {print("no");}
}

See the difference? No need for comments. No confusion. You understand what the function and variables are there for, just by looking at the code.

The other big advantage is that if you decided to do some refactoring, you wouldn't have to bother changing any comments, because the class/function/variable name itself is the comment!

Same goes with macros.
I see a lot of C++ programmers using...

#if 0
//some code
#endif

...to temporarily deactivate some code. If these programmers get hit by a bus, or even if they look at the code years later, nobody will have a clue why the code was deactivated or whether it's okay to simply delete it.

What if they used this instead:

#ifdef YEAR_CALCULATION_CODE_TEMPORARILY_DEACTIVATED_DONT_DELETE
//some code
#endif

or

#ifdef CODE_FOR_DEBUGGING_DELETEME_ANYTIME

or

#ifdef SWITCH_ON_COLOR_OUTPUT


Same goes for class names. Which makes more sense?

class Filter{}
class MicroFilter extends Filter {}

or

class AirFilterSuperclass {}
class SpecializedMicroFilterOfCompanyABC extends AirFilterSuperclass {}


  • It makes a world of difference to the person who maintains the code.
  • It makes a world of difference to the person who reviews your code.
  • And it makes a world of difference to the business, which profits because developers, tech leads and QA teams spend less time figuring out the code and more time creating a splendid product!

Monday, February 8, 2016

Aha!

 Continued from the previous Aha!


Signatures




After the sneeze



Continued in the next Aha!
 

Friday, January 22, 2016

Aha!

 Continued from the previous Aha!


The game of life





Choices! Choices!




Continued in the next Aha!

Tuesday, January 12, 2016

Files and folders to add to gitignore for a Netbeans project

You're obviously on this page because you're a Netbeans fan. Congrats on choosing it over Eclipse!

When it came to creating a .gitignore file for my Netbeans project, it was hard to find resources on the internet to figure it out. I needed to push and pull from a Git repository without messing up my colleagues' or my own project settings, while still being able to share code.

So after some trial and error, these are the files I figured need to be added to .gitignore:


/.project
*.o
*.o.d
/build/
/dist/
/lib/
/nbproject/private/
/nbproject/Makefile-Debug.mk
/nbproject/Makefile-impl.mk
/nbproject/Makefile-Release.mk
/nbproject/Makefile-variables.mk
/nbproject/Package-Debug.bash
/nbproject/Package-Release.bash
.dep.inc

DefaultComponent.mak
DefaultConfig.cg_info
MainDefaultComponent.cpp
MainDefaultComponent.h


Some interesting trivia about the Netbeans-Eclipse war from James Gosling himself: http://nrecursions.blogspot.in/2014/07/preview-your-webpage-realtime-while.html#useanyideyoulike