Category Archives: Featured

The best articles

“Drupalkar” – an initiative to disseminate Drupal Love to Pune colleges

(This post gives an overview of a new initiative by the Pune Drupal Community to increase Drupal awareness amongst Pune college students. It was originally posted on drupal.org by Dipen Chaudhary and is reposted here for wider dissemination.)

Overview

Click on the logo to see all PuneTech posts about Drupal. Image via Wikipedia

At the last Pune Drupal user meetup (26th Dec 2009), we all agreed on the need for more Drupal awareness, and one of the ways we discussed was to tap interest early, at the student level. To this end, a project code-named Drupalkar (which has both Hindi and Marathi connotations) was planned, in which seasoned Drupal developers will each donate 1 hour to a student gathering at a college to spread the Drupal joy/way of web development. The approach here is simple: in that 1 hour, the Drupal developer will show, in the form of an interactive demo, how a website can be made from scratch, giving out more ideas and possibilities at the end. The same will be repeated at many colleges, with different developers donating time and giving different demos (or maybe the same one, if a particular demo works out to be exceptionally good at catching attention).

Approach

The approach to reaching students is simple: show them something they can build in an hour, rather than giving a speech/talk on "why Drupal, how Drupal?", since such talks tend to become incoherent (it is up to the Drupal developer conducting the demo session to open it up for Q&A, which I strongly recommend) and, over time, ineffective. We will pick model sites from the internet and show students how to build them in Drupal, and how easy web development is with Drupal.

Pilot

To pilot the Drupalkar project, we decided to pick 5 colleges (list pending; hopefully it will be sorted out in the comments) and 5 Drupal volunteers. We will then map colleges to volunteers and let each individual Drupal developer run the show, with some syncing among developers about what is being demoed, what the response was, and so on. Since the events will happen one after another, we can adapt and evolve based on feedback from the initial ones.

Drupal Volunteers (Drupalkars)

  • Dipen Chaudhary
  • Rajeev Karajgikar
  • Parsad Shirgaonkar
  • Nikhil Kale
  • Abhishek Nagar

Colleges

  • PICT (TBC)
  • COEP (TBC)
  • Wadia College (TBC)
  • Narangkar (not sure of the name; TBC)
  • Symbiosis ( Abhishek to confirm)

College Coordinators

  • Arun Nair
  • Amit Karpe
  • Vipul (PICT contact)

Timeline

The tentative timeline for executing the pilot is 5-6 weeks; this may be adapted, and this page will be updated accordingly.

Notes

Drupalkar is currently only for Pune, but may be taken to new cities in India by Drupal evangelists elsewhere. I am cross-posting this to the India group for feedback on the approach.

Comments, feedback on the approach, and volunteering are all important for the success of Drupalkar. Most importantly, we need students and organizers to bring colleges on board for the Drupalkar pilot.

(Comments on this post are closed. Please comment at the original website)

Meeting Report: Pune Rails Meetup (Dec 2009)

(This is a report of Pune Ruby on Rails meetup that happened on 12th December. This report was originally written by Gautam Rege on his blog, and is reproduced here with permission for the benefit of PuneTech readers.)

Click on the logo to find all punetech articles about Rails in Pune

It was great to be a part of the Pune Rails Meetup which was held yesterday (19th December, 2009) at ThoughtWorks, Pune. It was an idea initiated by Anthony Hsiao of Sapna Solutions which has got the Pune Rails community up on their feet. Helping him organize was a pleasure!

It was great to see almost 35 people at this meet; it was probably more than we expected. It was also heartening to see a good mix in the crowd: professionals in Rails, students working in Rails and students interested in Rails, not to forget entrepreneurs, who were very helpful.

Proceedings began with Vincent and _______ (please fill in the gaps; I am really lousy with names) from ThinkDRY, who gave an excellent presentation on BlankApplication, a CMS++ that they are developing. I say CMS++ because it's not just another CMS but has quite a lot of ready-to-use features that get developers jump-started. There were interesting discussions about how 'workspaces' are managed and how it is indeed easier to manage websites.

After this technical talk, I spoke next on my experience at the Lone Star Ruby Conference in Texas. I tried to keep the session interactive with the intention of telling everyone how important it is to know and use Ruby effectively while working in Rails. Dave Thomas’s references to the ‘glorious imperfection’ of Ruby did create quite a buzz. To quote a little from Dave’s talk:

name {}

This is a method call that takes a block as a parameter, but the following line is a method call that takes a hash as a parameter! The parentheses make all the difference!

name ( {} )

Similarly, the following line is a call to a method m() whose result is divided by n, and then divided by o:

m/n/o

but add a space after m, and it's a method m() which takes a regular expression as a parameter!

m /n/o

It was nice to see everyone get involved in these interactive sessions. More details about my experience at LSRC are here.
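The snippets above can be collected into a runnable sketch. The helper methods `name` and `m` below are hypothetical stand-ins, defined only so that Ruby's parsing differences become observable:

```ruby
# Hypothetical helpers, defined purely to make Ruby's parsing visible.
def name(arg = nil)
  return :block if block_given?
  return :hash  if arg.is_a?(Hash)
  :other
end

def m(arg = nil)
  # With no argument, act like a number so division works; with an
  # argument, return it so we can see what Ruby actually passed in.
  arg.nil? ? 100 : arg
end

n = 10
o = 5

with_block = name {}    # the {} is parsed as an (empty) block
with_hash  = name({})   # the {} is parsed as an empty hash argument

division = m / n / o    # m() / n / o  =>  100 / 10 / 5  =>  2
regexp   = m(/n/o)      # the argument is the regular expression /n/o
```

Writing `m /n/o` without the parentheses triggers Ruby's "ambiguous first argument" warning, which is exactly the 'glorious imperfection' being discussed.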

After this there was another technical talk, about a multi-app architecture that has been developed by Sapna Solutions. Anthony and Hari gave a talk on this and it was very interesting to see it work. Using open source applications like shopify, CMSes and other social networking apps with a shared plugin and a single database, it's possible to create a mammoth application which is easily customizable and scalable.

Hari did mention a few problems, like complexity in migrations and custom routes, which they currently 'work around' but would prefer a cleaner approach for. Some good suggestions were provided by Scott from ThoughtWorks regarding databases. I suggested some meta-programming to align models. Working with git submodules and using rake scripts to sync up data, this indeed seems to have a lot of potential.

There were some new entrepreneurs from ______ who have already developed a live application in Merb, which they discussed and explained in detail. It was good to hear how they managed performance and scalability testing. The Q&A forum, which was the next event, was extremely interactive. Some of the discussions were:

Which are really great CMS in Rails?

There were some intense discussions regarding RadiantCMS, Adva and even BlankApp. The general debate was 'programmable CMS' vs WYSIWYG. Those who prefer content management prefer CMSes like Drupal and Joomla; those who prefer customization via programming and code prefer Radiant. This topic could not be closed and is still open for discussion. Do comment in with your views; I am a Radiant fan ;)

What about testing? Cucumber, Rspec, others?

Usually it's still ad hoc; testing is expensive for smaller firms, so ad hoc black-box testing is what gets done. I opined that Cucumber and RSpec ROCK! Cucumber is great for scenario testing and for testing controller logic and views. RSpec is great for Direct Model Access, and Cucumber can make great use of Webrat for browser testing.

In RSpec, when do we use mocks and stubs?

It was suggested that mocks and stubs should be used when the model and code are not ready. If the code is ready, it's probably enough to use it directly rather than mocks and stubs. Comments welcome on this!
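A minimal plain-Ruby sketch of that suggestion (class names are hypothetical, and no mocking library is used): while the real collaborator does not exist yet, a hand-rolled stub lets the surrounding code be exercised.

```ruby
# Hypothetical: the real PaymentGateway is not written yet, so
# OrderProcessor is tested against a hand-rolled stub instead.
class OrderProcessor
  def initialize(gateway)
    @gateway = gateway
  end

  def process(amount)
    @gateway.charge(amount) ? :paid : :failed
  end
end

# A stub standing in for the yet-to-be-written gateway.
class StubGateway
  def charge(amount)
    amount > 0  # pretend any positive amount succeeds
  end
end

processor = OrderProcessor.new(StubGateway.new)
processor.process(100)  # => :paid
processor.process(0)    # => :failed
```

Once the real gateway exists, the stub can simply be replaced by it, which is the "don't mock what's already ready" point made above.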

How do you do stress testing?

Stress testing, concurrency testing and performance testing can be done using httperf. It was interesting to note that ____ have actually done their own implementation for stress and concurrency testing. I recommended they open source it.

How are events, scheduled job and delayed jobs handled?

This was my domain :) Using delayed_job is the way to go. Following the leaders (GitHub) and using Redis and Resque would be great too, but definitely not BackgrounDRb or direct cron!
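The idea behind delayed_job can be sketched in a few lines of plain Ruby (a toy illustration, not the actual delayed_job API): work is enqueued as data during the request, and a worker loop executes it later, outside the request cycle.

```ruby
# A toy sketch of the pattern delayed_job implements: jobs are queued
# during the request and executed later by a worker.
class JobQueue
  def initialize
    @jobs = []
  end

  def enqueue(&work)
    @jobs << work
  end

  # In delayed_job this loop runs in a separate worker process.
  def work_off
    done = 0
    until @jobs.empty?
      @jobs.shift.call
      done += 1
    end
    done
  end
end

queue = JobQueue.new
sent = []
queue.enqueue { sent << "welcome email" }
queue.enqueue { sent << "invoice pdf" }
done = queue.work_off  # => 2
```

In the real thing the queue is backed by a database table (or, with Resque, by Redis), so jobs survive process restarts, which is precisely why it beats a raw cron entry.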

What project management tools do you use? Pivotal Tracker, Trac, Mingle?

Pivotal Tracker suits startup needs. Mingle rocks but becomes expensive. Scott? ;) Dhaval from TW mentioned how easy it was to coordinate via Mingle with their 200-strong team over distributed geographies.

Which SCM do you use? git, svn, cvs?

People have been very comfortable with git, and more and more are migrating from svn to git. It was heartening to see that nobody uses CVS :) Jaju (I may have misspelt the name) gave an excellent brief about how commits and diffs can be squashed and 'diff'ed against another repository before the final merge and push to the master. Dhaval gave an idea of how they effectively used git to manage their 1GB source code (wow!)

Some pending questions – probably in next meet-up

  1. Which hosting service do you use and why?
  2. TDD or BDD?

Suggestions are welcome!

About the Author – Gautam Rege

Gautam Rege is the co-founder and managing director at Josh Software, Pune.

Gautam has an engineering degree in Computer Science from PICT, Pune. In his 9 years in the IT industry, he has worked in companies like Symantec, Zensar and Cybage before starting Josh 2 years ago.

Gautam's technical knowledge spans various languages, from C, C++, Perl and Python to Java, as well as software expertise in industry domains like finance, manufacturing, insurance and even advertising.

As with the company name, Gautam has a lot of ‘josh’ about new and emerging technologies. His company is one of the few which works almost exclusively in Ruby on Rails, the cutting edge web technology that has taken the industry by storm.

(Comments on this article are closed. Please comment at the location of the original article)


Conference report: The 4th IndicThreads conference on Java Technologies

(The IndicThreads conference on Java Technologies was held in Pune last weekend. This conference report by Dhananjay Nene was published on his must-read blog and is re-published here with permission. The slides used during the presentations can be downloaded from the conference website here and are also linked to in context in Dhananjay’s report below. In general, PuneTech is interested in publishing reports of tech events and conferences that happen in Pune, as long as they go into sufficient technical depth, and especially if links to slides are available. So please do get in touch with us if you have such a report to share.)

The annual indicthreads.com Java technology conference is Pune's best conference on matters related to Java technologies. I looked forward to attending it and was not disappointed one bit. The latest edition was held about 3 days ago, on Dec 11th and 12th, and this post reviews my experiences at it.

As with any other conference, usually something or the other isn't quite working well in the morning, and I soon discovered that the wireless network was swamped by usage. There were some important downloads that needed to be completed, so my early morning was spent attempting to get these done, which meant I missed most of Harshad Oak's opening session on Java Today.

The next one I attended was Groovy & Grails as a modern scripting language for Web applications by Rohit Nayak. However, I soon discovered that it (at least initially) seemed to be a small demo of how to build applications using Grails. Since that was something I was familiar with, I moved to the alternative track in progress.

The one I switched to, even as it was in progress, was Java EE 6: Paving the path for the future by Arun Gupta. Arun had come down from Santa Clara to talk about the new Java EE6 spec and its implementation by Glassfish. Arun covered a number of additional or changed features in Java EE6 in sufficient detail for anyone who got excited by them to go explore them further. These included web fragments, the web profile, EJB 3.1 Lite, increased usage of annotations (leading to web.xml now being optional), and a number of points on specific JSRs now a part of Java EE6. Some of the things that excited me more about Glassfish were (a) OSGi modularisation and programmatic control of specific containers (e.g. Servlet, JRuby/Rails etc.), (b) embeddability, and (c) lightweight monitoring. However, the one that excited me the most was the support for hot deployment of web apps in development mode, by allowing the IDEs to automatically notify the running web app, which in turn automatically reloaded the modified classes (even as the sessions continued to be valid). The web app restart cycle, in addition to the compile cycle, was always one of my biggest gripes with Java (second only to its verbosity), and that seemed to be going away.

I subsequently attended Getting started with Scala by Mushtaq Ahmed from ThoughtWorks. Mushtaq is a business analyst and not a professional programmer, but has been keenly following the developments in Scala for a couple of years (and, as I later learnt, Clojure a bit as well). Unlike a typical language-capability survey, he talked only about using the language for specific use cases, a decision which I thought made the presentation extremely useful and interesting. The topics he picked were (a) functional programming, (b) DSL building and (c) OOP, only if time permitted. He started with an example of programming/modeling the Mars Rover movements, using functions and higher-order functions to do so. Looking back, I think he spent less time than he could have on transitioning from the requirements into the code constructs, and on what he was specifically setting out to do with higher-order functions. However, the demonstrated code was nevertheless interesting and showed some of the power of Scala when used to write primarily function-oriented code. The next example he picked was a parking lot attendant problem, where he started with Java code that was a typical implementation of the strategy pattern. He then took it through 7-8 alternative, increasingly functional implementations in Scala. This one was much easier to understand and yet again demonstrated the power of Scala quite well in terms of functional programming. On to DSLs: Mushtaq wrote a simple implementation of a "mywhile", a classical "while" loop, as an example of using Scala for writing internal DSLs. Finally, he demonstrated the awesome power of the built-in support for parser combinators for writing an external DSL, and also showed how a particular Google Summer of Code problem could be solved using Scala (again for writing an external DSL). A very useful and thoroughly enjoyable talk. (Here is a link to the code used in this presentation. -PuneTech)

The brave speaker for the post-lunch session was Rajeev Palanki, who dealt both with overall IBM directions on Java and a little with the My developerWorks site. In his opinion, Java was now (post JDK 1.4) on the plateau of productivity after all the early hype, and IBM was now focused on scaling up, scaling down (making it easier to use at the lower end), open innovation (allowing for more community-driven innovation) and real-time Java. He emphasised IBM's support for making Java more predictable for real-time apps, and stated that Java was now usable for mission-critical applications, referring to the fact that Java is now used in a USS destroyer. He referred to IBM's focus on investing in Java tooling that works across different JRE implementations: tools such as GCMV, MAT, and the Java Diagnostic Collector. Finally, he talked about the IBM My developerWorks site, at one stage referring to it as the Facebook for geeks.

The next session was Overview of Scala Based Lift Web Framework by Vikas Hazrati, Director, Technology at Xebia. Another thoroughly enjoyable session. Vikas dealt with many aspects of the Lift web framework, including the mapper, the snippets, the usage of actors for Comet support, etc. I was especially intrigued by how snippets, which act as a bridge between the UI and the business logic, have a separate abstraction for themselves in the framework, and by how the constructs and functionality in that layer are treated so differently than in other frameworks.

I subsequently attended Concurrency: Best Practices by Pramod Nagaraja who works on the IBM JRE and owns the java.nio packages (I think I heard him say owns). He talked about various aspects and best practices related to concurrency and one of the better aspects of the talk was how seemingly safe code can also end up being unsafe. However he finished his session well in time for me to quickly run over and attend the latter half of the next presentation.

Arun Gupta conducted the session Dynamic Languages & Web Frameworks in GlassFish, which covered the support for various non-Java environments in Glassfish, including those for Grails/Groovy, Rails/JRuby, Django/Python et al. The impression I got was that the Glassfish team is extremely serious about support for non-Java applications as well, and is dedicating substantial effort to making Glassfish the preferred platform for such applications too. Arun's blog, Miles to go …, is most informative on a variety of topics related to Glassfish, for both Java and non-Java related aspects.

The last talk I attended during the day was Experiences of Fully Distributed Scrum between San Francisco and Gurgaon by Narinder Kumar, again from Xebia. Since a few in the audience were still not aware of agile methodologies (gasp!), Narinder gave a high-level overview before proceeding to the specific challenges his team had faced in implementing Scrum in a scenario where one team was based in Gurgaon, India and another in San Francisco, US. To be explicit, he wasn't describing the typical scrum-of-scrums approach, but instead a mechanism wherein the entire set of distributed teams is treated as a single team with a single backlog and common ownership. This required some adjustments, such as, when there were no overlapping working hours, a scrum meeting in which only one person from one location and everyone from the other would take part. A few other such adjustments to the process were also described. The presentation ended with some strong metrics showing how productivity was maintained even as the activities moved from a single location to a distributed model. Both during the presentation and subsequently, Narinder described some impressive associations with senior Scrum visionaries, and also some serious interest in their modified approach from some important companies. However, one limitation of the model I could think of was that it is probably better geared to work where you have developers in only one of the two locations (offshoring). I perceived the model as a little difficult to make work if developers were located across all locations (though that could end up being just my view).

The second day started with a panel discussion on the topic Turning the Corner, between Arun Gupta, Rohit Nayak and Dhananjay Nene (that's yours truly), moderated by Harshad Oak. It was essentially a discussion about how we saw some of the Java, and even many non-Java, technologies evolving over the next few years. Suffice it to say that one of the strong agreements was clearly the arrival of Java the polyglot platform, as compared to Java the language.

The next session was Developing, deploying and monitoring Java applications using Google App Engine by Narinder Kumar. A very useful session describing the characteristics, opportunities and challenges of using Google App Engine as the deployment platform for Java-based applications. One of the takeaways from the session was that, subject to specific constraints, it is possible to use GAE as the deployment platform without creating substantial lock-in, since many of the Java APIs are supported by GAE. However, there are a few gotchas along the way in terms of specific constraints, e.g. using joins.

I must confess to having been a little disappointed with Automating the JEE deployment process by Vikas Hazrati. He went to great depths on the considerations a typical J2EE deployment monitoring tool should take care of, and clearly demonstrated having spent a lot of time thinking through many of the issues. However, the complexities he addressed started to get into realms which only a professional J2EE deployment tool writer would get into. That made the talk a little less interesting for me. Besides, there was another interesting talk going on simultaneously which I was keen on attending as well.

The other talk I switched to halfway was Create Appealing Cross-device Applications for Mobile Devices with Java ME and LWUIT by Biswajit Sarkar (who has also written a book on the same topic). While keeping things simple, Biswajit explained the capabilities of Java ME. He also described LWUIT, which allows the creation of largely similar UIs across different mobile platforms. He explained that while default Java ME uses native rendering, leading to a differing look and feel across mobile handsets just like Java AWT, using LWUIT allows for a Java Swing-like approach where the rendering is performed by the LWUIT library (did he say around 300kb??), thus allowing for a more uniform look and feel. He also showed sample programs and how they worked using LWUIT.

Allahbaksh Asadullah then conducted the session on Implementing Search Functionality With Lucene & Solr, where he talked about the characteristics and usage of Lucene and Solr. It was very explicitly addressed at the very beginners to the topic (an audience I could readily identify myself with) and walked us through the various characteristics of search, the different abstractions, how these abstractions are modeled through the API and how some of these could be overridden to implement custom logic.

How Android is different from other systems – An exploration of the design decisions in Android by Navin Kabra was a session I skipped. However I had attended a similar session by him earlier so hopefully I did not miss much.

However Navin did contribute occasionally into the next session Java For Mobile Devices – Building a client application for the Android platform by Rohit Nayak. Rohit demonstrated an application he is working on along with a lot of the code that forms the application using Eclipse and the Android plugin. A useful insight into how an Android application is constructed.

As the event drew to a close, the prizes were announced, including those for the IndicThreads Go Green initiative. A thoroughly enjoyable event, leaving me even more convinced to make sure to attend next year's session, making it my third in a row.

(Comments on this post are closed. Please comment at the site of the original article.)

Pune's SMSONE gets techcrunched: Micro-local news to make Silicon Valley jealous

Pune-based company SMSONE (see previous PuneTech coverage) has just been covered by TechCrunch, one of the most influential and widely read tech blogs in the world (as a result of an introduction by PuneTech).

Sarah Lacy, editor-at-large at TechCrunch was in India for about a month in November, and she was in Pune for a day, hosted by Abinash Tripathy. During her Pune visit, PuneTech introduced her to a bunch of local companies, and SMSONE was one of them.

Excerpts from her article:

But every once in a while I find a company that hits the trifecta: It’s addressing a big problem locally, it’s something I don’t think is offered in the US, and…. I want it. And when a product in undeveloped, chaotic, messy India can make someone in Silicon Valley feel jealous, you know that entrepreneur has come up with something good.

I'm talking about SMSONE Media, a company I met in Pune about a week ago. Like most of the impressive companies I saw in India, it's aimed squarely at the base of the pyramid and is using basic SMS to deliver services to people in some of India's most unconnected areas. It was started by Ravi Ghate, who proudly points out that none of his core team graduated from high school, much less attended an IIT or IIM. (Typically not something you brag about in India.)

Later, the article quotes Ravi Ghate, CEO of SMSONE, on their future plans:

Right now Ghate’s operation is in 400 communities, reaching roughly 400,000 readers. He just got an investment from the government of Bangalore to boost that reach to five million readers in the next four months.

Ghate is clear that the money will be used strictly to reach more people. The company already breaks even and Ghate makes enough to pay his basic living expenses. He doesn't care about fancy cars or clothes. It wasn't too long ago that he was one of those disadvantaged kids, selling flags and berries on the side of the road and being told to go away. He still regularly travels between villages by bus and stays in $5-a-night hotels.

FYI: There’s one detail that her article gets wrong. The article says:

The economics work out like this: Out of a 1000 rupee ad sale, 300 of it goes to the reporter, and Ghate pays him an additional 50 rupees for each news story. That adds up to a nice income for a village kid

Actually, of the Rs. 1000 that an ad earns, Rs. 300 is kept by SMSONE and the rest goes to the reporter. But other than this inaccuracy, the article does a great job of capturing the essence of SMSONE.


PuneChips Editor’s Blog – SystemVerilog and Designer Productivity

The most recent PuneChips event was easily the most successful one in the short history of the group. Over 50 engineers attended the "SystemVerilog" talk by Clifford Cummings, President of Sunburst Design and SystemVerilog industry guru. A big thank-you to a few folks who made this possible is in order: first off, Parag Mehta of QLogic for connecting us with Cliff; secondly, in addition to Parag, Pravin Desale and Deepak Lala of LSI, and Jagdish Doma of Virage Logic, for driving the attendance. Last, but not least, we must also thank Cliff for taking us through a complex topic in a very engaging manner. Cliff certainly held the audience in rapt attention through an hour of highly technical discussion. The Q&A session was also very engaging. Of course, Cliff, being the industry celebrity that he is, was mobbed by engineers asking questions after his speech.

It is very clear that SystemVerilog is targeted at improving designer productivity. Failing productivity due to increasing design complexity is one of the biggest challenges faced by chip designers today, and it is not at all surprising that the EDA tool industry is focused on rectifying this. The chart below (source: SEMATECH) shows a rather grim picture: while design complexity has been growing at 58% CAGR, productivity has been increasing at only 21% CAGR. It is obvious that tools which fill this gap will be in great demand.

Failing Designer Productivity (Source: SEMATECH)

The reasons for increasing design complexity are manifold: shrinking geometries allow designers to add more and more elements to the chip, making the entire process challenging. The number of IP cores per chip has grown from ~30 in 2003 to over 250 in 2006, and possibly much more today (source: EETimes). In addition, a big bull's eye has been painted on power consumption numbers, and most chips now must be designed using low-power techniques. Plus, increasing complexity means that chip verification becomes more complex; 50% of all ASIC designs today require respins due to functional/logic errors (source: Collett International Research).

Rather than a single solution, it is very likely that a multitude of innovative solutions addressing individual problems will emerge. For example, better modeling techniques that can give a very accurate QoR estimate at the architecture stage itself can reduce design complexity downstream. Languages such as SystemVerilog literally reduce the lines of code that a designer or verification engineer must write, thus boosting productivity. The time may also be right for ESL design, which has been around for a while, as conventional techniques fail to keep up.

All in all, we live in very interesting times. Faster and smaller is not always for the better. The industry must innovate and rise up to the economic and design challenges if it is to survive and prosper.


Analyzing Pune’s top twitter users

Twitter logo. Image via Wikipedia

(Twitter has quickly become one of the most important new methods of communication, and Pune's techies have taken to it quite enthusiastically. As its popularity grows, and more and more people find out about its utility as a medium of communication, conversation and networking, as a source of news, or as a source of information about interesting hobbies or people, the number of people on twitter keeps increasing. One of the questions most people have is "Whom should I follow?", and the related question is "Who are the top twitterers of Pune?" That is a difficult question to answer, because everybody's criteria are bound to be different, and existing "objective" mechanisms of measuring this are not really that good. Last week twitter released lists, and Dhananjay Nene argues that lists are a new way of measuring the "follow-worthiness" of twitter accounts. With this in mind, he analyzed who Pune's top twitterers would be according to a number of different criteria. He published some results of his investigation on his blog /home/dhananjay, and they are reproduced here with permission for the benefit of PuneTech readers.

This should be interesting to you for a number of different reasons. First, of course, this gives a list of the top twitterers in Pune. It is also an example of how a simple question can get quite complicated when you try to get computers to find the answer – and the approaches taken by different algorithms and their results are interesting to see. Finally, I think this is a sign of things to come – I’m convinced that twitter will be an integral part of the communications of the future, and twitter lists are an important way in which we will separate out the spammers and idiots from the useful content on twitter.

And, oh, by the way, are you following PuneTech on twitter? You should; there are info and links in the PuneTech twitter feed that will not be found on the PuneTech site. (And if you're not on twitter at all, then please crawl out of your cave and get with the program.)

Anyway, here’s Dhananjay’s article.)

So twitter launched lists, and many believe these will be a new mechanism for computing reputation, instead of the current de facto measure, follower counts. It is no secret that using follower counts as a measure of effectiveness on twitter is an extraordinarily error-prone (and brave) exercise, for obvious reasons. Given the appearance of twitter lists, I was keen on figuring out whether there is a way to reasonably measure the effectiveness of a twitter id. This post details the exercise I went through. While there could be discussion around the exact semantics of such a computation, and whether the results are consistent with everyone's expectations, let me assert that I find the result sufficiently superior to anything else I've seen or have been able to imagine so far. And that may stem from, or despite, the fact that two of my twitter handles (@dnene and @d7y) feature in this list.

As an input I took the top 50 handles from Pune from twittergrader.com. Why top 50? Only part of the process was automated; the rest required manual input, and I did not want to spend too much time on data entry. This also gives you the twitter grader grade. I subsequently looked at the reputation of each handle on Klout, looked at the lists which included the handle, and finally also looked at the twitter rank as expressed by yet another site, twitter-friends.com. I computed rankings using each of these. I finally summed all the ranks, and created a composite rank based on the sums. The interesting aspect of this computation was not just the end results but also some of the intermediate results.
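In code, the composite computation amounts to something like this (the handles and per-source ranks below are made up purely for illustration):

```ruby
# Hypothetical handles, each with a rank from every source
# (e.g. twitter grader, Klout, lists):
ranks = {
  "alice" => [1, 3, 2],
  "bob"   => [2, 1, 1],
  "carol" => [3, 2, 3],
}

# Sum each handle's per-source ranks, then order handles by that sum:
# the resulting order is the composite rank.
composite = ranks
            .map { |handle, rs| [handle, rs.sum] }
            .sort_by { |_, total| total }
            .map { |handle, _| handle }
# composite => ["bob", "alice", "carol"]
```

Summing ranks rather than raw scores is what lets sources with very different scales (grades, Klout scores, list counts) be combined on an equal footing.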

So without further ado – here’s what I found

Ranked as per twitter grader

  1. shinils
  2. arthut
  3. indianguru
  4. sandeepjain
  5. tmalhar
  6. brajeshwar
  7. rohit_shah
  8. ghoseb
  9. rkartha
  10. prateekgupta
  11. ajinkyaforyou
  12. gauravsaha
  13. inkv
  14. aparanjape
  15. scepticgeek
  16. meetumeetu
  17. nishantmodak
  18. czaveri
  19. phpcamp
  20. ngkabra

The ranks based on follower counts or twitter grader were not well correlated with the other ranks. To my mind there is sufficient rationale to question the effectiveness of both follower count and twitter grader grade as measures of the ability to reach, influence, or engage with others, even though the twitter grader grade is slightly better than a raw follower count. Perhaps that is why the other rankings turned out so differently.

Ranked by Klout

  1. brajeshwar
  2. scepticgeek
  3. gauravsaha
  4. ichaitanya
  5. sahilk
  6. indianguru
  7. irohan
  8. rkartha
  9. phpcamp
  10. dnene
  11. ghoseb
  12. ngkabra
  13. prateekgupta
  14. d7y
  15. trakin
  16. aparanjape
  17. adityab
  18. punetech
  19. inkv
  20. nishantmodak

To my lay reading, this ranking placed a stronger emphasis on people who engaged with others, were conversational, and had a high update count as well.

Ranking by Twitter Lists

  1. sandygautam
  2. indianguru
  3. scepticgeek
  4. dnene
  5. brajeshwar
  6. phpcamp
  7. ghoseb
  8. adityab
  9. inisa
  10. rkartha
  11. aparanjape
  12. gauravsaha
  13. prateekgupta
  14. meetumeetu
  15. punetech
  16. ngkabra
  17. trakin
  18. freemanindia
  19. aaruc
  20. rush_me

To me this reflected not the spread of the following as much as the strength of the following. Notice how @sandygautam, who focuses tightly on psychology and is a well-respected twitterer in that area, moves to the top (in a rather dominating way, I might add).

Rank using Twitter Rank computed by Twitter-Friends

  1. scepticgeek
  2. ghoseb
  3. prateekgupta
  4. gauravsaha
  5. aaruc
  6. dnene
  7. rkartha
  8. adityab
  9. aparanjape
  10. sandygautam
  11. trakin
  12. d7y
  13. meetumeetu
  14. irohan
  15. aditto
  16. clickonf5
  17. rush_me
  18. sahilk
  19. punetech
  20. brajeshwar

This is an interesting metric, and while I could not clearly identify what drove it, I would certainly be willing to lend an ear if you want to come up with a suggested rationale.

So the final 20 Pune power twitterers, based on a composite of the three metrics – which in my perception is not terribly different from a list I would have come up with using my gut feel (though perhaps with different rankings) – is … drumroll … drumroll …

Pune power twitterers

  1. scepticgeek
  2. gauravsaha
  3. ghoseb
  4. dnene
  5. rkartha
  6. brajeshwar
  7. prateekgupta
  8. indianguru
  9. adityab
  10. aparanjape
  11. sandygautam
  12. phpcamp
  13. trakin
  14. sahilk
  15. d7y
  16. irohan
  17. ngkabra
  18. punetech
  19. meetumeetu
  20. ichaitanya

Note: All the computation results are visible in the attached PDF. In a few cases klout ratings or twitter-friends rankings were not available; in such cases I have applied a klout rating of 0 and a twitter-friends ranking of 999999. Obviously this substantially reduces the probability of such handles appearing in the overall rankings – but there was no other reasonable option I could think of.

Disclaimer: At the end of it all, I am certain there can be a number of views on how such an exercise could be conducted. There might even be some complaints. Being aware of that, I list the results of what I believe to be a “fair” exercise; whether it is a “just” exercise is left to the reader. Also be aware that I have two of my own twitter handles in the list above. You may choose to believe my assurance that I did not tweak the logic based on a first pass of the results – the logic I decided to apply was not changed once the results were visible.

(Comments are closed on this article. Please comment at the original article.)


Pune’s KQInfoTech is porting Sun’s ZFS File-System to Linux

Pune-based KQInfoTech is working on porting Sun’s ZFS file-system to the Linux platform. ZFS is arguably one of the best file-systems available today, and Linux is one of the most widely used server operating systems among new startups, so having ZFS available on Linux would be great. And with many years of experience at Veritas building VxFS, another of the best file-systems in the world, the founders of KQInfoTech have the technical background to do a good job of this. Check out the full announcement on their blog:

We have a ZFS building as a module and the following primitive operations are possible.

  • Creating a pool over a file (devices not supported yet)
  • Zpool list, remove
  • Creating filesystems and mounting them

But we are still not at a stage where we can create files and read and write to them.

See the full article for more details, and for some interesting issues related to the license compatibility between ZFS and Linux.

About KQInfoTech

Pune-based KQInfoTech is an organization started by Anurag Agarwal and Anand Mitra, both of whom chucked high-paying jobs in the industry because they felt there was a desperate need to work on the quality of the students being churned out by our colleges. For the last 2 years or so, they have been trying various experiments in education at the engineering college level. All their experiments are based on one basic premise: students’ ability to pay should not be a deterrent – in other words, the offerings should be free for the students, with KQInfoTech focusing on finding alternative ways to pay for the costs of running the course. See all PuneTech articles related to KQInfoTech for more details.


Business Continuity Management Lifecycle and Key Contractual Requirements

(This overview of Business Continuity Management is a guest post by Dipali Inamdar, Head of IT Security in Geometric)

In emergency situations like pandemic outbreaks, power failures, riots, strikes, and infrastructure failures, it is important that your business does not stop functioning. A plan to ensure this is called a Business Continuity Plan (BCP), and it is of prime importance to your business to ensure minimum disruption and smooth functioning of your operations. Earlier, most companies would document business continuity plans only if their clients asked for them, and would focus mainly on IT recovery. But scenarios have changed now: corporations of all sizes have realized the importance of keeping their business functioning at all times, and hence are working towards a well-defined business continuity management framework. Business continuity (BC) is often understood as a process to handle events that could disrupt business. However, BC is more than just recovery: the plan should also ensure proper business resumption after recovering from the disruption.

Business continuity management is a continuous lifecycle, as follows:

Business Continuity Planning Lifecycle
Click on the image to see it in full size

How does one start with BCM?

Business Impact Analysis (understanding the organization)

The first step is to conduct a Business Impact Analysis. This helps you identify critical business systems and processes, and how their outage (downtime) could affect your business. You cannot have plans in place for all processes without considering the financial investment needed to put them in place. The CEO’s inputs and client BC requirements also serve as inputs to the impact analysis.

Defining the plan (Determining BCM strategy)

The next step is to identify the situations that could lead to disruption of the identified critical processes.

The situations could be categorized as:

  • Natural and environmental: earthquakes, floods, hurricanes, etc.
  • Human related: strikes, terrorist attacks, pandemic situations, thefts, etc.
  • IT related: critical systems failure, virus attacks, etc.
  • Others: business competition, power failure, client BC contractual requirements

It might not be feasible to have plans for each and every situation, as implementing the defined plans needs to be practically possible. After the situations have been identified, one needs to identify the different threats, each threat’s severity (how serious the impact on the business will be if the threat materializes) and its probability of occurrence (the likelihood of the threat materializing). Based on threat severity and occurrence levels, the critical risks are identified.
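One simple way to operationalize this prioritization is to score each threat as severity times likelihood and treat everything above a threshold as critical. The scheme below is a hypothetical illustration – the threats, 1–5 scales, and threshold are assumptions, not values prescribed by the author:

```python
# Prioritize threats by severity x likelihood.
# The threats, scales, and threshold below are hypothetical examples.
threats = [
    # (name, severity 1-5, likelihood 1-5)
    ("Earthquake",    5, 1),
    ("Power failure", 3, 4),
    ("Virus attack",  4, 3),
    ("Strike",        2, 2),
]

CRITICAL_THRESHOLD = 10  # scores at or above this are treated as critical risks

# Score each threat and keep the ones crossing the threshold.
scored = [(name, severity * likelihood) for name, severity, likelihood in threats]
critical = [name for name, score in scored if score >= CRITICAL_THRESHOLD]
print(critical)  # → ['Power failure', 'Virus attack']
```

With this scheme, a rare but severe event (the earthquake) scores lower than a moderately severe but frequent one (the power failure), which matches the article’s point that mitigation effort should go to the risks that combine severity with likelihood.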

Implementing the plan (Developing and implementing BCP response)

The identified risks and additional client-specific BCP requirements serve as inputs to the creation of BCPs. BCPs should focus on mitigation plans for the identified risks, and should be comprehensive, detailing the roles and responsibilities of all the response teams. A proper budget needs to be allocated. Once the plan is documented, it should be implemented.

The different measures prescribed by a BCP could include having redundant infrastructure, signing Service Level Agreements (SLAs) with service providers, having backup power supply, sending backup tapes to offshore sites, training people in cross skills, and keeping proper medicines or masks on hand for pandemic situations.

A BCP should also have proper plans in place to resume business as usual. Business resumption is a critical and very important aspect of the business continuity framework.

Testing and improving plan (Exercising, maintaining and reviewing)

Once the plans are documented and implemented, they should be tested regularly. The tests could be scheduled, or conducted as and when the need arises. One can simulate different tests: moving people to other locations, taking the primary infrastructure down, testing UPS and diesel generator capacity, calling-tree tests, evacuation drills, having senior management backups take decisions, transport arrangements, etc.

The tests will help you identify areas of the BCP which need improvement. The gaps between the expected and actual results need to be analyzed, and the test results need to be published to senior management. The plan needs to be reviewed regularly to incorporate the latest threats and to have mitigations for the critical ones, resulting in a continuous lifecycle. One can schedule internal audits, or apply for BS25999 certification, to ensure proper compliance with BCP requirements.

Pune faces threats of irregular power supply, pandemic outbreaks, etc., which could lead to business disruptions. One needs to have detailed plans for critical threats to ensure continuity of critical operations. The plans should also have detailed procedures to ensure proper business resumption. Plans may be documented, but actual action during emergency situations is what counts.

Important note: Contractual requirements

When signing off on specific contractual requirements with clients, the following precautions must be taken:

  • Before signing stringent SLAs, check that there is a provision for exclusions or relaxations during disaster situations, as you will not be able to meet the SLAs during disaster scenarios
  • When BCP requirements are defined in client contracts, the responsibilities and expectations of the clients should also be clearly documented and agreed upon, to ensure effective execution of the BCP
  • BCP requirements can only be effectively implemented when proper budget allocations are planned, so for specific BCP requirements, cost negotiations with the client are important. This is usually ignored, so it is important that the sales team be apprised before agreeing on BCP requirements with the client
  • Do not sign off on vague BCP requirements; they should be clear, specific, and practically achievable
  • Before signing off on any contract which has a penalty clause, review it thoroughly to ensure that compliance with those clauses is practically possible

About the author: Dipali Inamdar

Dipali Inamdar, Head of IT Security at Geometric Ltd, has more than 11 years of experience in the Information Technology and Information Security domains. She is a certified CISA, ISO27001 Lead Auditor, BS25999 Lead Auditor, and ISO20000 internal auditor. She has worked on Information Security and Business Continuity Management across BPO, IT, ITES, and finance sector companies. She currently operates out of Pune and is very passionate about her field. See her LinkedIn profile for more details.


The Venture Center Library for Entrepreneurs and Innovators in Pune

Pune’s resource for startups, the Venture Center, has yet another service that could be valuable for Pune’s startups. The Venture Center Library has been created specifically to support and enhance the entrepreneurial ecosystem in and around Pune. It targets entrepreneurs, scientific researchers, technology innovators, IP & technology commercialization professionals, and venture investors, who can take advantage of its collection of books, periodicals, reports, and research services.

Click on the Venture Center Logo to see all PuneTech articles about Venture Center

Here are key features of the Venture Center Library:

  • ~ 1000 books – with an emphasis on technology innovation, commercialization & entrepreneurship
  • Many *good* magazines (MIT Tech Review, SciAm, etc.)
  • Book collection listed online & searchable: http://www.vcenterlibrary.org/book.php
  • A growing database of electronic articles and e-books
  • Open Mon-Sat, ample parking
  • Internet access, scanning, etc. available
  • Events featuring books, videos, etc. http://www.vcenterlibrary.org/events.php

If you just want to browse/read books at the library itself, it is free until the end of 2009; after that it will cost Rs. 400 per year. If you want to check out books, there’s a Rs. 2000 refundable deposit and a Rs. 400 yearly fee, which allows you to check out 2 books for up to 14 days each. Look here for details of membership and fees.

About Venture Center

Venture Center is an incubator mainly targeted at startups in biotech, chemicals, and material sciences. It has been set up using government funds and is housed on NCL’s premises, but is planned as an independent entity that needs to become self-sustaining in a few years (by taking equity/fees from the startups it helps). Check out the venturecenter tag on PuneTech for all PuneTech articles about Venture Center.


Suggest ways for Pune Techies to collaborate online and win a Google Wave invitation

Update: the competition is over – Sandeep Gautam has won the Google Wave invitation for this suggestion.

The offline tech scene in Pune is thriving, as one look at the PuneTech events listing and the PuneTech calendar will show.

And there are a whole bunch of online places for techies in Pune to hang out:

Most of these are basically mailing lists and forums. I wonder whether there are other ways in which techies in Pune can find other like-minded people and collaborate. Would chat be interesting, like proto.in uses? Or IRC? Should we be focusing on Orkut or Facebook or both? Is there something interesting that can be done with YouTube? Can we use some new technology in new ways to bring people closer together? Maybe Google Wave?

Give your suggestions in the comments section below. The best suggestion gets a Google Wave invitation. You can get the invitation for yourself, or you can use it to invite someone else. If you’re not interested in the invitation, please say so in your comment.

Give a specific suggestion for online collaboration/communication amongst Pune’s techies. Don’t just give the mechanism of collaboration – also give the purpose. For example, saying “use an online chat room” is useless. Much more useful is something like “use an online chat room where students from engineering colleges can ask questions about careers to people from industry.” Also, a suggestion that is easy to implement is much more valuable than one that requires a lot of setup and/or effort. And you get lots of plus points if you’re also willing to drive the effort. (And if you like somebody else’s suggestion and would be willing to help/join that effort, please leave a comment indicating that.)

(Thanks to Amit Somani for graciously agreeing to donate one of his Google Wave invitations for this purpose.)
