I’m liveblogging the Pune OpenCoffee Club meeting organized by nFactorial. There are over 50 people and SEED Infotech’s classroom is overflowing. More chairs are being brought in for the people who are still standing.
Neill Brownstein of Footprint Ventures is giving a pitch about his company.
Footprint has created $30 billion in value over 35 years. 20% IRR. 23 IPOs.
Among other companies, Footprint has invested in our own Veritas Software.
The average run-rate of the companies they fund is currently in the $1 million range.
Next, the Google Corporate Development pitch. They don’t invest in startups; they acquire. But they work closely with some seed funds.
These are the folks who did the acquisition of the Australian company that ultimately became Google Maps.
They are not looking for anything in particular. Consumer, enterprise, early stage, late stage: all are OK. Pre-revenue is fine. But it must be a product company.
Question: do they only acquire companies whose technology is compatible with theirs? Answer: if it needs to integrate with Google products, then yes, the technology does need to be compatible.
In most cases, the acquired company needs to relocate, either to Mountain View in the US or to Bangalore in India.
They sometimes buy a company just because they think it is cool, even if they can’t figure out where it will really fit in.
Registration and Fees: The event is free for all. Register here.
Details: PLUG meeting
The PLUG meeting is open to all; there are no charges or prerequisites to attend. If you are interested in FOSS (Free/Open Source Software), you are welcome at the meeting.
Details: Contrasting Java and Dynamic Languages
Given the increasing interest in dynamic languages such as Python, Ruby and PHP, and the growing perception that they threaten Java, Dhananjay Nene talks about his experience with using these languages and how they distinguish themselves from Java. The session “Contrasting Java and Dynamic Languages” will also discuss the role of Java-based scripting languages such as Groovy, JRuby and Jython.
About the Presenter: Dhananjay Nene has been programming for 17 years. He has been associated with Citicorp Overseas Software Limited and AT&T, and his most recent assignment was as CTO and Head of Product Development for CashTech Solutions. He has worked in a number of domains including banking, telecom, network management, wireless networking and educational software. He has an MBA from IIM Ahmedabad, has worked in senior management positions and has managed large development teams.
In recent years he has focused on building very high-performance Java-based frameworks and solutions. Currently he is developing software using Python. He blogs at http://blog.dhananjaynene.com
Proto.in’s online startup chat
If you don’t feel like leaving your house, you can check out the proto.in online start-up chat event that starts at 3pm.
Registration and Fees: This event is free for everyone. Register at upcoming.
Details:
Hemant Joshi of nFactorial would like to invite POCC members to an informal get-together with special invitees Footprint Ventures and Google Corporate Development on 3rd July at 5pm.
Footprint Ventures will be there to meet startups and talk about what they look for when considering Series A investments.
Neeraj Arora will be representing Google’s Corporate Development and M&A division and will be there to interact with startups.
The venue will be Seed Infotech’s office in Erandwane.
The complete address is:
Seed Infotech
Nalanda building, Opposite Gandhi Lawns,
Erandwane, Pune
Many thanks to Narendra Barhate, CEO of Seed Infotech, for arranging a venue for the meeting.
MultiEyeVision Observer is a product for remote monitoring and remote observation. The Observer is a pre-configured mobile phone which, in addition to functioning as a normal mobile phone, can also be used at any time as a remote observation camera. The high-quality image stream captured by the Observer is viewable in real time from anywhere over the internet.
The Observer can be used for a wide range of business and consumer applications. For example, keep the Observer at home, and keep an eye on your kids while you are at work. Or place the Observer in your office / workshop, and supervise your staff when you are travelling.
The biggest advantages of the MultiEyeVision Observer are its simplicity of use and mobility. There is no cabling / wiring or installation. It can be moved at any time to the place you want to monitor. And all you do is press a button on the mobile to start monitoring.
What: ‘Autodesk Live’, an event for information about the 2009 Autodesk Manufacturing Solutions
When: 8th July 2008, 9:30am to 5:30pm
Where: Le Meridien, Pune
Registration: Free registration here.
‘Autodesk Live’ will give an idea of the kind of impact Autodesk 2009 will have on business. The focus will be on ‘Digital Prototyping’ and how it can be used to:
Connect conceptual design, engineering and manufacturing teams through the use of a single digital model
Simulate the complete product, and better optimise and manage the design
Create accurate digital prototypes that let you validate the design as you work, minimising the need for costly physical prototypes
Footprint Ventures will be visiting Pune next week, on 3rd July. They are an early-stage venture capital fund based out of India (http://footprintventures.com/index.htm). I am trying to arrange a meeting with POCC members where they can informally interact with members and talk for a few minutes about their perspective, what they look for, etc. The tentative plan is to have this meeting from 5:00 PM to 6:00 PM. The venue is TBD; I will confirm the venue and meeting in a few days. If anyone can arrange a place, that would be great.
Footprint is an early stage venture capital fund, based out of India. They are targeting entrepreneurs seeking Series A investment.
The fund is the brainchild of Neill Brownstein, a co-founder of Bessemer Venture Partners. It aims to invest in companies with a strong India focus, i.e. companies with offerings that are either exclusively for India or for India & international markets.
(by Bob Spurzem, Director of International Business, Mimosa Systems, and T.M. Ravi, Founder, President and CEO, Mimosa Systems. This article is reposted with permission from CSI Pune’s newsletter, DeskTalk. The full newsletter is available here.)
In this era of worldwide electronic communication and fantastic new business applications, the amount of unstructured electronic information generated by enterprises is exploding. This growth of unstructured information is driving a significant surge in knowledge worker productivity, and it is creating an enormous risk for enterprises that have no tools to manage it. Content Archiving is a new class of enterprise application designed for managing user-generated, unstructured electronic information. The purpose of Content Archiving is to manage unstructured information over its entire lifecycle, ensuring the preservation of valuable electronic information while providing knowledge workers with easy access to historical information. With finger-tip access to years of historical business records, workers make informed decisions that drive top-line revenue. Workers with legal and compliance responsibility can easily search historical records in response to regulatory and litigation requests, thereby reducing legal costs and compliance risk. Using Content Archiving, enterprises gain “finger-tip” access to important historical information – an important competitive advantage that helps them be successful market leaders.
Unstructured Electronic Information
One of the most remarkable results of the computer era is the explosion of user-generated electronic digital information. Using a plethora of software applications, including the widely popular Microsoft® Office® products, users generate millions of unmanaged data files. To share information with co-workers and anyone else, users attach files to email, and instantly those files are duplicated to unlimited numbers of users worldwide. The University of California, Berkeley School of Information Management and Systems measured the impact of electronically stored information, and the results were staggering. Between 1992 and 2003, they estimated that total electronic information grew about 30% per year. Per user per year, this corresponds to almost 800 MB of electronic information. And the United States accounts for only 40% of the world’s new electronic information.
Email generates about 4,000,000 terabytes of new information each year — worldwide.
Instant messaging generates five billion messages a day (750GB), or about 274 terabytes a year (a quick arithmetic check of this figure follows below).
The World Wide Web contains about 170 terabytes of information on its surface; in volume this is seventeen times the size of the Library of Congress print collections.
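As a quick sanity check of the instant-messaging figure above, the arithmetic can be worked out directly; a 365-day year and decimal units (1 TB = 1000 GB) are my assumptions, not the study’s:

    # Sanity check of the instant-messaging numbers quoted above.
    # Assumptions (mine): a 365-day year and decimal units (1 TB = 1000 GB).
    messages_per_day = 5_000_000_000   # "five billion messages a day"
    gb_per_day = 750                   # "(750GB)"

    tb_per_year = gb_per_day * 365 / 1000
    bytes_per_message = gb_per_day * 1e9 / messages_per_day

    print(f"~{tb_per_year:.0f} TB per year")          # ~274 TB, matching the quoted figure
    print(f"~{bytes_per_message:.0f} bytes/message")  # ~150 bytes, i.e. short text messages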
This enormous growth in electronic digital information has created many unforeseen benefits. Hal R. Varian, a business professor at the University of California, Berkeley, notes that, “From 1974 to 1995, productivity in the United States grew at around 1.4 percent a year. Productivity growth accelerated to about 2.5 percent a year from 1995 to 2000. Since then, productivity has grown at a bit over 3 percent a year, with the last few years looking particularly strong. But unlike the United States, European countries have not seen the same surge in productivity growth in the last ten years. The reason for this is that United States companies are much farther up the learning curve than European companies for applying the benefits of information technology.”
Many software applications are responsible for the emergence of the electronic workplace and this surge in productivity growth, but none more so than email. From its humble beginning as a simple messaging application for users of ARPANET in the early 1970s, email has grown to become the number one enterprise application. In 2006, over 171 billion emails were sent daily worldwide, a 26% increase over 2005, and this figure is forecast to keep growing 25-30% a year through the rest of the decade. A new survey by Osterman Research polled 394 email users: “How important is your email system in helping you get your work done on a daily basis?” 76% reported that it is “extremely important”. The Osterman survey revealed that email users spend on average 2 hours 15 minutes each day doing something in email, and 28% of users spend more than 3 hours per day using email. As confirmed by this survey and many others, email has become the most important tool for business communication and contributes significantly to user productivity.
The explosive growth in electronically stored information has created many challenges and has fundamentally changed the way electronic digital information is accessed. Traditionally, electronic information was managed in closely guarded applications used by manufacturing, accounting and engineering, and accessed by only a small number of trained professionals. These forms of electronic information are commonly referred to as structured electronic information. User-generated electronic information is quite different because it is in the hands of all workers, trained and untrained. User-generated information is commonly referred to as unstructured electronic information. While many years of IT experience have solved the problems of managing structured information, the tools and methods necessary to manage unstructured information, for the most part, do not exist. For a typical enterprise, as much as 50% of total storage capacity is consumed by unstructured data and another 15-20% is made up of email data. The remaining 25-30% of enterprise storage is made up of structured data in enterprise databases.
Content Archiving
User-generated, unstructured electronic information is creating a chasm between the IT staff whose responsibility it is to manage electronic information and the knowledge workers who want to freely access current and historical electronic information. Knowledge workers want “finger-tip” access to years of information, which strains IT’s ability to provide information protection and availability cost-effectively. Compliance officers want tools to search electronic information and preserve it for regulatory audits. And overshadowing everything is the need for information security. User-generated electronic information is often highly sensitive and requires secure access. Unlike information on the World Wide Web, electronic information inside organizations is meant only for authorized access.
Content Archiving represents a new class of enterprise application designed to manage user-generated unstructured electronic information in a way that addresses the needs of IT, knowledge workers and compliance officers. Content Archiving engages seamlessly and continuously with the applications that generate unstructured electronic information to capture it, and provides real-time end-user access for fast search and discovery. The interfaces currently used to access unstructured information (e.g. Microsoft Outlook®) are the same interfaces Content Archiving uses to give end users secure “finger-tip” access to volumes of electronic information.
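As a rough illustration of the capture-and-search pattern just described, consider the following minimal sketch; the ArchivedItem fields, the in-memory word index and the method names are my own simplifying assumptions, not Mimosa’s actual design:

    from collections import defaultdict
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class ArchivedItem:
        item_id: str
        author: str
        created: datetime
        text: str

    class ContentArchive:
        """Toy archive: captures items as they are created and keeps a word index for search."""

        def __init__(self):
            self.items = {}                # item_id -> ArchivedItem
            self.index = defaultdict(set)  # word -> ids of items containing it

        def capture(self, item: ArchivedItem):
            """Called continuously as the source application (e.g. an email server) generates content."""
            self.items[item.item_id] = item
            for word in item.text.lower().split():
                self.index[word].add(item.item_id)

        def search(self, query: str):
            """End-user or auditor search across everything ever captured."""
            words = query.lower().split()
            if not words:
                return []
            hits = set.intersection(*(self.index.get(w, set()) for w in words))
            return sorted(hits)

    # Usage: capture two messages as they are created, then search them later.
    archive = ContentArchive()
    archive.capture(ArchivedItem("msg-1", "ravi", datetime(2008, 6, 1), "Q2 revenue forecast attached"))
    archive.capture(ArchivedItem("msg-2", "bob", datetime(2008, 6, 2), "updated revenue numbers for the audit"))
    print(archive.search("revenue"))  # ['msg-1', 'msg-2']

A real archive would of course persist the index and enforce access control; the point here is only the shape of continuous capture combined with end-user search.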
Content Archiving handles a large variety of user-generated electronic information. Email is the dominant form of user-generated electronic information and is included in this definition. So too are Microsoft Office files (e.g. Word, Excel, PowerPoint) and the countless other file formats such as .PDF and .HTML. Files that are commonly sent via email as attachments are included both in the context of email and as standalone files. In addition to email and files, there are a large number of information types that are not text based, including digital telephony, digital pictures and digital movies. The growing popularity of digital pictures (.JPG), audio and voice mail files (.WAV, .WMA) and video files is paving the way for a new generation of communication applications. It is within reason that, in the near future, full-length video recordings will be shared just as easily as Excel spreadsheets are today. All these user-generated data types fall under the definition of Content.
Content Archiving distinguishes itself from traditional data protection applications. Data protection solves the important problem of restoring electronic information, but does little more. Archiving, on the other hand, is a business intelligence application that solves problems such as providing secure access to electronic information for quick search and legal discovery; measuring how much information exists; identifying what type of data exists; locating where data exists and determining when data was last accessed. For managing unstructured electronic information, Content Archiving delivers important benefits for business intelligence and goes far beyond the simple recovery function that data protection provides. Using tools that archiving provides, knowledge users can easily search years of historical information and benefit from the business information contained within.
Information Life-Cycle Management
Content Archiving recognizes that electronic information must be managed over its entire life-cycle. Information that was recently created has different needs and requirements than the same information years later, and should be managed accordingly. Three distinct phases exist for the management of electronic information: the recovery phase, the discovery phase and the compliance phase (see figure). It is the strategic purpose of Content Archiving to manage electronic information throughout the entire life-cycle, recognizing the value of information in the short term for production and in the long term as a record of business, while continually driving information storage levels down to reduce storage costs and preserving access to information.
During the recovery phase, all production information requires equal protection and must be available for fast recovery should a logical error occur or a hardware failure strike the production servers. Continuous capture of information reduces the risk of losing information and supports fast disk-based recovery. The same information stores can be accessed by end users who want easy access to restore deleted files. Content Archiving supports the recovery phase by acting as a disk-based continuous data protection application. Compared to tape-based recovery, Content Archiving can restore information more quickly and with less loss of information. It captures all information in real time and keeps all electronic information on cost-efficient storage, where it is available for fast recovery and can also be easily accessed by end users and auditors for compliance and legal discovery. The length of the recovery phase varies according to individual needs, but is typically 6-12 months.
At a point in time, which varies by organization, the increasing volume of current and historical information puts an unmanageable strain on production servers. At the same time, the value of the historical electronic information decreases because it is no longer required for recovery. This is called the discovery phase. The challenge in the discovery phase is to reduce the volume of historical information while continuing to provide easy access to all information for audits and legal discovery. Content Archiving provides automated retention and disposition policies that are intelligent and can distinguish between current information and information that has been deleted by end users. Retention rules automatically dispose of information according to policies defined by the administrator. Further reduction is achieved by removing duplicates. For audits and legal discovery, Content Archiving keeps information in a secure, indexed archive and provides powerful search tools that give auditors quick access to all current and historical information. By avoiding backup tapes, searches of historical information can be performed quickly and reliably, thereby reducing legal discovery costs.
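To make the retention and de-duplication behaviour concrete, here is a small illustrative sketch; the policy shown (dispose of user-deleted items after a retention window) and the field names are assumptions of mine, not the product’s actual rules:

    import hashlib
    from datetime import datetime, timedelta

    class DiscoveryPhaseStore:
        """Toy illustration of discovery-phase housekeeping:
        content-hash de-duplication plus an administrator-defined retention rule."""

        def __init__(self, retention_days=365):
            self.retention = timedelta(days=retention_days)
            self.blobs = {}    # sha256 digest -> content; each unique payload stored once
            self.records = []  # metadata entries pointing at blobs

        def ingest(self, created: datetime, content: bytes, deleted_by_user: bool = False):
            digest = hashlib.sha256(content).hexdigest()
            self.blobs.setdefault(digest, content)  # duplicates share one stored blob
            self.records.append({"created": created, "sha": digest,
                                 "deleted_by_user": deleted_by_user})

        def apply_retention(self, now: datetime):
            """Dispose of items the policy no longer requires; keep everything else searchable."""
            self.records = [r for r in self.records
                            if not (r["deleted_by_user"] and now - r["created"] > self.retention)]
            live = {r["sha"] for r in self.records}
            self.blobs = {sha: blob for sha, blob in self.blobs.items() if sha in live}

    # Usage: two copies of the same attachment are stored once; the old, user-deleted copy
    # is disposed of after the retention window, while the other is kept for discovery.
    store = DiscoveryPhaseStore(retention_days=365)
    store.ingest(datetime(2007, 1, 1), b"quarterly report", deleted_by_user=True)
    store.ingest(datetime(2008, 5, 1), b"quarterly report")
    store.apply_retention(now=datetime(2008, 7, 1))
    print(len(store.records), len(store.blobs))  # 1 record, 1 unique blob retained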
Following the discovery phase, electronic information must be managed and preserved according to industry rules for records retention. This phase is called the compliance phase. Depending on the content, information may be required to be archived indefinitely. Storing information long-term is a technical challenge and costly if not done correctly. Content Archiving addresses the challenges of the compliance phase in two ways. First, Content Archiving provides tools which allow in-house experts, who know best what information is a record of business, to preserve information. Discovery tools enable auditors and legal counsel to flag electronic information as a business record or disposable. Second, Content Archiving manages electronic information in dedicated file containers. File containers are designed for long-term retention on tiered storage (e.g. tape, optical) for economic reasons and have self-contained indexes for reliable long-term access of information.
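One way to picture the “file container with a self-contained index” described above is an archive file that carries its own manifest, so it can be read years later without any external database. The format sketched below (a zip with an index.json inside) is purely an illustrative assumption of mine, not Mimosa’s on-disk layout:

    import json
    import zipfile

    def write_container(path, items):
        """items: list of (item_id, created_iso, payload_bytes). The index travels inside the container."""
        index = []
        with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as zf:
            for item_id, created_iso, payload in items:
                member = f"data/{item_id}"
                zf.writestr(member, payload)
                index.append({"id": item_id, "created": created_iso,
                              "member": member, "size": len(payload)})
            zf.writestr("index.json", json.dumps(index, indent=2))

    def read_index(path):
        """Years later: read only the embedded index to locate records, with no external catalog needed."""
        with zipfile.ZipFile(path) as zf:
            return json.loads(zf.read("index.json"))

    # Usage
    write_container("archive-2008Q2.zip",
                    [("msg-1", "2008-06-01T10:00:00", b"Q2 revenue forecast attached")])
    print(read_index("archive-2008Q2.zip"))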
Conclusion
The explosive growth in user-generated electronic information has been a powerful boost to knowledge worker productivity but has created many challenges for enterprises. IT staff are challenged to manage the rapidly growing information stores while keeping applications running smoothly. Compliance and legal staff are challenged to respond to regulatory audits and litigation requests by searching and accessing electronic information quickly. Content Archiving is a new class of enterprise application designed to manage unstructured electronic information over its entire life-cycle. Adhering to architectural design rules that ensure no interruption to the source application, secure access and scalability, Content Archiving manages information from the moment of creation, through the recovery phase and the discovery phase, to the compliance phase where information is preserved as a long-term business record. Content Archiving provides IT staff, end users, and compliance and legal staff with the business intelligence tools they require to manage unstructured information economically while meeting demands for quick, secure access and legal and regulatory preservation.
About the Authors
Bob Spurzem
Director International Business
Mimosa Systems
Bob has 20+ years of experience in high-technology product development and marketing and is currently Director of International Business with Mimosa Systems Inc. With significant experience throughout the product life cycle, from market requirements and competitive research through positioning, sales collateral development and product launch, he has a strong focus on bringing new products to market. Prior to this, he worked as a Senior Product Marketing Manager at Legato Systems and Veritas Software. Bob has an MBA from Santa Clara University and a Master’s degree in Biomedical Engineering from Northwestern University.
T. M. Ravi
Co-founder, President, and CEO
Mimosa Systems
T. M. Ravi has had a long career with broad experience in enterprise management and storage. Before Mimosa Systems, Ravi was founder and CEO of Peakstone Corporation, a venture-financed startup providing performance management solutions for Fortune 500 companies. Previously, Ravi was vice president of marketing at Computer Associates (CA). At Computer Associates, Ravi was responsible for the core line of CA enterprise management products, including CA Unicenter, as well as the areas of application, systems and network management; software distribution; and help desk, security, and storage management. He joined CA through the $1.2 billion acquisition of Cheyenne Software, the market leader in storage management and antivirus solutions. At Cheyenne Software, Ravi was the vice president responsible for managing the company’s successful Windows NT business with products such as ARCserve backup and InocuLAN antivirus. Prior to Cheyenne, Ravi founded and was CEO of Media Blitz, a provider of Windows NT storage solutions that was acquired by Cheyenne Software. Earlier in his career, Ravi worked in Hewlett-Packard’s Information Architecture Group, where he did product planning for client/server and storage solutions.
Yesterday, I live-blogged Day 1 of this conference. That was more about the speeches given by dignitaries. Today I am attending one session, and it promises to feature more technical talks.
To refresh your memory: this is Hi-Tech Pune Maharashtra 2008, organized by the Suresh Kalmadi-backed Pune Vyaspeeth. This is the 5th installment of the conference, and in addition to IT, the focus this time is on Bio-Technology and Animation. The conference is spread over three days (18th June to 20th June) and there is a fairly interesting schedule of presentations by a diverse set of speakers.
I am live-blogging this conference so, 1) refresh on a regular basis if you’re reading this on Wednesday evening (Pune time), and 2) please excuse the terse and ungrammatical language.
I missed the morning sessions. There were two sessions on innovation (which I’m glad to have missed – I am bored of talks on innovation), one on biotech, and one that sounded very interesting, because it presented case studies of animation work (“Golden Compass”, “Tare Zameen Par”, “Little Krishna”) done out of Pune.
First up is P.S. Narayan, Head Sustainability Practice, Wipro, talking about “Does Green make business sense?” While a lot of the talk was general Al Gore-ish “you should help the environment” lecturing, there were a few points that I found interesting.
He is making the point that green companies perform better. There are examples of businesses that focused on energy savings and managed not just to reduce energy costs, but also to improve on a bunch of other measures. He also showed that green companies do better on the stock market. I’m not sure whether this is just correlation or whether there is some causation involved. (I mean, it is possible that companies that think about going green are also the ones smart enough to reduce their costs, and the ones not going green are generally the companies that are not well run.)
What is Green IT? It’s not just designing your systems to consume less power. It is also about software solutions that reduce the energy consumed in other parts of your company (e.g. did you think about re-designing your supply chain to minimize energy consumption?). It also covers things like green accounting: if your accounts department kept track of energy usage in addition to simply dollars spent, that would reduce consumption. Currently, most people don’t even know the details of their consumption.
Next up is Dr. S. Ramakrishnan, Director General, C-DAC, with a talk entitled “From Innovation to Deployment: Case Studies from C-DAC”. In their Language Computing initiative they have designed more than 3000 TrueType and Unicode fonts for Indian languages. In Speech Technologies, they not only have to worry about speech-to-text for Indian languages, but also speech-to-text for Indian English! C-DAC’s ATCS (Area Traffic Control System) brings advanced concepts in traffic control to Indian conditions. It uses vehicle detectors to optimize traffic signals. These kinds of signals are present in only two places in India – Delhi (63 signals, imported from the UK) and Pune (34 signals, developed by C-DAC). The signals are controlled from a central location using wireless communication – which is really good because it reduces road digging. (Anyone driving around Pune these days will know how big a deal this is.) There is also a telemedicine project, but he did not get time to go into that.
Dr. Anupam Saraph, CIO of Pune, is making a case for strong IT in government in Pune: to allow growth faster than the 7% we are currently experiencing (it should be double digits), but also to ensure that we do not run into the problems that growth will cause when it happens. After the initial pitch, he jumps ahead to his vision for Pune in 2015, and then follows it up with the specific projects he has initiated. He mentioned that this is a partnership between government and businesses – it is sustainable when someone is making money off these cool services. He also mentioned the Design for Pune competition (which I am working on) and PuneTech. Cool.
In the plenary session, Rohit Srivastwa (head of IT for the Commonwealth Games, and of Airtight Networks) gave a talk on why information security is so important these days. He talked about ClubHack (an online community for bringing security awareness to common people). He pointed out to Anupam Saraph that some government websites had security loopholes. This led to a nice back-and-forth between the two of them about the need to balance security against the use of new technology – a refreshing change from the blandness that afflicts other presentations. But while the session was interesting, I was not entirely sure why it was a plenary session instead of a presentation in one of the regular sessions.
Vijay Kumar Gautam, COO, Commonwealth Games, Delhi 2010, gave a brilliant speech about the use of IT in sports, and brought out very nicely the huge difficulties involved in managing the IT for a sports event. Imagine a company that has 50,000 employees, and 1 billion customers. The company is built from scratch in 3 years, and is operational for only 3 weeks. Unlike most other IT projects, the deadline does not slip – the dates are fixed and remain fixed. Unlike most other software products, you don’t get a chance to do a bugfix or a patch release. You don’t get a chance to tune your system based on experience in the field. You don’t have an alpha or a beta release. And now imagine 10,000 journalists following your every move and ready to report on every gaffe.
He gave some idea of the complexity of the whole set up – hardware, software, processes. I’d love to get my hands on his presentation, not sure where I can get it from. They are planning on using the Commonwealth Youth Games in Pune later this year as a Proof-of-Concept test ground for the system.
The most interesting thing he said was this: such games happen all the time. There are Olympics (summer/winter) every two years. There are Commonwealth or Asian games every two years. Take into account world championships and other events and you have games happening all the time. And it is very difficult to find people who have experience building IT systems for such requirements, so they charge astronomical rates. You should get into this business. That was the main thrust of his talk.
Yahoo! Finance reports that Pune’s Chitale Dairy has used VMWare’s virtualization infrastructure to consolidate their two data centers into one and save costs:
Chitale Dairy, which produces about 400,000 liters of milk per day as well as cream, butter and yogurt, faced operational challenges with 10 physical servers spread across two datacenters in a town 500 kilometers from the nearest city. In its remote location, the company found it expensive and challenging to source and retain qualified IT support staff while also grappling with server sprawl.
By consolidating its two physical operations into one virtual datacenter using VMware Infrastructure, Chitale Dairy reduced server hardware acquisition costs by 50 percent, reduced software acquisition costs by 75 percent, and cut power consumption in half. VMware also reduced server deployment times from three weeks to three hours and the time to restore a corrupted server from six or seven hours to 10 minutes.