
WhatsApp

About this article:
WhatsApp is an instant messaging app (application software for phones). For many, this app is a cheap substitute for SMS text messaging and can be called a non-Blackberry version of the Blackberry Messenger. The app works across platforms and is easy to use. Readers may find this write-up informative.

I still remember the day when a close friend of mine kept pushing me to install this app on my phone, all the while trying to convince me that it was really worth a shot. I was a bit reluctant, for the simple reason that I would have to pay money to purchase the app. An unpalatable thought at the time, it would be a first for me (so far I have installed as many as 91 apps on my phone, and the ratio of paid vs. free apps is 2:89). My friend kept chiding me that the cost was negligible in comparison to the benefit, but I just couldn’t swallow the thought of paying for an app.

While you may say that I am tight-fisted, I would prefer the name frugal. But trust me, I am not the only one. If you are not convinced, check out the Maruti Suzuki ad in which a salesman is trying to sell a luxury yacht to a ‘rich man’. The scene begins with the salesman praising one feature after another . . . impressive, one would say . . . instead, the ‘rich man’ asks ‘kitna deti hai’ (meaning: how much mileage does it give), and then comes the tag line — “for a country obsessed with mileage, we produce the most fuel-efficient cars”. Just like the ‘rich man’ in the ad, there are countless cell phone users (many of whom own pretty fancy smartphones) who find SMSing an expensive mode of communication. And they should: after all, when you can speak to one another for as low as 1p per second, why pay such a high price for a lowly SMS, more so when you know that the phone companies are making a fast buck on it? Well, now you have an alternative — WhatsApp.

WhatsApp provides an alternative texting service that closely resembles standard SMS text messaging. Simply put, WhatsApp Messenger is a smartphone-to-smartphone messenger. I guess this is where I take on the role of the salesman trying to sell you the yacht (don’t worry, your time will come and you can ask kitna deti hai). Here are a few reasons why you should install and use this app:

  • This app works on the iPhone (iOS), Blackberry (Blackberry OS), Nokia (Symbian) as well as Samsung (Android). Arguably, that’s much better than Blackberry Messenger (‘BBM’), which is limited to Blackberry devices.

  • Unlike standard
    text messaging, though, you can set a status message which other
    WhatsApp users can see, both in the Favourites page and in the main
    contact list.

  • And not only can you send photos, but you can also
    attach audio and video notes, and even your geographic location to
    WhatsApp messages. Plus, it provides an easy way to save your message
    history as a text file (see pic).

  • You could send a million
    messages, but pay a pittance. The messages can be sent to friends and
    family across the world (just like BBM) for the same cost.

  • BBM requires you to know your friends’ PINs; well, you can say goodbye to that now. Once you and your friends have installed WhatsApp, you don’t need anything else. This is actually one of the best parts — WhatsApp almost automatically identifies which of your phonebook contacts have installed WhatsApp and lets you chat with them instantly. In fact, they will automatically appear in your Favourites.

  • WhatsApp gives you the option to stay always on/always connected with your buddies. If you choose to go offline, don’t worry: the messages will be stored on the server and pushed to your phone as soon as you log on.

  • Messages
    are usually received very quickly and notifications appear via push,
    which you can configure in the phone’s settings if you want.

  • Like BBM, it allows you to form groups (of up to 10 people) where you can share messages with a group of friends.

  • Overall,
    WhatsApp Messenger is a huge benefit to the iPhone community and to
    smartphone users in general, because it lets you keep the text messages
    flowing to your friends for free . . . . . . arguably, for the same
    price that they cost cell phone providers to deliver. Come to think of
    it, you have nothing to lose but your expensive texting plans.

Well!!!!! Now’s the part where you ask kitna deti hai?

To begin with, it will cost you US $1 (for the iPhone, that is; for the BB, Nokia and Samsung you can use it free for one full year).

Unlike
standard SMS messaging, WhatsApp uses your phone’s data plan to send
and receive messages. So if you use the app a lot, then your data usage
will increase. (You can monitor these stats from within the app).
Similarly, if you travel outside of your phone carrier’s supported area,
it’s possible that you’ll incur data roaming charges if you leave that
option enabled. Staying attached to a Wi-Fi connection should alleviate
most of those concerns (but as a side effect, constant pinging to the
Wi-Fi network will drain your battery power very fast).

For
those of you who are extra security-conscious, you might be concerned
that your phone number is known to the app’s developer and that all
messages go through its servers. The privacy page on the WhatsApp
Website states that the company will Do No Evil with your data and the
developer lets you know that messages are stored on its system only
until they have been retrieved, at which point they are deleted.
WhatsApp has also confirmed that its text messages, like most e-mail messages, are sent across the Internet unencrypted (contact data is encrypted, however). That’s not necessarily a problem; just something certain types of users may need to be aware of.

The only other limitation is the requirement that your friends also have the WhatsApp Messenger app installed on their phones. If you’re the early adopter within your circle and none of your friends have downloaded the app yet, then you’re not going to have anyone to talk with. Luckily, the app makes it easy to invite your friends to download it, either by sending them an e-mail or a standard text message.

If you liked what you’ve read above and want to try this app, you can visit your platform’s app store (iTunes, Blackberry App World, Nokia Ovi Store or Android Market) and download the software.
The whole process is fairly simple. The app walks you through the quick
set-up process the first time you open it. You register your phone
number with the WhatsApp service. It verifies your identity by sending a
code (ironically, via a standard text message) that you then enter into
the set-up screen. After that, the app asks for permission to look
through your address book for contact numbers that are already
registered with WhatsApp and then places them into your list of
Favourites. Then you’re finished and ready to start texting with your
friends. Once you and your friends have gone through this short
procedure, texting via WhatsApp Messenger is similar to standard SMS
messaging . . . . only much cheaper.

I would love to hear about your experience after using this software. You can send your emails to sam.client@gmail.com

Disclaimer:
This write-up is not intended to promote or malign any particular product, feature or company. Further, the write-up should not be considered an endorsement of any one product over another. The sole purpose of this write-up is to share knowledge and user experience.


The basics of cloud computing Part 2

About this article:
The previous write-up on this topic was intended to be an eye-opener on the subject. This one briefly discusses certain important aspects of cloud computing, including key terminology and some of the offerings.

Background:
Cloud computing, as explained in the previous issue, is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. By providing on-demand access to a shared pool of computing resources in a self-service, dynamically scaled and metered manner, cloud computing offers compelling advantages in speed, agility and efficiency.

Moving on, one needs to appreciate that cloud computing is currently at an early stage of its life-cycle, and that cloud computing as we know it is the evolution and convergence of several trends. In order to benefit from this fast-evolving model, one needs to understand certain important aspects and key terminology used in the context of cloud computing.

Commonly used models of cloud computing:
The first in the order of things is for the readers to understand the different (common) cloud computing models available in the market. The models currently in vogue are:

  • Private clouds
  • Public clouds
  • Community clouds
  • Hybrid clouds

Private clouds:
These refer to clouds for the exclusive use of a single organisation. Such clouds are typically controlled, managed and hosted in private data centers. However, this is not a hard and fast rule; there are exceptions wherein the private cloud is for the exclusive use of one organisation, but the hosting and operation of the cloud is outsourced to a third-party service provider.

Public clouds:
These refer to clouds which are leased out for use by multiple organisations (tenants) on a shared basis. These clouds are hosted and managed by a third-party service provider. They are fairly common and serve small and medium enterprises. Examples would be Microsoft Office 365 and Google Docs.

Community clouds:
These refer to clouds for use by a group of related organisations who wish to make use of a common cloud computing environment. For example, a community might consist of the different branches of the military, all the universities in a given region, or all the suppliers to a large manufacturer. To cite an example: the Large Hadron Collider. (Look this up on the Internet; you may find the facts and dynamics hard to believe.)

Hybrid clouds:
These refer to situations when a single organisation adopts both private and public clouds for a single application, in order to take advantage of the benefits of both. For example, in a ‘cloudbursting’ scenario, an organisation might run the steady-state workload of an application on a private cloud, but when a spike in workload occurs, such as at the end of the financial quarter or during the holiday season, they can burst out to use computing capacity from a public cloud, then return those resources to the public pool when they are no longer needed. (Somebody please wake up the Tax Department, please use this on due dates.)

Of the above, private clouds and public clouds are the most commonly seen and implemented.

Advantages:
While advantages such as efficiency, availability, scalability and fast deployment are common to both public and private clouds, certain advantages are unique to one or the other. Some benefits are unique to public cloud computing:

  • Low upfront costs — Public clouds are faster and cheaper to get started with, giving users the advantage of a low cost barrier to entry. There is no need to procure, install and configure hardware.
  • Economies of scale — Large public clouds enjoy economies of scale in terms of equipment purchasing power and management efficiencies, and some may pass a portion of the savings onto customers.
  • Simpler to manage — Public clouds do not require IT to manage and administer, update, patch, etc. Users rely on the public cloud service provider instead of the IT department.
  • Operating expense — Public clouds are paid for out of the operating expense budget, oftentimes by the users’ line of business rather than the IT department. Capital expense is avoided, which can be an advantage in some organisations.


Some benefits are unique to private cloud computing:

  • Greater control of security, compliance and quality of service — Private clouds enable IT to maintain control of security (prevent data loss, protect privacy), compliance (data handling policies, data retention, audit, regulations governing data location), and quality of service (since private clouds can optimise networks in ways that public clouds do not allow).
  • Easier integration — Applications running in private clouds are easier to integrate with other in-house applications, such as identity management systems.
  • Lower total costs — Private clouds may be cheaper over the long term compared to public clouds, since it is essentially owning versus renting. According to several analyses, the breakeven period is between two and three years.
  • Capital expense and operating expense — Private clouds are funded by a combination of capital expense (with depreciation) and operating expense.
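The owning-versus-renting arithmetic behind that two-to-three-year breakeven can be sketched in a few lines. The figures below are purely hypothetical, chosen only to illustrate the calculation, and are not drawn from any actual analysis:

```python
# Hypothetical rent-vs-own comparison for private vs public cloud capacity.
capex = 120_000          # one-time private-cloud outlay (hardware, licences)
private_opex_pm = 2_000  # monthly running cost of the private cloud
public_fee_pm = 7_000    # monthly public-cloud fee for equivalent capacity

# Breakeven month m solves: capex + m * private_opex_pm = m * public_fee_pm
months = capex / (public_fee_pm - private_opex_pm)
print(f"Breakeven after {months:.0f} months (~{months / 12:.1f} years)")
```

With these assumed numbers, the private cloud pays for itself in 24 months, i.e., two years, consistent with the breakeven range cited above; with different cost assumptions, the same formula gives a different answer.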

Summarising:
To recap, cloud computing is characterised by real, new capabilities such as self-service, auto-scaling and chargeback, but is also based on many established technologies such as grid computing, virtualisation, SOA shared services and large-scale, systems management automation. The top two benefits of cloud computing are speed and cost. Through self-service access to an available pool of computing resources, users can be up and running in minutes instead of weeks or months. Making adjustments to computing capacity is also fast, thanks to elastically scalable grid architecture. And because cloud computing is pay-per-use, operates at high scale and is highly automated, the cost and efficiency of cloud computing is very compelling as well.

In the next write-up:
While cloud computing offers compelling benefits in terms of speed and cost, clouds also present serious concerns around security, compliance, quality of service and fit. There are a number of issues and concerns that are holding some organisations back from rushing to the cloud. The top concern, far and away, is security. While one can debate the relative security of public clouds versus in-house data centers, the bottom-line is that many organisations are not comfortable entrusting certain sensitive data to public clouds where they do not have full visibility and full control. So some particularly sensitive applications will remain in-house while others may take advantage of public clouds. Another concern is quality of service, since clouds may not be able to fully guarantee service-level agreements in terms of performance and availability. A third area of concern is fit: the ability to integrate with in-house systems and adapt SaaS applications to the organisation’s business processes. Organisations are likely to adopt a mix of public and private clouds. Some applications will be appropriate for public clouds, while others will stay in private clouds, and some will not use either.

Until the next write-up . . . . Cheers!!!!!!


Tips and tricks — Securing your systems quick and easy

Introduction

Computers and computer networks are usually the heart and mind of any computing ecosystem, whether at your office or at your home. Generally, one tends to attach a lot more significance to the business ecosystem than to the one at home, the common excuse being cost-vs-benefit analysis. Often the argument forwarded is that the data in the office is sensitive and therefore needs to be secured. This argument ignores the fact that the data at home is far more personal, and any compromise there may well turn out to be a fatal error.

This article aims to give some quick, easy, do-it-yourself tricks for securing your computer, wireless networks and your phone.

For those of you who missed it . . . last month the BCAS had organised a free lecture meeting on ethical hacking. The speaker was Master Shantanu Gawade: a master not only because of the knowledge he possesses on the subjects of hacking, computer programming, etc., but also because he is a tender 14 years of age. Shantanu’s presentation evoked mixed reactions of shock and awe. Most of the members present were shocked by the potential threats that they had inadvertently exposed themselves to, and in awe of the skills and knowledge displayed by a precocious boy of 14. Those who were able to comprehend the dangers that lay ahead asked: how do we deal with this menace, how do we insulate ourselves? Shantanu was candid enough to say that there are no silver bullets for this problem and that prevention is one of the best answers.

While it would be difficult to address every single issue, there are a few ‘do-it-yourself’ steps that you can take to reduce the threats. This write-up summarises the steps you can take

  •  to check whether you have left your Wi-Fi network unsecured; and
  •  to secure your Wi-Fi network.

Those of you who were present during Shantanu’s presentation would instantly agree that the above would be a good starting point.

How safe is your Wi-Fi network:

A Wi-Fi network provides several advantages (no wires and no ugly holes in your wall are just two of them). A Wi-Fi network allows a user to access the network without being tied to one particular spot. In other words, the user has the convenience of moving from his desk to another desk or a conference room, etc. (at home, from the living room to any other room) and still being able to access the Internet or the server. Wi-Fi signals can travel within the periphery (i.e., 360° around the router/access point) up to a particular range. You may say “it’s a huge convenience”, and your neighbour might say “a huge convenience to me also”.

An unsecured connection allows neighbours and strangers access to your Internet connection and possibly your home network. They could stream video over your connection, slowing down your own Internet access. If they have the skills, they may be able to search your hard drive for bank account numbers and other sensitive information. Even worse, they could do something illegal, such as hack critical infrastructure or download pornography, and make it look to the police as if you’re the guilty party. (You may recall that the cybercrime cell had traced some terror e-mails to the houses of gullible citizens with unsecured networks, exploited by trouble-makers.)

So how do you protect yourself from such threats? While switching off the network may be the easiest way, the proper solution would be to use WPA2 security. WPA2 offers considerably more than the older standards, WEP and WPA, both of which can be cracked in minutes. WPA2 can also be cracked, but if you set it up properly, cracking it will take more of the criminal’s time than anything on your network is worth. Unless, of course, hacking networks is the criminal’s bread and butter, the sole purpose of his existence.

Locking down your Wi-Fi network

Step 1 in this direction would be to check your router’s menus or manual to find out how to set up WPA2 protection. Once you have activated the setting, the next step would be to lock it down with a secure password.

If Step 1 fails, then to get started you’ll need to log in to your router’s administrative console by typing the router’s IP address into your web browser’s address bar. Most routers use a common address like 192.168.1.1, but alternatives like 192.168.0.1 and 192.168.2.1 are also common. Check the manual that came with your router to determine the correct IP address; if you’ve lost your manual, you can usually find the appropriate IP address on the manufacturer’s website. Once you have found the appropriate IP address, first change the default password. Generally the default password is ‘admin’ or something similar provided by the manufacturer. Retaining the default password is very risky, because there are public databases containing default login credentials for hundreds of networking equipment vendors, and there is a high probability that a hacker has already accessed them.

Though no password is foolproof, you can build a better password by combining numbers and letters into a complex and unique string. It is also important to change both your Wi-Fi password (the string that guests enter to access your network) and your router administrator password (the one you enter to log in to the administration console — the two may sometimes be the same) at regular intervals.
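One way to build such a complex, unique string is to generate it rather than invent it. Here is a minimal Python sketch (the function name and the 16-character length are my own illustrative choices, not part of any router’s tooling), using the cryptographically secure secrets module rather than the predictable random module:

```python
import secrets
import string

def make_password(length=16):
    """Build a random password mixing letters and digits.

    Uses the cryptographically secure `secrets` module, and keeps
    drawing until the result contains at least one letter and one
    digit, matching the letters-plus-numbers advice above.
    """
    alphabet = string.ascii_letters + string.digits
    while True:
        pwd = "".join(secrets.choice(alphabet) for _ in range(length))
        if any(c.isdigit() for c in pwd) and any(c.isalpha() for c in pwd):
            return pwd

print(make_password())
```

Generate one string for the Wi-Fi password and a separate one for the router’s administrator password, and store both somewhere safe.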

Step 2 is to change the Service Set ID (‘SSID’):

Every wireless network has a name, known as a Service Set ID (or SSID). The simple act of changing that name discourages serial hackers from targeting you, because wireless networks with default names like ‘linksys’ are likelier to lack custom passwords or encryption, and thus tend to attract opportunistic hackers. Don’t bother disabling SSID broadcasting; you might be able to ward off casual Wi-Fi leeches that way, but any hacker with a wireless spectrum scanner can find your SSID by listening in as your devices communicate with your router.

Step 3 is to enable WPA2 security:

If possible, always encrypt your network traffic using WPA2 encryption, which offers better security than the older WEP and WPA technologies. If you have to choose between multiple versions of WPA2 — such as WPA2 Personal and WPA2 Enterprise — always pick the setting most appropriate for your network. (Unless you’re setting up a large-scale business network with a RADIUS server, you’ll want to stick with WPA2 Personal encryption.)

Step 4 is to enable MAC filtering:

Every device that accesses the Internet has a Media Access Control (‘MAC’) address, which is a unique identifier composed of six pairs of alphanumeric characters. You can limit your network to accept only specific devices by turning on MAC filtering, which is also a great tip for optimising your wireless network. To determine the MAC address of any Windows PC, do the following (running ipconfig will display your current network configuration):

  •  open a command prompt: select Run from the Start menu, type cmd and press Enter (Windows 7 users can just type cmd in the Start Menu search box).
  •  Next, at the command prompt, type ipconfig/all and press Enter to bring up your IP settings. If you’re using Mac OS X, open System Preferences and click Network.
  •  From there, select Wi-Fi from the list in the left-hand column (or Airport in Snow Leopard or earlier), click Advanced . . . in the lower left, and look for ‘Airport ID’ or ‘Wi-Fi ID’.
  • If you need to find the MAC address of a relatively limited device such as a printer or smartphone, check the item’s manual to determine where that data is listed.
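If you prefer not to wade through ipconfig output, the MAC address can also be read programmatically. A small Python sketch (the function name is my own; note that Python’s uuid.getnode() falls back to a random 48-bit number on systems where the hardware address cannot be determined):

```python
import uuid

def local_mac_address():
    """Return this machine's MAC address as six colon-separated
    pairs of hex characters, e.g. 'a4:5e:60:d0:12:34'.

    uuid.getnode() yields the hardware address as a 48-bit integer;
    the shifts below peel off one byte at a time, high byte first.
    """
    node = uuid.getnode()
    return ":".join(f"{(node >> shift) & 0xFF:02x}" for shift in range(40, -1, -8))

print(local_mac_address())
```

The same formatted value is what you would enter into the router’s MAC-filtering list for this machine.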

Thankfully, most modern routers display a list of devices connected to your network along with their MAC addresses in the administrator console, making it easier to identify your devices. If in doubt, refer to your router’s documentation for specific instructions.


Step 5 is to limit DHCP leases to your devices:

Dynamic Host Configuration Protocol (DHCP) makes it easy for your network to manage how many devices can connect to your Wi-Fi network at any given time, by limiting the number of IP addresses your router can assign to devices on your network. Tally how many Wi-Fi-capable devices you have in your home; then find the DHCP settings page in your router administrator console, and update the number of ‘client leases’ available to the number of devices you own, plus one for guests. Reset your router, and you’re good to go.

Step 6 is to block WAN requests:

This is the last step. Enable the Block WAN Requests option, to conceal your network from other Internet users. With this feature enabled, your router will not respond to IP requests by remote users, preventing them from gleaning potentially useful information about your network. The WAN is basically the Internet at large, and you want to block random people out there from initiating a conversation with your router.

Once you’ve taken these steps to secure your wireless network, lock it down for good by disabling remote administration privileges through the administrator console. That forces anyone looking to modify your network settings to plug a PC directly into the wireless router, making it nearly impossible for hackers to mess with your settings and hijack your network. In case you find the above steps difficult to follow, please take the services of a professional and get it done before it’s too late.

Hope you have a safe computing experience. Cheers!

The basics of cloud computing

Part 3

About this article:
In the previous write-up on this topic we discussed certain important aspects of cloud computing, including key terminology and some offerings. This write-up focusses on certain issues which require consideration before one decides to opt for cloud services.

Background:
Cloud computing basically refers to providing the means through which everything, from computing power and computing infrastructure to applications, business processes and personal collaboration, can be delivered to you as a service wherever and whenever you need it. This model is fast emerging as the choice of several large and small businesses. The choice is quite natural considering the (assured) cost savings. Such savings can be either in the form of lower capital expenditure on hardware, software licences and infrastructure, or in the form of lower operational expenditure, i.e., operation and maintenance expenses, reduced idle time, downtime, etc. (refer to the write-up titled Cloud computing basics — part I published in the BCAJ April 2011 issue).

Businesses, large and small, have several options between private, public and hybrid clouds (refer to the write-up titled Cloud computing basics — part II published in the BCAJ May 2011 issue). Having several options can itself sometimes become a hurdle while making strategic investments. Cloud computing also brings to the fore certain unique concerns, concerns which are more significant from the ‘enterprise’ point of view.

Primary objective of moving to the cloud:
As organisations evaluate how cloud computing can deliver such advantages, they are faced with numerous choices. While moving to the cloud has definite advantages, such as improving business agility, reducing management complexity and controlling costs, one needs to appreciate that simply moving towards a service-oriented cloud computing model does not automatically deliver benefits. To derive maximum benefit and Return on Investment (ROI), cloud computing needs to be considered as part of a larger move towards more effective management and integration. Needless to say, inadequate planning and half-baked cloud computing solutions may add complexities rather than reduce them.

Some myths and some clarifications:
While there are several myths and misconceptions associated with the topic of cloud computing, the ones that the readers of this journal are more likely to identify with are:

Myth 1:

Data security: Will the cloud service provider guarantee security?

One common concern among businesses looking to move to cloud computing is data security. Primarily, moving to the cloud entails parking your data with the service provider, and this can be a discomforting thought. The very possibility of a threat to the confidentiality and security of the data is the source of discomfort.

From a practical standpoint, public cloud datacentres are amongst the most secure premises on the planet. Yet, at the logical level, a cloud provider with every security certification still can’t guarantee the integrity of specific servers, applications, and networks, if your applications are poorly written, set up and secured. Similarly, all the security practices of a cloud provider are meaningless if a customer organisation’s security practices are weak.

The key take-away here is that there are several layers of security to protect your data, but there is always a possibility of chinks in the armour.

Myth 2:

Data control: My organisation will be locked into one vendor and lose control of its data, if it moves to the cloud:

Almost every organisation would acknowledge that businesses need to store shared files securely. This would assume more importance when the organisation is engaged in providing medical, legal and financial services. These organisations are subject to strict local laws.

If one believes that the best way to keep your data a secret is to manage it yourself, then the moot question would be why stash your precious data offsite?

It is relevant to point out that the essence of cloud services is ‘flexibility’. One application may call another on a different cloud service, and data may be stored anywhere, including your own network, but still be accessible to cloud applications. No cloud provider offers a service that completely takes control of your environment. The best cloud solutions will be a combination of on-premise and off-premise services.

The key take-away is that while the service provider may control the infrastructure, the data is not entirely in his control. Sure!!! he controls your access to your own data, but his primary interest is merely optimal utilisation of his resources (just like you).

Myth 3:

Cost savings: An organisation must move all its applications to a cloud service to be able to benefit fully from cloud computing:

Moving an entire datacenter to the cloud is a tall task. Practically, no cloud provider would recommend this, at least not in one go (if you ask me, you are inviting trouble). Ideally, one should adopt a step-by-step approach: start by identifying applications in your pipeline that can benefit your organisation by being in the cloud. Look for applications where resources are used intensely for a short period each month and then left idle for the rest of the time, or applications where a moderate level of resources is used continuously but which experience ‘periods’ of very high activity.

Such applications are ideal cloud candidates. This is so, because the cloud can scale up and down resources on demand. The cloud is built for flexible access to resources that can be allocated to other applications, or even other customers, when idle.

The key take-away is that one should do a cost benefit analysis of all activities undertaken and gauge the advantages/disadvantages of shifting to the cloud. A proper evaluation would also ensure that you minimise disruptions and costs associated thereto. Who knows, the sum of all parts may be greater than the whole.

Myth 4:

IT role changes: Do I still need an IT administrator?

The role of the Exchange Administrator does not become obsolete due to the cloud. There are still many tasks that remain on-premise. You still have to manage your users and their mailboxes. Industry-specific data retention compliance, as well as implementing custom workflows, is still your responsibility. While some tasks may no longer reside on-premise, managed cloud services free up your time to engage in more strategic roles, providing you with new opportunities.

Apart from patching those servers and physically maintaining them, all other aspects of managing applications remain in the IT administrator’s hands. Monitoring, updating, integration with services such as Active Directory, security and network monitoring — these tasks are still required within organisations utilising cloud services.

The key take-away is that the all-powerful IT administrator’s role is impacted, but it does not get diminished. The IT administrator’s role will evolve as the availability of compute cycles and networked storage increases — that is a given, just as the IT role has evolved in the past. The question IT administrators must ask themselves is: ‘Am I prepared to play a more strategic role in my organisation?’

Myth 5:

Getting started: All you need is your credit card to start cloud computing:

You can begin using cloud computing services with just a credit card. This is a good way to get experience with this new frontier of services; in fact, some of the basic services may be available free. Most cloud services provide an environment designed for getting started and developing applications.

It is important that one gets used to the concept and gains comfort. After this, one may evaluate the advantages/disadvantages of moving to the cloud.

The key take-away is take small, measured steps. Learn from experience before betting it all.

In summary:

There are pros and there are cons; there are also hyped stories of success and spiced-up stories of failure. Readers would do well to do their own research and allay their fears of this emerging service. While the service provider may promise you the moon, one should pare one’s expectations and make investments only upon realising measured benefits. In short, look before you leap!

Cyber warfare — the next level

About this write-up

This write-up is about a new type of worm/malware, which was in the news recently. The worm called Flamer attracted a lot of hype and media attention given the speculation regarding its likely impact. This write-up is an attempt to cull out some key takeaways for benefit of the readers.

Background

Cyberspace is no longer a benign place to surf. Viruses have been getting increasingly nasty and complex over the years. While worms were traditionally used by hackers and cybercriminals either to display their prowess or to steal information and money, it now appears that even nation-states are backing such attacks to target other countries — a trend popularly known as cyber espionage and cyber warfare.

Cyber warfare – the next level – Flamer worm

Circa 2010, news reports started appearing about a new type of worm, Stuxnet1. What was different about this worm was that it was the first of its kind: the level of complexity, its apparent motive and the intended victims were not the ‘usual’ businesses or gullible individuals. On the contrary, experts believed that this was a ‘first’ — a worm written by a sovereign nation with the sole purpose of disrupting infrastructure facilities in another territory. It was also a ‘first’ because the worm was no longer attacking just the zeros and ones (computer code); this time it was attacking the devices controlled by those zeros and ones, with a view to disrupting their functionality. There was a nagging feeling . . . the type you get when somebody really capable of doing nasty things says ‘I’ll be back’ (like Arnold Schwarzenegger in Terminator). It was (painfully) obvious that Stuxnet wasn’t the last word on the topic and that things were likely to heat up very soon. Coming back to the present day, that nagging feeling has become a reality — Stuxnet appeared in 2010, Duqu surfaced in 2011, and sometime around May 20122, security experts started issuing warnings about the ‘Flamer’ worm, a.k.a. W32.Flamer or sKyWIper.

Threat assessment

A senior analyst at a leading security firm, sharing his view on the subject, reveals that this is the most sophisticated threat he has ever seen. The same security firm had undertaken a detailed analysis of the ground-breaking Stuxnet virus, which ‘purportedly’ targeted Iran’s nuclear enrichment facilities two years ago, sending some of their centrifuges spinning out of control. The preliminary results shared by the senior analyst suggested that Flamer appeared to be even more complex than Stuxnet, and that it was an incredibly clever, comprehensive ‘spying programme’.

Grapevine reports suggest, “Flamer is a backdoor worm that goes looking for very specific information. It scrapes a mass of information from any infected machine and then sends it, without the user having any idea what is going on. The amount of information it can send is huge”.

Components identified3

A number of components of the threat have been retrieved and are currently being analysed. Several of the components have been written in such a way that they do not appear overtly malicious. Some of the components identified as malicious are:
• advnetcfg.ocx (0.6MB) (backdoor component)
• ccalc32.sys (RC4-encrypted config file)
• mssecmgr.sys (6MB) (main component: compression, Lua interpreter, SSH, SQL library)
• msglu32.ocx (1.6MB) (steals data from images and documents)
• boot32drv.sys (~1KB) (config file)
• nteps32.ocx (0.8MB) (performs screen capture)

This time it is different

The one thing that everyone is sure about is that Stuxnet, Duqu and Flamer are definitely in another class than your typical spyware or fake antivirus threat. Experts universally agree that this complex software required a coding team and could not be the work of a lone-wolf coder. The complexity of the task has led many to presume that only a nation-state would have the resources — just as has been speculated in the case of Stuxnet. It is interesting to note that, unlike Duqu, Stuxnet and Flamer have the ability to infect systems via USB key, thus allowing them entry into facilities that are isolated from the Internet. They also use the same printer-driver vulnerability to spread within the local network. While all three worms are similar in the sense that all three are seriously modular (i.e., built in a way that lets their command and control servers add or update functionality at any time), Flamer is definitely a step up.

  • Here is why: According to Kaspersky researchers, a Stuxnet infestation takes just 500KB of space; as against this, Flamer is an out-and-out giant at 20MB. Part of Flamer’s size comes from its use of many third-party code libraries — prefab modules that handle tasks like managing databases and interpreting script code. Neither Stuxnet nor Duqu relies on third-party modules.

  • Given its size, Flamer is smart enough to mask its download impact. It is downloaded in multiple sessions. This is done to avoid giving itself away. In this respect, it is far more intelligent than its predecessors.

  • Stuxnet and Duqu used stolen digital signatures to fool antivirus software. Unlike these, Flamer doesn’t use a digital signature. Instead, Flamer uses some unique techniques for self-protection, chief among them the ability to recognise over 100 antivirus installations and modify its behaviour accordingly. It uses five different encryption methods, three different compression techniques, at least five different file formats (including some proprietary ones) and special code injection techniques.

  • Although Flamer is not concealed by a rootkit, it uses a series of tricks to stay hidden and stealthily export stolen data. One of its most amazing capabilities is the creation of a file on the USB stick simply named ‘.’ (dot). Even if the short name for this file is HUB001.DAT, the long name is set to ‘.’, which is interpreted by Windows as the current directory. This makes the OS unable to read the contents of the file or even display it. A closer look inside the file reveals that it is encrypted with a substitution algorithm.
  • Flamer is definitely complex. In one of the earlier reports on this threat, a security expert noted that it has at least 20 modules, most of which are still being investigated. Another expert remarked that one of its smaller modules is over 70,000 lines of C decompiled code and contains over 170 encrypted strings. As for what it does, you might better ask what doesn’t it do. Just about any kind of espionage you can imagine is handled by one of Flamer’s modules.


  • Flamer has very advanced functionality to steal information and to propagate. Using this toolkit, multiple exploits and propagation methods can be freely configured by the attackers. Information gathering from a large network of infected computers was never crafted as carefully as has been done in Flamer.

  • Stuxnet relied on an unprecedented four zero-day attacks to penetrate systems and Duqu managed with just one zero-day attack. Flamer didn’t use any zero-day attacks.
  •     Stuxnet and Duqu infestations automatically self-destructed after a set time; Flamer can self-destruct, but only upon receiving the auto-destruct code from its masters.
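One of the bullets above mentions that the ‘.’-named file on the USB stick is encrypted with a substitution algorithm. As a toy illustration of that class of cipher (this is NOT Flamer’s actual scheme, whose details are not reproduced here), a byte-level substitution cipher is simply a fixed permutation of the 256 possible byte values:

```python
# Toy byte-substitution cipher, purely didactic -- not Flamer's
# actual algorithm.
import random

def make_table(seed=42):
    # A substitution cipher is just a fixed permutation of byte values.
    rng = random.Random(seed)
    table = list(range(256))
    rng.shuffle(table)
    return table

def substitute(data: bytes, table) -> bytes:
    # Replace every byte with its substitute from the table.
    return bytes(table[b] for b in data)

def invert(table):
    # Decryption uses the inverse permutation.
    inv = [0] * 256
    for plain, cipher in enumerate(table):
        inv[cipher] = plain
    return inv

table = make_table()
secret = substitute(b"HUB001.DAT", table)
assert substitute(secret, invert(table)) == b"HUB001.DAT"
```

This also hints at why such a scheme is easy for analysts to break once spotted: the mapping is fixed, so byte-frequency analysis of enough ciphertext recovers the table.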

It’s worth noting that Flamer doesn’t necessarily do any of the things described above, not even replicate to other systems, unless it’s told to do so by its Command and Control servers. This combined with the fact that it uses many standard commercial modules has helped it get past behaviour and reputation-based detection systems (i.e., our commonly used antivirus systems).

It’s a live program that communicates back to its master. It asks, where should I go? What should I do now?

Experts say that Flamer is most likely capable of using all of a computer’s functionalities for its goals. It covers all major possibilities for gathering intelligence, including the keyboard, screen, microphone, storage devices, network, Wi-Fi, Bluetooth, USB and system processes.

Simply put, once a system is infected, Flamer begins a complex set of operations, including sniffing network traffic, taking screenshots, recording audio conversations, intercepting the keyboard, and so on and so forth.

Sounds just like a cold war (fiction) scenario — where highly trained, deep-cover ‘sleeper’ agents were inserted deep inside enemy territory to attack the enemy from within. Takes me back to some of my favourite movies . . . Salt, Killers, The Impossible Spy . . .

Readers who are interested in more technical information may also look up the following:

  • http://www.symantec.com/security_respons/writeup.jsp?docid=2012-053007-0702-99&om_rssid=sr-mixed30days

  •     http://blogs.mcafee.com/mcafee-labs/jumping-in-to-the-flames-of-skywiper

  •     http://www.mcafee.com/threat-intelligence/malware/default.aspx?id=1195098

  •     http://www.f-secure.com/weblog/archives/00002371.html
  • http://www.kaspersky.com/about/news/virus/2012/Kaspersky_Lab_and_ITU_Research_Reveals_New_Advanced_Cyber_Threat

  •     http://www.mcafee.com/us/about/skywiper.aspx

  •     http://www.crysys.hu/skywiper/skywiper.pdf

It would be a cliché to say, that this is not the last we have heard about this worm or that cyber warfare is now gaining momentum and therefore expect to read and hear more on this topic.

 1.    Read ‘Cyber warfare — the next level’, BCAJ, October 2010

 2.    Unconfirmed reports suggest Flamer was first reported as early as 2007

 3.    Source: www.symantec.com

High-Frequency Trading

High-Frequency Trading (‘HFT’) has been around for many years now. In spite of this, very little is known about it. Since its beginning, people have either sung its praises or spoken of its dark side. The purpose of this article, however, is not to dwell on the merits or demerits of HFT. Instead, it is to depict how technology is used in this trade and the basic mechanics of HFT. The technical content has been kept to a bare minimum and logical/practical aspects have been highlighted wherever possible.

Background

Once upon a time, trading in stocks, securities, commodities, etc. was done on the ‘exchange floor’. Back then, ‘trading’ was a fairly straightforward affair. Buyers and sellers gathered on exchange floors and haggled with each other until they struck a deal. Those were the heady days of power, pressure and sentiment. However, trading on the exchange floor had its own limitations and the trading practices were plagued with malpractice.

In case you never had the chance to see or experience how trading took place in the olden days, check out these movies — English: Trading Places, Wall Street; Hindi: Guru.

By the mid-nineties, computers and technology had started gaining prominence. The ability of a computerised system to flawlessly execute transactions, match buy and sell orders, etc. was growing exponentially. Then, in 1998, the Securities and Exchange Commission authorised electronic exchanges to compete with marketplaces like the New York Stock Exchange. The basic intent was to open markets to anyone with a desktop computer and a fresh idea. This objective was largely achieved.

Apparently, (as per data published by NYSE and other public sources) between 2005 and 2009 the trading volume (on the NYSE) grew about 164%. News reports have credited HFT for a large part of this meteoric rise. As a matter of fact, there are some who say that in the United States (US), while high-frequency trading firms represent 2% of the approximately 20,000 firms operating, they account for 73% of all equity orders volume. Currently, it is estimated that HFT trades account for 56% of all equity order volumes in the US, 38% of trades in Europe and 5-10% of trades executed in Asia.

Making money out of thin air

HFT became most popular when exchanges began to offer incentives for companies to add liquidity to the market. For instance, some exchanges have a group of liquidity providers called supplemental liquidity providers (SLPs), who attempt to add competition and liquidity for existing quotes on the exchange. As an incentive, the exchange pays the firm a fee1 or rebate for providing the said liquidity. Rumour has it that the SLP programme was introduced following the collapse of Lehman Brothers in 2008, when liquidity was a major concern for investors.

High-frequency traders also benefit from competition among the various exchanges, which pay small fees that are often collected by the biggest and most active traders — typically a quarter of a cent per share to whoever arrives first. Those small payments, spread over millions of shares, help high-speed investors profit simply by trading enormous numbers of shares, even if they buy or sell at a modest loss.
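The arithmetic behind the paragraph above is worth making concrete. A minimal sketch, using the quarter-of-a-cent-per-share figure from the text; the trade sizes, the loss per share, and the assumption that the rebate is earned on both legs of the round trip are all illustrative:

```python
# Back-of-the-envelope rebate economics. The rebate rate comes from the
# text ("a quarter of a cent per share"); everything else is hypothetical.

REBATE_PER_SHARE = 0.0025  # a quarter of a cent

def round_trip_pnl(shares, loss_per_share):
    # Even selling at a modest loss can be profitable once the liquidity
    # rebate (assumed here to be earned on both the buy and the sell
    # leg) is counted in.
    rebate = 2 * shares * REBATE_PER_SHARE
    return rebate - shares * loss_per_share

# 1,000,000 shares round-tripped at a $0.001/share trading loss:
print(round(round_trip_pnl(1_000_000, 0.001), 2))  # 4000.0
```

Spread over millions of shares a day, tiny per-share rebates like this are precisely how a trader can profit while buying and selling ‘at a modest loss’.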

HFT made simple

HFT is a program trading platform that uses powerful computers to transact a large number of orders at very fast speeds. HFT uses complex algorithms2 to analyse multiple markets and execute orders based on market conditions. Typically, the traders with the fastest execution speeds will be more profitable than traders with slower execution speeds.

Powerful algorithms — ‘algos,’ in industry parlance — execute millions of orders a second and scan dozens of public and private market-places simultaneously. They can spot trends before other investors can blink, changing orders and strategies within milliseconds.

Basic mechanics

The mechanics of such systems, coupled with complex algorithms, are not standardised. Conceptually, the design may be broken down as follows:

  •     The data stream unit, i.e., the part of the system that receives data (e.g., quotes, news, etc.) from external sources;

  •     The decision or strategy unit

  •     The execution unit.

These systems are very intelligent and make use of social networks, employing scanning or screening technologies to read users’ posts and extract human sentiment which may influence trading strategies.
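The three-unit breakdown described above can be sketched as a simple pipeline. This is a minimal conceptual illustration; the class names, the toy strategy, and the threshold are all invented, not a real trading system:

```python
# Minimal sketch of the conceptual design: a data stream unit feeding a
# strategy (decision) unit, which hands orders to an execution unit.
# All names and the toy logic are illustrative only.

from dataclasses import dataclass

@dataclass
class Quote:
    symbol: str
    bid: float
    ask: float

class DataStreamUnit:
    """Receives data (quotes, news, etc.) from external sources."""
    def __init__(self, feed):
        self.feed = feed  # any iterable of Quote objects
    def stream(self):
        yield from self.feed

class StrategyUnit:
    """Decides what to do with each incoming tick (toy logic)."""
    def on_quote(self, q: Quote):
        spread = q.ask - q.bid
        if spread > 0.05:                 # arbitrary threshold
            return ("BUY", q.symbol, q.bid)
        return None

class ExecutionUnit:
    """Sends orders to the market (here, it just records them)."""
    def __init__(self):
        self.orders = []
    def send(self, order):
        self.orders.append(order)

feed = [Quote("XYZ", 99.90, 100.00), Quote("XYZ", 99.98, 100.00)]
data, strat, execu = DataStreamUnit(feed), StrategyUnit(), ExecutionUnit()
for q in data.stream():
    order = strat.on_quote(q)
    if order:
        execu.send(order)
print(execu.orders)
```

In a real system each unit would be a heavily optimised, low-latency component, but the division of labour is the same: ingest data, decide, execute.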

Characteristics of a HFT system

HFT can be characterised as under:

  •     It uses computerised algorithms to analyse incoming market data and implement trading strategies;

  •     HFT trading strategies are for investment horizons of less than one day. The primary game plan is to unwind all positions before the end of each trading day. An investment position is held only for very brief periods of time i.e., from seconds to hours. The system rapidly trades into and out of those positions, sometimes thousands or tens of thousands of times a day;

  •     At the end of a trading day there is no net investment position. Since they must finish the day flat, HFTs exhibit balanced bi-directional (i.e., ‘two-way’) flow. It is argued that due to this feature HFTs can’t accumulate large positions.

  •     HFTs can’t deploy large amounts of capital; in fact, HFTs have little need for outside capital or leverage, and tend to be proprietary traders. In theory, HFTs can’t ‘blow up’ (they don’t use much leverage and don’t have much capital, so they can’t lose much capital!);

  •     Generally employed by proprietary firms or on proprietary trading desks in larger, diversified firms;

  •     It is very sensitive to the processing speed of markets and of the trader’s own access to the market;

  •     Positions are taken in equities, options, futures, ETFs, currencies, and other financial instruments that can be traded electronically;

  •     High-frequency traders compete on a basis of speed with other high-frequency traders, not (supposedly) the long-term investors (who typically look for opportunities over a period of weeks, months, or years), and compete for very small, consistent profits;

  •     HFT is a very low-margin (low-risk, low-reward) activity;

  •     Theoretically speaking, HFT returns follow a Gaussian (normal) distribution. The logic is simple: large expected returns are rare and tiny expected returns are abundant;

  •     For the HFTs, opportunities are short-lived because they are very small and they are heavily competed for;

  •     The economics of HFT require the identification of large quantities of trading signals, which is highly technology-intensive. Success or failure is determined by the HFT’s speed, i.e., speed in capturing opportunities before they are accessed by competitors.

Standard HFT strategies

Most high-frequency trading strategies fall within one of the following trading strategies:

  •     Market making: involves placing a limit order to sell (or offer) and a buy limit order (or bid) in order to earn the bid-ask spread. By doing so, market makers provide the counterparty to incoming market orders;

  •     Ticker tape trading: much information happens to be unwittingly embedded in market data, such as quotes and volumes. By observing a flow of quotes, high-frequency trading machines are capable of extracting information that has not yet crossed the news screens;

  •     Event arbitrage: certain recurring events generate predictable short-term responses in a selected set of securities; HFTs take advantage of such predictability to generate short-term profits;

  •     High-frequency statistical arbitrage: this strategy requires the HFT to exploit predictable temporary deviations from stable statistical relationships among securities.
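The first strategy in the list above, market making, can be illustrated with a toy calculation. The prices, sizes and half-spread below are invented for illustration; real market makers also carry inventory risk that this sketch ignores:

```python
# Toy illustration of market making: quote both sides of the mid price
# and earn the bid-ask spread on a completed round trip. All numbers
# are hypothetical.

def make_quotes(mid, half_spread=0.01):
    bid = round(mid - half_spread, 2)   # limit order to buy
    ask = round(mid + half_spread, 2)   # limit order to sell
    return bid, ask

def round_trip_profit(mid, size, half_spread=0.01):
    bid, ask = make_quotes(mid, half_spread)
    # If both quotes get hit, we buy at our bid and sell at our ask,
    # capturing the full spread (inventory risk ignored here).
    return round((ask - bid) * size, 2)

bid, ask = make_quotes(50.00)
print(bid, ask)                          # 49.99 50.01
print(round_trip_profit(50.00, 10_000))  # 200.0
```

Two cents per share sounds trivial, but repeated thousands of times a day across many symbols it adds up, which is the ‘very small, consistent profits’ the text describes.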

HFT the dark side

High-frequency traders often confound other investors by issuing and then cancelling orders almost simultaneously. Loopholes in market rules give high-speed investors an early glance at how others are trading. And their computers can essentially bully slower investors into giving up profits — and then disappear before anyone even knows they were there.

HFT came into the spotlight about two years ago when a very large Wall Street firm sued one of its former employees for stealing code used in one of the programs that execute this type of trade. When the former employee (a programmer) was accused of stealing secret computer code/software — which, Government prosecutors said, could ‘manipulate markets in unfair ways’ — it only added to the mystery, because the Wall Street firm acknowledges that it profits from high-frequency trading but disputes that it has an unfair advantage.

It is rumoured that in May 2010 a flash crash took place in the Dow, in which several blue-chip companies lost a lot of their value in a matter of minutes; the New York Times reported that shares of big companies like P&G and Accenture traded at ridiculous prices like a penny or $100,000. The prices were later restored to more usual levels.

Even in India, the BSE cancelled all the futures trades executed in one of its trading sessions last year, and at least one initial report blamed an algo trader from Delhi for causing havoc with his trades.

In spite of the fact that HFT has been around for more than a decade, even today very little is known about HFT and algorithmic trading. Only recently have regulators like the SEC and SEBI started asking questions. In fact, interested readers may look up the recent guidelines issued by SEBI on this issue. SEBI’s endeavour is to contain the possibility of systemic risk caused by the use of sophisticated automated software by brokers.

There are several questions: how do these programs work, what are the triggers, is there a risk, and do these programs provide an undue/unfair advantage to the user? Only time will tell.

Disclaimer:

This article is only intended to create awareness about HFT. The contents of this article are based on various stories, articles, research papers, etc. currently available in the public domain. The purpose of this article is neither to promote, nor malign any person or a company mentioned in the article.

Microsoft Office 2013

About this write-up
MS Office is a popular application software suite and enjoys wide usage across the world. Recently, Microsoft released the Customer Preview of the latest version of its Office suite, Microsoft Office 2013 (a.k.a. Office 15). This write-up briefly discusses some of the new features proposed to be introduced in the new software, product enhancements to existing features, and some pros & cons associated therewith.

In my last write-up I had mentioned that developments and product announcements/launches were happening in such quick succession that hardly a day passes without a new product being launched. As a consequence, products are going out of fashion (in relative terms) almost immediately after launch.

When I was penning my previous write-up, I chose to write about the Flamer worm instead of the Samsung Galaxy S-III, iOS 6 or Microsoft Surface . . . don’t ask me why. Anyhow, I had ended that write-up with the note that the next one would be about the Samsung Galaxy S-III. In all honesty, I was all set to keep this commitment, when suddenly, out of the blue, I read about Microsoft’s latest. All of a sudden it felt like the Galaxy S-III had already become ‘old news’ and I had to write about the latest offering (announcement, for now) from Microsoft. And so . . . here we are.

Background

Microsoft Office 2013 (a.k.a. Office 15) is a productivity suite from Microsoft and is likely to succeed the hugely popular Microsoft Office 2010. A developmental version (build 2703.1000) was leaked in May 2011. Subsequently, in January 2012, Microsoft released a technical preview of Office 15 (build 3612.1010). Almost six months later, on 16 July 2012 (to be precise), Microsoft unveiled the Customer Preview.

In this write-up, I have tried to highlight some of the new features proposed to be introduced in the new software, product enhancements to existing features, and some pros & cons associated therewith.

What’s new in Office 15

While there are several features that one could describe, here are a few that I found exciting:

  • Cloud integration
  • Will respond to touch, stylus and the good ol’ keyboard
  • The new ‘Metro’ look
  • Edit PDFs in Word 2013
  • Will support Open Document Format (‘ODF’) 1.2
  • Sharing and embedding web elements like YouTube videos
  • Social media integration — Skype, Flickr
  • Enhancements in Excel, Word, Outlook, OneNote.

Some of the things that might not excite a few people:

  • Will have to upgrade from Windows XP/Vista
  • Get used to SkyDrive cloud storage.

Cloud integration

Cloud integration is fast becoming a de facto ‘must-have’ feature. Cloud storage has been around for quite some time (X drive types). Without getting into ‘who started it all’, Google’s Chrome OS was a serious attempt to move towards cloud integration. If you recall, Chrome OS was touted as one of the slimmest OSes because it required very little time to boot, and Google had famously said that there was no need to provide any apps within the OS because everything was on the internet — most people only boot their PCs to log on to the net, hence all the apps would be on the net. Last year, when Apple unveiled its latest offering, it also announced a new service, iCloud (5 GB storage). Gone are the days when you needed to synchronise your PCs at different locations or carry data on a portable drive or disc. With Office 15, Microsoft too has joined the gang. SkyDrive is the default storage location for all your files (effectively, SkyDrive is expected to take the place of the local C drive). Subscribers will be given 20GB of storage space.

With this version, Microsoft is moving to a subscription-based model wherein your Office files are tied to your Microsoft ID. Once you sign up, you can download the various desktop apps to a certain number of devices and, as with Windows 8, your settings, SkyDrive files and even the place where you left off in a document will follow you from device to device. Office 365, which is currently sold to businesses, will be available to home users as well.

In addition to receiving future Office upgrades automatically, subscribers will get additional SkyDrive storage, multiple installs for several users, and added perks such as international calls via Skype. You’ll also be able to stream Office apps to an Internet-connected Windows PC.

Responds to touch and stylus too

The preview page says “Office 15 will take you beyond the mouse and keyboard — to embrace touch and pen input” (one can hope for a much better experience while using OneNote). While multi-touch laptops aren’t — and probably won’t be — a mainstream choice for business and home users anytime soon, touch is an essential component of smartphones and tablets. The pen may be making a comeback too, judging by the popularity of Samsung’s stylus-equipped Galaxy Note. Office 2013 will allow you to swipe a finger across the screen to turn a page; pinch and zoom to read documents; and write with a finger or stylus — just as you do on your smartphone or tab. Additionally, when you write an email by hand, Office 2013 will automatically convert it to text. The user interface has been modified (especially the Ribbon feature — it has been flattened, or as Microsoft likes to call it, ‘Metrified’). While this may seem a bit odd when you see it on a desktop, you may appreciate it more when you try using it on a tablet PC or on your smartphone.

The new ‘Metro’ look

Microsoft loves the Metro user interface, which was first introduced in Windows Phone 7 around two years ago. Since then, Metro has become the user interface of the future for Microsoft and the company is putting it in all its products. Office 2013 too has been given a Metro makeover. It is a slick interface with clean lines and lots of empty space, and it looks modern.

For the uninitiated

Metro is an internal code name for a typography-based design language created by Microsoft, originally meant for use in Windows Phone 7. Early uses of the Metro principles began as early as Microsoft Encarta 95 and MSN 2.0. Later, these principles evolved into Windows Media Center and Zune. Now they are included in Windows Phone, Microsoft’s website, the Xbox 360 dashboard update, and Windows 8. A key design principle of Metro is a better focus on the content of applications, relying more on typography and less on graphics (‘content before chrome’). WinJS is a JavaScript library by Microsoft for developing Metro applications with HTML.

There are two aspects to the design changes introduced in Office 2013 — visual changes and usability changes. Microsoft thinks that there is no need for any faux chrome or Aero fluff around windows. Hence, the interface has been ‘Metrified’ (that’s how Microsoft likes to say it). The icons have been flattened and things have been cleaned up (i.e., the heavy boundaries, bevelled edges, shadows, etc. are all gone).

In fact, icons are likely to become a thing of the past. Under Metro there will be hardly any need for icons. While some argue that icons were simple (graphic, easy-to-remember) indicators for tools like copy, paste, etc., and that they kinda spruced things up, Microsoft argues that when you have as many as 4,000 such icons, they eat away most of your display area.

Microsoft justifies the Metrification by saying that Office 15 is likely to be used and seen on screens of different shapes and sizes. Consider the screen of a typical smartphone . . . would you rather see the content or the numerous icons? Duh! It’s about getting the content front and centre and trying to get the application stuff out of the way — there when you need it, but out of the way when you don’t. On tablets and smartphones, you want to put the application stuff to one side.

Microsoft thinks that once you get the hang of it, you will appreciate the thought process.

Edit PDF documents in Word 2013

Until now, you could only ‘save’ Office files in PDF format. To edit these or other PDF files, you either had to edit the original Office file and then save it as a PDF (again), or you had to buy third-party software/utilities. Going forward, you will be able to open PDF files, edit them in MS Word 2013, and then save them as Word files or as PDFs.

Word 2013 will maintain the formatting of the PDF, such as headers, columns and footnotes, and elements such as tables and graphics, and permit you to edit them as though they were created in Office 2013 itself.

User feedback suggests that Office 2013 handled simpler PDF files with ease, but it was not so graceful with complex ones containing many images and elements.

Will support Open Document Format (‘ODF’) 1.2

Microsoft fought ODF1 as it became an open international standard (ISO/IEC 26300) by creating its own standard, OOXML (ISO/IEC 29500), and pushing it through standards organisations. But Microsoft has now apparently accepted that ODF has widespread support among other vendors, governments and organisations.

Microsoft already supports ODF 1.1 in Office 2007 SP1, Office 365, SharePoint and SkyDrive WebApps. Now Office 2013 will support ODF 1.2.

ODF 1.2 has already been widely adopted and is supported by, among others, Gnumeric, Google Docs, Zoho Office and AbiWord.

Sharing and embedding web elements like YouTube videos & social media integration — Skype, Flickr

Office 2013 uses SkyDrive to enable better sharing of documents. You can invite people to work on the document or use PowerPoint to give a presentation on the web. Word files can also be published as blogs on several popular blogging services directly from Office 2013.

YouTube videos can now be embedded into documents directly, and users don’t have to save these clips to the local computer. Office 2013 also includes Flickr integration, which allows users to search for photographs on the popular photo-sharing website and embed pictures using Office 2013.

Microsoft acquired Skype last year, and Office 2013 will be the first suite to incorporate the popular VoIP service. You can integrate Skype contacts with Microsoft’s enterprise-oriented Lync communications platform for calling and instant messaging. Office subscribers get 60 minutes of Skype international calls each month.

User feedback suggests that there’s room for improvement, though.

Big Data – II

About this Article

This article is part 2 of the series on Big Data. It briefly deals with issues such as why Big Data is gaining so much importance and what the recent trends in Big Data collection and analysis are. The write-up also discusses some of the technologies being used for Big Data analysis.

The previous write-up briefly touched upon what Big Data is and gave some background on the vital role played by it. This write-up will delve a little further and deal with some of the trends and developments in this arena.

Background:
Big Data, as discussed earlier, is all about collecting, storing and analysing data, and using the results for betterment (one sincerely hopes so). It is typically characterised by features such as volume, velocity, variety and veracity. While Big Data is not entirely a recent development, the manner in which data is gathered, the sources of information, the techniques for storage and the technologies for analysis have evolved significantly in recent times.

Big Data is for Everyone:
Generally speaking, most people believe that Big Data is for large corporations and businesses or for the Government. But the truth is, whether you’re a 5-person shop or part of the Fortune 500, you can have Big Data and it can help you grow and become profitable. Today, anyone who wants to remain competitive has to analyse both internal and external data, as quickly and cost-effectively as possible. This rule applies equally to all types of organisations, big or small, giants or dwarfs.

Right now, you may be asking: how will Big Data help me find opportunities by analysing new sources of data? Here is one small example:

As the world becomes more instrumented, with RFID tags,
sensors and other sources, we are creating more and more data. When
paired with external data – like that generated by social media sites –
there’s incredible opportunity that is largely untapped and unanalysed.
This is where Big Data analysis comes into the picture. Every day,
companies of all sizes “cut through the noise” created by so much data
to find valuable insights.

Big Data analysis can be applied not just to businesses and commercial organisations but to the social sector too. Using the same techniques and tools (i.e., those used for developing marketing and risk management tools), Big Data analysis, when applied to the social sector, has the potential to revolutionise the functioning of those sectors. For instance, imagine the advantages of using Big Data analysis in:

  •  the public sector;

  •  the healthcare sector; or

  •  (to put it more generally,) mainly those sectors where an ethos of treating all citizens in the same way is kind of expected.

Advantages would traverse beyond commercials to the realm of mass social betterment.

How Big Data is Used:
Big Data allows organisations to create highly specific segmentations and to tailor products and services precisely to meet customers’ needs.

Consumer
goods and service companies that have used segmentation for many years
are beginning to deploy ever more sophisticated Big Data techniques,
such as the real-time micro-segmentation of customers to target
promotions and advertising. As they create and store more transactional
data in digital form, organisations can collect more accurate and
detailed performance data, in real or near real time, on everything from
product inventories to personnel sick days. Information Technology is
used to instrument processes and then set up controlled experiments.

Data
generated therefrom is used to understand the root causes of the
results, thus enabling leaders to make decisions and implement change.
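The segmentation idea above can be illustrated with a toy example. This is a minimal sketch, not any vendor’s actual method; the segment names, thresholds and customer records are all invented for illustration:

```python
# Illustrative micro-segmentation: bucket customers by recency and spend.
# Segment labels, thresholds and the customer data are invented.
customers = [
    {"id": 1, "days_since_purchase": 3,  "monthly_spend": 9000},
    {"id": 2, "days_since_purchase": 45, "monthly_spend": 1200},
    {"id": 3, "days_since_purchase": 10, "monthly_spend": 300},
]

def segment(c):
    """Map one customer record to a segment (and hence a targeted action)."""
    if c["days_since_purchase"] <= 7 and c["monthly_spend"] >= 5000:
        return "active-high-value"   # e.g. offer a loyalty upgrade
    if c["days_since_purchase"] > 30:
        return "lapsing"             # e.g. send a win-back promotion
    return "regular"

for c in customers:
    print(c["id"], segment(c))
```

Real micro-segmentation would of course use far more signals and update in near real time; the point is simply that each customer is mapped to a segment, and each segment to a targeted action.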

Big Data technologies:
Some of the key Big Data technologies which are in play are described below:

  •  Cassandra: An open source (free) database management system, designed to handle huge amounts of data on a distributed system. It was originally developed at Facebook and is now managed as a project of the Apache Software Foundation.

  •  Dynamo: Proprietary software developed by Amazon.

  •  Hadoop: An open source software framework for processing huge datasets on certain kinds of problems on a distributed system. Its development was inspired by Google’s MapReduce and the Google File System. It was originally developed at Yahoo! and is now managed as a project of the Apache Software Foundation.

  •  R: An open source programming language and software environment for statistical computing and graphics. The R language has become a de facto standard among statisticians and is widely used for statistical software development and data analysis. R is part of the GNU Project, a collaboration that supports open source projects.

  •  HBase: An open source (free), distributed, non-relational database modelled on Google’s BigTable. It was originally developed by Powerset and is now managed as a project of the Apache Software Foundation as part of Hadoop.

  •  MapReduce: A software framework introduced by Google for processing huge datasets on certain kinds of problems on a distributed system. This too has been implemented in Hadoop.

  •  Stream processing: Also known as event stream processing. This refers
    to technologies designed to process large real-time streams of event
    data. Stream processing enables applications such as algorithmic trading
    in financial services, RFID event processing applications, fraud
    detection, process monitoring, and location-based services in
    telecommunications.

  •  Visualisation: This refers to technologies used for creating images, diagrams, or animations to communicate a message; they are often used to synthesise the results of Big Data analyses. Some instances of visualisation are: tag clouds, clustergrams, history flows, etc.
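To make the MapReduce idea a little more concrete, here is a single-machine sketch of its map, shuffle and reduce phases in Python, using the classic word-count example. In a real Hadoop cluster each phase runs in parallel across many machines; this toy version only illustrates the flow of data:

```python
from collections import defaultdict

def map_phase(documents):
    """Map step: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield (word, 1)

def shuffle(pairs):
    """Shuffle step: group all the emitted counts by word."""
    grouped = defaultdict(list)
    for word, count in pairs:
        grouped[word].append(count)
    return grouped

def reduce_phase(grouped):
    """Reduce step: sum the counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data is big", "data about data"]
result = reduce_phase(shuffle(map_phase(docs)))
print(result)  # {'big': 2, 'data': 3, 'is': 1, 'about': 1}
```

The appeal of the model is that the map and reduce steps are independent per key, which is what lets a framework like Hadoop distribute them over thousands of machines.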

Myths surrounding Big Data:
While there are many myths surrounding Big Data, for the purpose of this write-up I have briefly summarised a few myths commonly associated with it. These are:

Big Data is only about massive volumes of data:

As discussed in part 1, volume is only one of the factors. Generally, the industry considers petabytes of data as a starting point. However, that is only a starting point; there are other aspects, such as velocity, variety and veracity, to deal with.

Big Data means unstructured data:

While variety is an important characteristic, it should be understood in terms of the format in which the data is gathered and stored. Many people mistakenly believe that the data would be in an unstructured format. As a matter of fact, the term “unstructured” is misleading to a certain extent, because it does not take into account the many varying and subtle structures typically associated with Big Data types. Candidly, many industry insiders admit that Big Data may well have different data types within the same set that do not share the same structure. Some suggest that a better way to describe Big Data would be to term it “multi-structured”.

Big Data is a silver bullet type solution:

This is an avoidable pitfall. Most businesses tend to believe that Big Data is a silver bullet for their growth strategy. The applications available offer only one of the means to analyse data. Applying the learnings from the analysis is altogether a different thing. What needs to be understood is that Big Data is only a means to the end and not the end itself.

What to expect in future:

  •  Big Data will be an important driver of business activities in the future. Almost all businesses will leverage the insights from Big Data based research to hone their strategy. Be it innovation, competition or value addition, Big Data’s contribution will be significant.

  •  The impact of Big Data will span across sectors. Among these, health sciences and natural sciences are likely to have a positive impact on the larger society.

  •  One can expect that the sources of data and the volume of data itself will grow exponentially. Consequently, the data integration process will become more efficient.

  •  There will be a demand for talented personnel. Notably, the demand will not be restricted to personnel possessing the requisite skills for collecting and analysing Big Data. The need will be for personnel who know how to use the results of Big Data analysis in effective decision making.

  •  Decision making as we know it (and practise it) today is likely to undergo a drastic change. Sophisticated analytics can substantially improve decision making, minimise risks, and unearth valuable insights that would otherwise remain hidden.

  •  We are likely to see a sea change in the regulatory environment, mainly related to privacy, intellectual property rights and public liability.

Well, this concludes part 2 of the write-up on Big Data. In my next write-up I intend to deal with “the (ab)use of social media”. I intend to cover some (disturbing) trends that have caught the attention of many. It’s still a thought, but the idea is fresh.

Disclaimer: The information/factual data provided in the above write up is based on several news reports, articles, etc., available in the public domain. The purpose of this write up is not to promote or malign any person or company or entity, the purpose is merely to create an awareness and share the knowledge that is already available in the public domain.

Crowdsourcing

About this article:
This write-up is (in a manner of speaking) a continuation of the previous write-up on mass collaboration. The basic idea remains the same: there is a large problem, capable of being broken into several small manageable parts. The task, though simple to humans, is difficult for computers to achieve (as yet). This idea is applied differently to achieve a variety of objectives. Some are commercial and then there are others which contribute to the growth of society as a whole.

Background:
The term ‘crowdsourcing’, as you may have already guessed, is a derivative of the words ‘crowd’ and ‘sourcing’. While the phrase was first coined by Jeff Howe in a June 2006 Wired magazine article, you may be surprised to know that the concept was being commonly applied for several years before that. A few examples which have become huge:

  • Wikipedia
  • Captcha and recaptcha

Some lesser known examples:

  • Brooke Bond/Lipton runs a slogan contest; the winner of the slogan gets a cash reward (and Brooke Bond gets 1000+ new catchy slogans for future marketing, virtually for free);

  • An ad agency organises a photography contest. Contestants use their own cameras and film. They are given themes/concepts and come up with innovative ideas/snaps. The ad agency spends on promoting the event and some refreshments for the contestants. After the contest the ad agency retains all the photos (1000s of ideas, virtually for free);

  • Very recently, two leading business houses in India announced in print and media that they would invest in start-ups. They invited entrepreneurs all over the country (and abroad) to register and share their ideas (basic idea, sample model, estimates of commercials). Everyone would be given the opportunity to make an ‘elevator’ pitch. Once again, 1000+ ideas, virtually for free.

And then there are some black sheep . . .

  • Remember Speak Asia? If you do some digging you may find that similar schemes were floated on the African continent . . . very successful . . . all stakeholders made money. Somehow the idea didn’t click in India.

  • If you have seen Die Hard 4: the villain uses the skills of amateur hackers to develop a code, and this code is used to disrupt systems.

If you look at any of the above-mentioned ideas, you may agree that all of them were simple ideas, brilliantly executed.

What is crowdsourcing and how does it work:

Simply put, crowdsourcing is a distributed problem-solving and production model. Typically, a problem is broadcast to an unknown group of solvers in the form of an open call for solutions. The ‘users’ or the crowd (i.e., the online community) come together and submit solutions. Yet another crowd sifts through these solutions and finds the more acceptable/better ones. These solutions are then owned by the broadcasting agency (i.e., the crowdsourcer). The winning solutions are sometimes rewarded, sometimes monetarily, sometimes with a prize or recognition (i.e., the contributors are paid crumbs and the broadcaster keeps the cake).
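The cycle just described (broadcast, collect, sift, reward) can be modelled in a few lines of code. This is purely a toy illustration; the contest, the solvers and the voters are all invented:

```python
# Toy model of the crowdsourcing cycle: broadcast a problem, collect
# solutions from one crowd, let another crowd vote, pick the winner.
from collections import Counter

def run_contest(problem, solvers, voters):
    solutions = [solve(problem) for solve in solvers]    # open call for solutions
    votes = Counter(vote(solutions) for vote in voters)  # the sifting crowd
    winner, _ = votes.most_common(1)[0]                  # crowdsourcer keeps this
    return winner

# Hypothetical slogan contest: two entries, three voters.
solvers = [lambda p: "Taste the moment", lambda p: "Sip happens"]
voters = [lambda s: s[0], lambda s: s[1], lambda s: s[1]]  # each voter picks one entry
print(run_contest("new tea slogan", solvers, voters))  # prints "Sip happens"
```

In real platforms the “voters” may be editors, peer reviewers or simple popularity metrics, but the structure of the exchange is essentially this.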

Advantages of crowdsourcing:
Without getting into the ethical aspect of the subject, one needs to appreciate that there are certain advantages that crowdsourcing can offer:

  • Problems can be explored at a comparatively small cost, often very quickly.

  • It is possible to achieve a win-win proposition sans monetary compensation; the best examples are Luis von Ahn’s reCAPTCHA and the effort to translate Wikipedia’s German version.

  • Crowdsourcing makes it possible to tap a wider range of talent (or prospective customers) than is normally feasible. The best example: the auto industry has been using social media to source ideas from prospective customers, ideas about car design, features, accessories, etc.
  • Resultant rewards have potential of spurring activities — more entrepreneurship, growth in business, investments, employment, etc.

Criticism about crowdsourcing:

  • Once the crowd starts contributing, somebody has to sort and sift through the information. This is a costly affair; unless the right resources are used, the costs outweigh the benefits;

  • The absence of monetary compensation increases the likelihood of the project failing. Without money one may face problems such as fewer participants, lower quality of work, lack of personal interest in the project/results, etc.;

  • Barter may not always be possible;

  • Risk mitigation through contracts may not be possible, since there are no written contracts or non-disclosure agreements, nor, for that matter, transparency about how the information will be used;

  • Difficulty in managing and maintaining a working relationship with the crowd throughout the duration of the project;

  • Susceptibility to faulty results and failure is still too high.

Though there are several pros and cons, so far the perception has been positive. With the success of ideas like reCAPTCHA and the translation project, people have started believing in crowdsourcing’s potential to balance global inequalities. A rather tall claim, but it’s still a wait-and-watch situation.

I would like to end this write-up by sharing my experience with crowdsourcing. Sometime ago, I downloaded a free app on my phone called Waze. At the time I didn’t know that it was a crowdsourcing app. However after using the app, I have (kind of) started leaning in favour of crowdsourcing and hope to see more developments in this field.

Waze app:

Waze is a free iPhone app which crowdsources real-time traffic and navigation data. The application has an advantage in that it provides information which is ‘almost’ real-time and updated. It is quite different from the usual navigation/GPS systems because, apart from providing information about routes, Waze also provides information about traffic, the speed at which the traffic is moving (it’s been a mixed experience for me) and roads under construction (this is based on user inputs and is quite accurate). If there is an obstruction or an accident and the road gets blocked, users can send an instant update and all users will be pinged instantly.

The best part is that most of the time the user simply has to switch on the application and leave it on. The software keeps tracking your speed (using GPS and your GPRS/3G bandwidth) and broadcasts this information to other users. If your car slows down, the app sends you a prompt asking if you are stuck in traffic. The information is broadcast almost instantly (I have noted that it goes out within 5-10 seconds).
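The slowdown prompt described above can be approximated with a very simple heuristic: average the last few GPS speed samples and flag a traffic jam when the average drops below a threshold. This is only a guess at the kind of logic involved (Waze’s actual algorithm is not public), with invented numbers:

```python
# Sketch of a slowdown heuristic: average recent GPS speed samples and
# flag "stuck in traffic" below a threshold. Window and threshold are invented.
def stuck_in_traffic(speed_samples_kmph, window=5, threshold=8.0):
    recent = speed_samples_kmph[-window:]          # last few GPS readings
    return sum(recent) / len(recent) < threshold   # True -> prompt the user

cruising = [42, 40, 38, 41, 39]       # steady highway speeds
jammed = [40, 22, 9, 4, 2, 1]         # sharp slowdown into a crawl

print(stuck_in_traffic(cruising))  # False
print(stuck_in_traffic(jammed))    # True
```

Averaging over a window, rather than reacting to a single reading, avoids false alarms from a brief stop at a signal.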

I have been using the app intermittently and have found it quite useful for avoiding traffic. I have benefitted from updates quite a few times, and that’s why I rate it as a pretty good ‘time-saving app’. While the app is free, there is a downside: the constant tracking can drain your battery and, unless you have a good data plan, it will also drain your wallet.

That’s all for this month. Next month is likely to be dominated with the budget proposals, but I promise that I will have some interesting ideas and stories to share with you.

Cheers.


Microsoft Office 2013 – Part II

About this write-up

MS Office is a popular suite of application software and enjoys wide usage across the world. Recently, Microsoft released the Customer Preview of the latest version of its Office suite, i.e. Microsoft Office 2013 (a.k.a. Office 15). This write-up briefly discusses some of the new features likely to be introduced in the new software, product enhancements to existing features, and some pros and cons associated therewith.

Background

This write-up is the second part of the article on MS Office 2013. The first part dealt with some of the new features that are expected to be a part of MS Office 2013. Some of the features described there were:

  • Cloud integration
  • Touch and stylus based interface
  • The new “Metro” look
  • Convenience of editing PDF documents in MS Word 2013
  • Support for the Open Document Format (“ODF”) 1.2
  • Social media-related integration

In this part, we will look at some of the enhancements and new features which are expected to be a part of MS Office 2013. While there are many features one could write about, given below is a short summary of the changes/new features that you may find useful:

MS Word:

Right from the first moment you start Word, you will notice the crisp new interface. The basic interface has been changed (“Metrified”). The ribbon has been changed (Microsoft has made it flatter) to appear more spacious. One of the reasons for this is that, when MS Word is used on a smart phone, the look and feel and the user experience while switching from desktop to tablet/smart phone should appear seamless.

Besides the above, cleaning up the main interface has the effect of giving more space and allows the user to focus on the document itself rather than the tools (which are supposed to help and not hinder). To be candid, when I migrated from Office 2003 to Office 2007, the one convenience I appreciated the most was that the interface allowed me to work on the document/spreadsheet. All the tools that I would need were neatly organised on the ribbon. Whenever the need arose, they were only 2 or 3 clicks away or (most of the time) just a right-click away. I haven’t had the chance to use MS Word 2013, but I have a feeling that the experience is going to be even better.

The Read Mode feature is yet another feature to look forward to. It is particularly aimed at tablet users. As the name suggests, this feature is for reading. When you switch to the Read Mode, the interface is literally reduced to a bare minimum, thus allowing the document to reflow and fit into the screen. One can say it’s almost like a full-screen mode. The interface provides “thumb friendly” buttons on either side of the screen for easy navigation. Some users may be in for some disappointment, because this mode allows the reader to read only one document at a time. Duh . . . you wanted to read, right? What else do you want?

The track changes feature too has been improved for a better user experience. The user interface in Word 2013 uses a simpler mark-up look, which appears less overwhelming (for many) and intimidating (for some) than the earlier red (strikethroughs) and blue (bold/underline) mark-ups. The new Markup view presents the final version of the document with indicators in the margin to flag the sentences which may have been edited. Whenever you are ready to focus on the changes, just click on the indicator line and it will expand into a thread. Users may find this feature particularly useful while collaborating with others.

One of the conveniences that has been discontinued in MS Office 2013 is the option to add a spelling to AutoCorrect by right-clicking. This feature was introduced way back in MS Office 97 (I think) and was an instant hit. It was very useful for correcting typos, the kind you make while typing any document; for instance, you type “o fthe” instead of “of the”. Earlier, all the user needed to do was right-click and, instead of just correcting that one instance, add it to AutoCorrect and save the effort for all similar typos. MS Office 2013 will no longer offer this convenience . . . but don’t despair, you can still go to the AutoCorrect menu and add the same. The only difficulty will be finding it.

EXCEL:

Once again, the basic interface has been Metrified, i.e., it looks and feels very crisp. The look and feel is common across all the other applications of MS Office 2013.

If you thought that MS Excel was an outstanding product, the latest version of Excel is even better. Microsoft has added some awesome tools, and the Quick Analysis tool is one of them. In the earlier version, if you selected a range of cells with numbers, nothing happened. In MS Office 2013 Excel, if you select a range of cells with numbers, a Quick Analysis tool pops up next to the selected range and gives you a variety of options like conditional formatting, charts showing most of the information, formulae, table formats and in-cell sparklines (introduced in Office 2010). Hover over any of the options and you will see it applied either to the data or in one of the pop-up charts. The suggestions are intuitive and change according to the data highlighted. While the overall number of options remains the same, the interface will suggest some of the options (such as why a particular chart or a pivot table may be more suitable) which you may find useful.

Next in line is the Chart Advisor. An early prototype was featured on Office Labs, and it has now been fully integrated along with the other analytical tools in Excel. One can say it’s a plain-vanilla version of professional business analytics tools. With the Chart Advisor, the likelihood of getting the right chart or pivot table in the first attempt itself is far higher, which, many may agree, translates to tremendous savings in time. Guess that’s one up for artificial intelligence.

The previous avatar of MS Office, i.e. Office 2010, brought in several features which kind of added “jazz” to Excel. The current version, i.e. Office 2013, has focussed more on functionality than on “jazz”. But that does not mean that there is no “jazz” added in MS Office 2013. As a matter of fact, the error function (i.e. the indicator which highlights errors or inconsistencies) has been spruced up quite a bit. For instance, if you move between cells or add or delete some figures that lead to a change in some other result or formula, you are likely to see subtle animations drawing your attention to what has changed. So what’s new, eh? Well, for starters, if the change is in the displayed area/sheet then the animation is, let’s just say, less animated, and if it’s in a different sheet the animation is a bit more animated. If you click the cell, there will be on-screen prompts to lead you to whatever it is that Excel intends to draw your attention to. This makes it much harder to change or delete information that alters your results without noticing that it makes a difference. Sounds exciting, doesn’t it?

Even the error messages are more useful. For instance, suppose you drag a cell across the worksheet when what you really meant to do was click somewhere else: the older version would give you a fairly “cryptic” warning, but this will not be a problem in MS Office 2013. Now Excel gives you a warning in a far simpler, more descriptive manner, suggesting what’s wrong. Add to this, there is now a whole new add-in to look for errors and inconsistencies between worksheets.

The Time Slicer and the Quick Analysis tools are some other tools to look forward to. The Time Slicer tool helps you dig further into your data. For instance, it organises data by date, so you can filter down to a specific period or jump through figures month by month to see the differences. The Quick Analysis is like a shortcut of sorts for making sense of your data as it is, or one may say it is a way to preview different visuals: you’ll see various formatting options, and as you hover over them you’ll see the document change accordingly, giving you a glimpse of what you’ll get if you end up selecting that option. This is quite similar to the formatting and fonts preview available since the Office 2007 days.

In MS Office 97, Microsoft introduced the AutoFill feature. It’s one of the features that I have come to appreciate over a period of time. It is an excellent tool to use when filling up data in tables. Flash Fill apparently is a step up. Flash Fill is a feature that recognises your data patterns, to the point where it should be able to predict what belongs in the remaining blank cells and fill them in for you. For example, if you were to make a time-sheet spreadsheet detailing which client time was spent on and by which employee, Excel would eventually pick up on the pattern of which employees have worked on the client/specific project and fill up the data for you. For instance, every Saturday is booked for internal filing, etc.; in theory, you just have to enter some of that data and then go to the Data tab, where you press the Flash Fill button to make it fill in the rest. A bit of caution here: available feedback indicates that Flash Fill is not able to interpret or pick up on trends in “all” data.
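Flash Fill is an instance of what computer scientists call “programming by example”. Microsoft’s actual implementation does sophisticated program synthesis, but the core idea can be sketched as testing a small library of candidate transformations against the user’s one worked example and applying whichever candidate reproduces it. Everything below (the candidate list, the sample names) is invented for illustration:

```python
# Toy "fill by example": try each candidate transformation against the
# user's worked example; apply the first one that reproduces it.
CANDIDATES = [
    ("upper-case", str.upper),
    ("first word", lambda s: s.split()[0]),
    ("initials", lambda s: "".join(w[0].upper() for w in s.split())),
]

def fill_by_example(example_in, example_out, remaining):
    for name, fn in CANDIDATES:
        if fn(example_in) == example_out:              # pattern recognised
            return name, [fn(cell) for cell in remaining]
    return None, remaining                             # no pattern matched

# The user fills one cell ("Asha Rao" -> "AR"); the rest are predicted.
rule, filled = fill_by_example("Asha Rao", "AR", ["Vikram Mehta", "Neel Shah"])
print(rule, filled)  # initials ['VM', 'NS']
```

The caution in the paragraph above maps directly onto this sketch: if none of the candidate transformations explains the example, the tool simply cannot fill in the rest.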

There are several other features to write about, but maybe in the future, once I lay my hands on the official version. Well, that’s all for this month. Wish you a Happy Diwali in advance.

Disclaimer: The discussion regarding the features and enhancements contained in this write-up is based on the various feedback/reviews available on the internet and in various magazines, blogs, etc. The purpose of this write-up is only to share knowledge and not to malign any person or product.

Big Data – What is it all About?

About this article

Big Data is not a very new idea; it’s been out there for quite some time. Nonetheless, very few people have realised the full potential of this idea. To highlight a few advantages, Big Data can help businesses become more efficient, help them service their customers better and at the same time improve their bottom line. In a completely different sphere of life, Big Data helps various research organisations track a variety of data, such as meteorological data, data related to clinical tests conducted, etc.

Be it business establishments like eBay, Amazon and Facebook, or research organisations like NASA, the UN, Governments across the world, etc., the one common link among all those who use Big Data is technology. This article seeks to create awareness about how technology is used to store and analyse Big Data. Like all big ideas, there are several stories (successes as well as failures), myths, etc. associated with it. This article will deal with some of the successes and failures.

Background

Ever wondered how a weather bureau predicts the weather? Or, for that matter, how organisations like NASA and ISRO monitor space (in case you didn’t know already, apart from secretly tracking UFOs), which includes tracking various stars, planets, meteorites, comets, spacecraft, satellites and millions of objects of floating junk which were in some form or another part of a satellite or some cargo carried by satellites? Then there is the curious case of the measurements that scientists make, such as those in a nuclear test or at the Hadron Collider. How about mapping the human genome? Did you know that there are more than a billion unique data sets?

I know that sounds hugely futuristic, and the question that begs to be answered is “What do I care?” or “How does it matter to me?” Well, let’s just say that what is described above are some of the sources and users of Big Data. Closer to home, or to our everyday life, Big Data is used by giants like Facebook, Amazon and Walmart, to name a few, for improving customer experience.

Characteristics of Big Data:

Well, to be honest, “Big Data” is more of a term which was coined in reference to the data. What I mean is that there is no “official” definition of “Big Data” or, for that matter, “Small Data”. But, generally speaking, Big Data refers to data characterised by four features, i.e. volume, variety, velocity and veracity. To understand this better, let’s take a few illustrations of these characteristics that are closely identified with Big Data:

Volume:
Today, businesses everywhere are awash with ever-growing data of all types. Conservatively speaking, they collect huge amounts of data (often the volume is in terabytes, in some cases petabytes, of information).

For instance, someone like Twitter would churn x terabytes of tweets created each day into improved product sentiment analysis. Someone like General Electric is likely to convert billions of annual meter readings into better predictions of power consumption. One company boasts of systems which track (crime-related) events and can help Governments reduce crime rates.

Velocity:

Sometimes, a few minutes is too late. For certain time-sensitive processes, such as catching fraud, Big Data must be used as it streams into your enterprise in order to maximise its value.

For instance, exchanges like the Bombay Stock Exchange, the National Stock Exchange, etc., scrutinise millions of trade events created each day to identify potential fraud (like the punching error reported very recently). A couple of weeks ago (and even in the past), these exchanges had assisted SEBI in pinpointing instances of circular trading and front running.

Variety: For the readers of this Journal, data would mean spreadsheets, word documents, accounting records, etc. But in reality, there is a vast variety of forms/formats in which data can exist. In the case of Big Data, data may be of any type: structured and unstructured data, text data, sensor data, audio, video, click streams, log files and more. Typically, new insights are found when all these different types of data are put together and analysed from a specific point of reference, or from a variety of them.

The classic examples of this would be Facebook, Amazon, etc., and, if I may dare to say so, “algorithmic trading solutions”. It is said that in some cases the “algos” are so advanced that they analyse tweets and social media trends for “sentiment” and execute trades on the basis of such analysis alone.

Veracity:
What role does veracity have to play here? Imagine this: you spend a fortune putting in place a system to collect data. Thereafter, the data is stored before an analysis is made. What good would the collection, storage and analysis be if the data collected was inaccurate? Further, customers part with their data willingly (most of the time unknowingly), and expect that their privacy will not be violated. Statistically speaking, one in three business leaders don’t trust the information they use to make decisions.

How can you act upon information if you don’t trust it? Establishing trust in Big Data presents a huge challenge as the variety and number of sources grow.

Big Data – has been out there for some time:

Most people go under the assumption that Big Data is a recent phenomenon. But that’s not quite true. As a matter of fact, companies like American Express and Google have been using Big Data in some form or the other to analyse and predict customer behaviour, with a view to enhancing customer service and public perception. While this may or may not be true, the fact remains that the amount of data captured and analysed in the last two to three years far exceeds the total data (in volume and variety) captured over the last millennium (at the least).

Big Data – recent changes:
What most people don’t realise is the manner and extent to which changes have taken place in the last couple of years. To begin with, storage space has increased dramatically, and our ability to process such data has been growing exponentially. One could also attribute some positives to technological advancement, the development of new analytical models, etc. Given all these, our need for, manner of use, and the very application of such data has undergone a sea change (one may say, a change of epic proportions). Here is why:

  • Walmart handles more
    than 1 million customer transactions every hour, which are imported into
    databases estimated to contain more than 2.5 petabytes of data.
  • Facebook handles 40 billion photos from its user base.
  • FICO Falcon Credit Card Fraud Detection System protects 2.1 billion active accounts world-wide.
  • Decoding the human genome originally took 10 years to process; now it can be achieved in one week.
  • There
    are 4.6 billion mobile-phone subscriptions worldwide and there are
    between 1 billion and 2 billion people accessing the internet.
  •  Between
    1990 and 2005, more than 1 billion people worldwide entered the middle
    class, which means more and more people who gain money will become more
    literate, which in turn leads to information growth.
  • The world’s effective capacity to exchange information through telecommunication networks was
      • 281 petabytes in 1986,
      • 471 petabytes in 1993,
      • 2.2 exabytes in 2000, and
      • 65 exabytes in 2007.
  • It is predicted that the amount of traffic flowing over the internet will reach 667 exabytes annually by 2013. (Source: Wikipedia)

How big is “Big data”:

Consider this. In 2012, the Obama administration announced the Big Data Research and Development Initiative, which explored how Big Data could be used to address important problems facing the government. The initiative was composed of 84 different Big Data programs spread across six departments. The United States Federal Government owns six of the ten most powerful supercomputers in the world.

Big Data has increased the demand for information management specialists, due to which software giants the likes of Software AG, Oracle Corporation, IBM, Microsoft, SAP, and HP have spent more than $15 billion on software firms specialising only in data management and analytics. This industry on its own is estimated to be worth more than $100 billion. That's not all, it is reported to be growing at almost 10% a year, which is roughly twice as fast as the software business as a whole.

In the Indian scenario, the Big Data industry is expected to grow from $200 million in 2012 to $1 billion in 2015, at a CAGR of over 83%. Nasscom predicts that Big Data will help the BPO industry move forward, as it will enable "evidence-based" decision-making for clients, which in turn has a high impact on business operations.

Can we ignore Big data?

The answer seems to be a resounding NO. Why? Because, to remain competitive, all organisations need to analyse both internal and external data, as quickly and cost-effectively as possible. As the world becomes more instrumented, with RFID tags, sensors and other sources, companies are creating more and more data. When paired with external data – like that generated by social media sites – there is incredible opportunity that is largely untapped and unanalysed.

Parting remarks:

This write-up was intended to be a precursor – to give readers a basic overview of Big Data. In the next part, we will cover some more ground, delve into more detail, understand what all the hype is about, and see whether or not there is a hidden pot of gold at the end of the rainbow.

Until then, I wish all the readers a Happy Dassera.

Disclaimer: The information/factual data provided in the above write-up is based on several news reports, articles, etc., available in the public domain. The purpose of this write-up is not to promote or malign any person or company or entity. The purpose is merely to create awareness and share knowledge that is already available in the public domain.

Keyboard Short cuts for BlackBerry Devices


TECH UPDATE

This article is about simple keyboard short cuts for BlackBerry devices. Keyboard short cuts help in improving our typing speed and, in many cases, in navigating between applications. The tips mentioned in this write-up apply to 8800 series and later devices; they may or may not work on older devices.

Everybody likes short cuts:

In general, we are all lazy in one way or another. If one were told that he could do the same task with less effort (and without compromising on the output), the first question he would ask is 'How do I do that?' and the answer would be 'Use a short cut'. While keyboard short cuts like CTRL+C and CTRL+X are used extensively, this article is about short cuts for your BlackBerry devices. Yes! There are keyboard short cuts for BlackBerry devices also (i.e., beyond the standard short cuts for copy, paste and send). Here are some instances which you may find useful:

Rapidly switch back and forth between BlackBerry applications:

The average desktop, or for that matter a laptop, contains a smart chip. The chip is called smart because it contains multiple processing cores. As the name suggests, these are capable of performing several tasks and executing processes simultaneously. Among other things, this allows a user to switch from one task to another without compromising on speed. The switch is almost instantaneous when you use a desktop or a laptop. This agility, however, is not available on your BlackBerry. The explanation is simple; the BlackBerry device (like other competing smart phones) uses a simpler processor.

So how does one get around this handicap?

Simple . . . use a short cut.

The most basic way to switch from one BlackBerry application
to another is to repeatedly hit the ‘ESCAPE’ key while inside a programme until
you get back to your icon screen. From there, you’d scroll your track ball or
wheel to find the next application you want and then click to launch it.

A quicker and more efficient way to go from an active program to another is to use a short cut. While inside an application, hold down the 'ALT' key (directly below the letter 'A' key) and then click 'ESCAPE' (the key with an arrow reversing directions, to the right of your trackball on 8000 series devices). While holding down 'ALT', you can scroll left or right between apps, and you need only release the 'ALT' key to select a program. (For this, a program needs to have been opened recently or to be still running.) You can always access your Home Screen, BlackBerry browser, Options, Call Log, Messages and a few other applications, depending on your device settings.

Using the event log:

Your BlackBerry’s Event Log displays your system’s recently
run events and processes. If you’re experiencing a problem with your BlackBerry
or having an issue with a specific application or service, information from the
Event Log can be helpful for troubleshooting. It can also be good BlackBerry
hygiene to clear out the log, to keep your device running smoothly.

To access your Event Log, go to your Home Screen, hold down the ALT key and then type 'LGLG'. The Event Log will appear, and you can click a specific event for more information or hit your BlackBerry MENU key for more options. (The MENU key has seven dots in the shape of the letter B, and it's found directly to the left of the trackball on BlackBerry devices.) You can copy event information using the MENU key and tailor your settings to log only specific types of events.

Freeing up some memory space:

You can also free up some valuable device memory to help your
device run faster by
clearing your Event Log. To delete your list of events, hit the BlackBerry MENU
key while any event is highlighted and then click ‘Clear Log’. A dialogue box
will pop up asking if you’re sure you want to delete the log. Once you confirm
the deletion, your log will be cleared. (Don’t worry, if your IT Department is
running device management software along with its BlackBerry Enterprise Server,
your company probably has its own record of this event log.)

Reboot your BlackBerry without removing the battery:

Any BlackBerry veteran knows that sometimes it is necessary
to reboot your device after installing a new application, to solve performance
problems, refresh your Smartphone’s memory or fix other minor issues. One way to
do so is to remove your battery door and pull the power pack. After the battery
is returned to the device, your BlackBerry reboots. This gets the job done, but
it’s time consuming to power down the device and then remove and replace the
battery and your battery door won’t fit as snugly if you’re constantly taking it
off.

The quickest and easiest way to reboot is via another BlackBerry keyboard short cut. To reboot, simply hit 'ALT', 'RIGHT SHIFT' and 'DELETE'. (The RIGHT SHIFT key is found on the bottom right corner of the BlackBerry keyboard, and the DELETE key is also on the right hand side and has the letters 'DEL' on its face.) You might say this is the BlackBerry version of CTRL+ALT+DEL. After pressing these three keys together, your device powers down, your LED indicator turns red for a few seconds and the reboot process commences.

Change your signal strength display from bars to numeric:

Most modern cell phones offer up some form of 'five-bars' display of the user's wireless signal strength, and the BlackBerry default mode is no different. But if you want more precision than bars can offer, you can change to the numeric signal strength display mode. The numeric mode shows wireless signal strength in decibel-milliwatts (dBm), the power ratio in decibels (dB) referenced to one milliwatt (mW).
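For readers who like numbers, the dBm scale is just a logarithmic restatement of power. Here is a minimal Python sketch (purely illustrative; the function names are my own, this is not something the handset itself runs) of how a milliwatt figure maps to dBm and back:

```python
import math

def mw_to_dbm(power_mw: float) -> float:
    """Convert power in milliwatts (mW) to decibel-milliwatts (dBm)."""
    return 10 * math.log10(power_mw)

def dbm_to_mw(power_dbm: float) -> float:
    """Convert a dBm reading back to milliwatts."""
    return 10 ** (power_dbm / 10)

# 1 mW is 0 dBm by definition, and every 10 dB step is a tenfold
# change in power, which is why a 'very strong' -45 dBm reading is
# actually a tiny absolute power.
print(mw_to_dbm(1))    # 0.0
print(dbm_to_mw(-45))  # ~3.16e-05 mW
```

The logarithmic scale explains why the difference between -85 and -100 matters so much: each step of 10 on the display is a tenfold change in received power.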

To switch from bars to numbers, navigate to your BlackBerry
home screen, hold the ‘ALT’ key and enter in ‘NMLL’. The signal display will
then automatically display a dBm value. In general, a reading from -45 to -85 is
considered very strong. Any reading that’s lower than -85 — for instance, -100 —
is weaker. To switch back to bar mode from numeric, just hit ALT again and
retype ‘NMLL’.

The numeric display can be helpful to determine accurately how much a wireless signal degrades as you move from place to place. (It's also geek chic to read your cellular signal strength in dBm instead of boring old bars.)

Bring up ‘Help Me’ screen for device, system data:

Your device's 'Help Me' screen displays useful device and system information such as your vendor ID, the version of the BlackBerry platform, the OS version, your PIN, the International Mobile Equipment Identity (IMEI) number, etc. While most of this information is available at various locations throughout your BlackBerry Options, the Help Me screen offers a simple way to access all the data in a single place.

To pull up the Help Me screen, navigate to your Home screen and then press ‘ALT’, either ‘SHIFT’ key and the letter ‘H’. To return to your Home screen, hit ESCAPE or open the MENU and select Close.

That’s all for this month. You can email your feedback to me on sam.client@gmail.com. Do look forward to my next write-up on the topic of cloud computing.

Social Networking – Be Careful Out There – II

About this Article

This write-up is Part 2 of a three-part series on the topic. The previous write-up was aimed at creating awareness about some of the myths and misconceptions related to the use of social networking sites.

While recent events have had the effect of an eye opener for some people, there are many others who throw caution to the wind.

This article highlights some simple steps and safe practices which may help in making your experience a safe one rather than a sorry one.

Background

The previous write up briefly discussed some of the myths and misconceptions related to the use of social networking sites. It also focussed on the complete lack of awareness on how personal information is stored, accessed and made available on the internet. The more shocking revelation being that the information is, more often than not, revealed with or without the permission of the person who was most likely to be affected by such a revelation.

 The key takeaways from the previous write up were:

• Social networking sites aren’t responsible for your privacy…. you are!!!

• Default settings on the site, may or may not provide adequate protection.

• When social networking sites change their privacy policy, they may or may not tell you about the changes made; more importantly, they may not tell you how your "personal" information is about to become more public.

• The privacy policy of the social networking site does not extend to its partners (i.e. app and other third party service providers).

• When something is provided to you free of cost, it doesn’t mean that there is no cost attached. On the contrary, it means that someone else is footing the bill. And that ‘someone’ is going to extract something of value (like your private info) in return.

• Social networking is a paradox – you are posting data meant to be private on a medium which is meant to be public.

Risks

Very recently, Facebook acknowledged that their servers were hacked. While the company said that there was no data loss/damage done, there is no way of knowing for sure whether that was a fact. This may come as a surprise to some people, however, for others it was something that they always expected to happen.

Given the nature and amount of data collected and stored by some of the social networking sites, it was obvious that sooner or later, they would be targets of cyber criminals.

A curious person would ask: what does the social networking site have that may be of interest to anyone other than the users? Or: the information posted by me is harmless, so what damage can a hacker do to me?

A short list of the risks involved is as under:

• All your private information, either about yourself or your friends, their likes or dislikes will be compromised.

• Someone could use this information to bully you or cyber-stalk you or your friends.

• The information may be used for inappropriate or illegal purposes including phishing, cyber frauds, hacking someone else’s account, etc.

 • It is also possible that your ‘views’ about someone or something may be disclosed to the very person and there would be consequences.

• Your name and details may be used to spread viruses, spam, malware, etc.

• Someone may hijack your email account or Facebook page and post some damaging information.

Steps to Safe Social Networking Experience

It is important to remind the readers that there is very little we can do against a determined hacker or a skilled scamster. After all, considering that the networking sites with all their resources couldn't do much, can you do any better? It is therefore imperative that you take steps to reduce the impact of any damage that may be caused. Listed below are a few 'counter measures' that may be useful:

Don’t succumb to peer pressure:

Peer pressure is like a double-edged sword: at times it forces you to excel, and then there are times when you succumb to it, and in that moment of weakness it can lead to disastrous consequences.

Don't let peer pressure or what other people are doing on these sites convince you to do something you are not comfortable with. Stay within your limits. Remember, just like the spoken word cannot be taken back, what you post on these sites cannot be erased (not very easily). It will remain in the system no matter what.

Keep personal information out:

Generally people have a tendency to post personal information like their phone number, photos of their home or their work place, school or date of birth, etc.

Just stop for a minute and think about it. This is the same information that a hacker would need to access your bank account, your credit card, etc. Do you really want to leave this information out in the open?

Keep your profile closed, allow only your friends to view the profile. Else, for a skilled hacker or a scamster, you would be a sitting duck, ripe for the kill.

Mask your identity:

Be very wary of posting any personal data. If possible, use a nick name or an alias (commonly referred to as a 'handle').
It's very easy to set up a separate email account to register and receive information from the site.

The advantage being that should you ever feel the need to close the account or stop using the social networking site, you needn't stop using your primary mail account.


Use strong passwords:

Remember, the password is the weakest link in the chain. Birthdates, locations and nicknames are too common; you don't need a super computer to figure out these types of passwords. The hacker will have a look at your profile and the information will be sitting right in front of his eyes.

Make sure that you use a combination of upper and lower case plus numbers and special characters. It doesn’t have to be very difficult.

Common daily-use sentences like 'I travel by western railway' can also be converted into a unique password by making use of a combination of upper and lower case characters along with symbols. Something as obvious as BCAS 2013 can be written as '8©@S2013' and it would become many times more difficult to guess or hack, yet easy for you to remember.
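To make the trick concrete, here is a toy Python sketch of the phrase-to-password idea described above. The substitution table (letters swapped for look-alike symbols) and the function name are assumptions of this sketch, not a prescribed standard:

```python
# Illustrative look-alike substitutions; the choice of symbols here is
# an assumption for this sketch, not a recommended scheme.
SUBSTITUTIONS = {"a": "@", "b": "8", "e": "3", "i": "!", "o": "0", "s": "$"}

def phrase_to_password(phrase: str) -> str:
    """Build a password from the first letter of each word in an
    easy-to-remember phrase: alternate the case of the letters, then
    swap in look-alike symbols where the table has one."""
    first_letters = [word[0] for word in phrase.split()]
    mixed_case = [
        ch.upper() if i % 2 else ch.lower()
        for i, ch in enumerate(first_letters)
    ]
    return "".join(SUBSTITUTIONS.get(ch.lower(), ch) for ch in mixed_case)

print(phrase_to_password("I travel by western railway"))  # !T8Wr
```

Of course, no generator replaces judgment; the point is only that a memorable phrase can yield a password mixing cases, numbers, symbols and letters.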

Social networking vs. venting out

Social networking and venting out are two separate things. Remember that what goes online stays online.

Don’t say anything or publish pictures that may cause you or someone else embarrassment.

Never post comments that are abusive, or those that may cause offence to either individuals or groups of society.

Recently, many companies have started (re)viewing current and prospective employees’ social networking pages. The slightest indiscretion and you are likely to be on your way out.

What you say can and will be used against you

Who actually owns and who controls “your” intellectual content that you post is not as clear as you might think. This also raises the question: If you don’t own it, can you really control it?

 Terms of usage vary with every social networking service. It is more likely, that as soon as you sign up, you give up control of how your content may be used.

Be careful in choosing your friends:

It's age-old advice. Be that as it may, it applies to your offline as well as online friends. Be wary about who you invite or accept invitations from. Be aware of what friends post about you or reply to your posts, particularly about your personal details and activities.

Never disclose private information when social networking. Most importantly, be careful of clicking on links in an email or social networking post, even if it's from your friend (in some cases, especially if it's from your 'friend').

One of the biggest mistakes you can make is to accept friend requests from people you don’t know. When you do that, you are inviting people you know nothing about to share your personal information.

When your friends share information about you on their networks that you'd rather keep private, contact them and request them to remove the damaging information. Some sites may also permit you to remove any tags that your friends use to identify you in their posts.

Guard against phishing:

Be guarded about who you let join your network. Use the privacy settings to restrict strangers from accessing your profile. Be on guard against phishing scams, including fake friend requests and posts from individuals or companies inviting you to visit other pages or sites. If you do get caught in a scam, make sure you remove any corresponding likes and app permissions from your account.

Don’t be afraid to block specific users or set individual privacy settings for certain sensitive posts and information.

While all of the above discussed 'counter measures' may not offer complete protection, you may be saved from a total disaster. After all, prevention is always better than cure.

The next write up (the third and concluding part) will deal with the specific issue of changing your privacy settings (i.e. location) and some basic steps on what to do if your account is hacked.

Using the Internet for mass collaboration

About this article:

This article is based on a video of Luis von Ahn aired on the popular site www.ted.com. The video itself was recorded sometime around April 2011.

Every once in a while you come across something, an idea or a vision, that knocks you down completely. The thing that strikes you the most is the simplicity. This article is about one such idea and how a few individuals have used their minds to harness the energies of millions and millions of people to help make a difference.

The Internet, as a resource, is viewed differently by different individuals. For some it is a source of information and knowledge, for others it is a means of earning a livelihood, and then there are those who are able to use their limitless imagination and ingenuity to effortlessly harness the power and labour of millions and millions of individuals, to achieve the unbelievable or the next to impossible.

One honest confession I need to make: while I had heard about mass collaboration and had seen its practical application (one example being Wikipedia), when I first saw this video, I was completely awestruck and blown away.

Here are a few statistics to tell you why:
  • Currently, more than 350,000 websites are using these ideas.
  • Time spent per day is equivalent to 500,000 man-hours.
  • The number of words digitised by these sites exceeds 100 million a day — that's the equivalent of the effort required to digitise (approx.) 2.5 million books a year.
  • The effort put in is all done one word at a time/10 seconds per person, by approximately 500 million people.
Mind you! This is just a sample of what limitless imagination and ingenuity can achieve.
So what is this mind-boggling, out-of-the-box idea that I am raving about? Well . . . all I can say is three words: CAPTCHA, RECAPTCHA & DUOLINGO.
CAPTCHA:

Captcha = Completely Automated Public Turing test to tell Computers and Humans Apart
'What's that, you said?' is a very common response, so let me translate it into non-geek language.
Let's say you are trying to register or log into sites like Google, Facebook, Twitter (and several others) and you see some oddly distorted letters/words (see picture below).
These seemingly innocuous letters (or text pieces) are a common sight today. While most recognise these as a security feature, fewer web surfers know that these are tools for identifying whether the person accessing the site is a human being or a computer (bot), hence the name: Completely Automated Public Turing test to tell Computers and Humans Apart.
For those of you who are unaware, unlike humans, a 'bot' cannot read distorted words. When you type the (correct) words in the box, it proves that you are human and the website allows you to register/access content/purchase goods/make reservations, etc.
Over a period of time, Captcha has become (almost) a standard security feature. In the video, von Ahn revealed that (by April 2011) there were more than 350,000 websites using Captcha, and approximately 500 million users every day were spending 10 seconds each while accessing various e-commerce sites.
The first reaction to the above is 'WOW' — 350,000 websites, 500 million users. von Ahn too felt a sense of pride that his invention was being used by so many people, but then he also realised that each of these 500 million users was spending 10 seconds during the verification process, and this translated to 500,000 man-hours (approx.). Then came the thought, "Is there something I can do to utilise this effort to do something — something huge but simple — something that machines cannot do (as yet) as efficiently as humans can?" Needless to say, stopping the use of Captcha, given its benefits, was not an option. This thought was the seed of another research effort, resulting in what is commonly known as RECAPTCHA.

RECAPTCHA: von Ahn and his associate/intern came up with this idea on the basis of the findings of their research. The idea was to use the 500,000 man-hours of effort to digitise books. There are several projects doing this already, including one being pursued by Google. It is common knowledge amongst most people involved in the endeavour to digitise books that computers, and more specifically optical character recognition (OCR) technology, are applied for digitising books. Typically, this involves one person using a scanner device to scan one page at a time and then waiting for the OCR software to convert the scanned image into a document. What is not very commonly known (at least to the public at large) is that the technology is not 100% accurate. Machines, and for that matter computers/software, are at times not able to 'recognise' many of the characters scanned by them. This is more so when the book being scanned is older than 10 years. The difficulty arises due to a variety of factors such as the typeface used, yellowing of the pages, creases in the pages, and the wear and tear/condition of the book. In all such cases, human effort is required (computers cannot do it as easily as humans).

Thus, RECAPTCHA was born. Once again the idea was a simple one: the visitor was presented with two words (instead of one in Captcha), one which was known to the software and the other which was required to be 'recognised'. When both words were recognised, the visitor was granted access to the site he was visiting. All the while, in the background, RECAPTCHA was comparing this result with the responses provided by another 10 users (who were given the same combination). If the results matched, then another word was digitised.
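The matching step described above can be sketched in a few lines of Python. This is a simplified reconstruction of the idea only, not reCAPTCHA's actual code, and the 70% agreement threshold is an assumed value for illustration:

```python
from collections import Counter

def consensus_reading(transcriptions, threshold=0.7):
    """Given several users' attempts at the same unknown word, accept a
    reading only when a clear majority agrees (the threshold is an
    assumed parameter, not reCAPTCHA's published one); otherwise return
    None, meaning 'keep showing the word to more users'."""
    if not transcriptions:
        return None
    word, count = Counter(transcriptions).most_common(1)[0]
    if count / len(transcriptions) >= threshold:
        return word
    return None

# Eight of ten users read the smudged word as 'upon': accepted.
print(consensus_reading(["upon"] * 8 + ["apon"] * 2))  # upon
# Only two conflicting readings so far: undecided.
print(consensus_reading(["upon", "apon"]))             # None
```

The elegance of the design is that each individual contributes only seconds of effort, yet cross-checking the same word across many users makes the pooled answer far more reliable than any single reading.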

Once again the idea was a runaway success — the number of words digitised by these sites exceeds 100 million a day; that's the equivalent of the effort required to digitise (approx.) 2.5 million books a year. Given its success, RECAPTCHA was acquired by Google.

von Ahn and team revisited their question and embarked on yet another journey. This time they decided that all the parties involved in the process should have something to gain. In Captcha, human effort was used to verify users' status as humans; while this helped the website owners, it resulted in wastage of human effort. Recaptcha used this human effort to convert books; once again website owners and book readers gained, but the visitors assisting in the digitising process were not being compensated. This thought gave birth to DUOLINGO.

DUOLINGO:

Just like digitising books, translating content is another 'skill' which machines/software do not possess (as yet). It's one thing to merely translate words and a different thing to translate the words with context. It is the context in which the words are spoken which makes the text readable and, by that measure, more comprehensible. If you don't believe me, try using the translators available for converting a poem in Hindi to English and vice versa (no offence, but it's like watching a Chinese movie dubbed in Tamil — the tone/pitch of the dialogue or a fight scene versus the body language — I have always found it hilarious — try it sometime). Coming back to the topic . . . von Ahn and team came up with the idea of DUOLINGO.
What von Ahn and team realised was that there was content on the web which needed to be translated. The video cites the example of translating content from the English version of Wikipedia to the Spanish version — currently the Spanish version is only 30% of the English version, and the cost of converting it — as the video suggests — using the lowest-cost vendor, based on the effort of exploited labourers in a third-world country, was $50 million. Cost apart, the other quandary was where to find enough people who knew more than one language and were willing to participate in the translation process. The solution: there are hundreds of thousands of people who want to learn another language and have to pay money to do so; here was an opportunity to learn and apply at the same time, without spending anything from their pockets. Now there is a win-win for almost all!

  •     Content can be translated
  •     With context, translation is easier, fun, improves the learning/experience
  •     The accuracy is far higher than that offered by software currently available and almost comparable to the accuracy of a professional translator
  •     Both parties don’t pay money but put in their ‘efforts’
  •     Both parties gain
  •     And, in hindsight, less exploitation of labour

The result: based on current stats the translation can be done in a matter of weeks.

Now that’s what I call innovation.

Like I said earlier, I was completely blown away when I saw the video, I am sure after reading this write-up (and maybe watching the video) you will be too.

Wish you a Merry Christmas and a Happy New Year.

Disclaimer:

The purpose of this article is not to promote any particular site or person or software. The sole intention is to create awareness and to bring into the limelight some thought-provoking content.

Social networking – Be careful out there – I

About this Article

Social networking is "hep" and the "in thing" nowadays. The entire Generation Y is hooked on it. Undoubtedly, it is a convenient way to connect with family, friends and other people. But that's the bright side; what most people don't realise is that there is a dark side too. This article is aimed at highlighting some of the perils of social networking sites, specifically related to the privacy of the account holder.

Background

Today, it's a common feature to see teenagers hooked on to social networking sites all the time, as if it were a life support system. What's more, teenagers are likely to have several friends and connections online or in the virtual world, even when continents, distances and time zones may separate them. Sometimes, it is at the cost of having friends in the physical (or real) world.

Come to think of it, it really isn’t all that different from the past. I mean that, once upon a time it was “hep” to have pen friends, then email, bulletin boards and chat rooms became a fad. One could say that it’s the same old wine in a new bottle – today you have friends, followers and connections on Facebook, Twitter and Linkedin (to name a few popular social networking sites).

Agreed, it's a convenient way to connect with family, friends and other people with common interests. And with the technological advances today, it's almost effortless, because the site does all the work of finding all your "long lost" friends, colleagues and relatives. Many times, these sites offer "suggestions" regarding people you may be interested in connecting to or groups you may want to follow. This, you may say, is the bright side of social networking. However, what people don't know (or care enough to know) is that there is a dark side as well.

“Nahhh!!!!! Can’t be!!!! Social networking is harmless banter, we are jus hangin out, what’s wrong with that?????

Chill yaar, you are just being paranoid.” I am sure that you have heard this before. Well, you are about to get a rude awakening.

The Dark Side

A couple of weeks ago, a furore was raised in the press and all over the internet, when two teenaged girls were hauled to the police station for posting some innocuous status updates on one of the popular social networking sites. A lot was written on how the law enforcers should have acted, how draconian the internet law is when it comes to freedom of speech and, of course, on the whole debate of what should be done (or should not be done) and who is responsible (or irresponsible). Despite all this noise and chatter about the who, what, where and when, most people missed out on a little known 'open secret'. What's this 'open secret', you may ask?

Well, forget all the chatter and the noise for a moment and think, how many people actually gave a thought to the following:

• How did the mob come to know of the “personal” post?

• Were they friends with the teen who posted the message?

• Did the teen intend that persons other than her “friends” see the post/tweet?

• Can persons other than one’s friends see his/her posts/tweets?

• How can anyone see my posts/tweets?

• And of course, the million dollar question that begs to be answered –

How did they get the address of the teen who posted the update, and the vital information that the teen was located within (ahem) striking distance? This becomes a ten million dollar question when you ask: if they were not friends and they were not connected, were they supposed (allowed) to see such personal information (i.e., the location of the person putting up the post)?

In all the printed press, news reports, countless tweets and Facebook updates, there is hardly a peep about these questions. If you were a conspiracy theorist, you would know for sure that “something just ain’t right here”. You may have guessed it by now …. Nobody noticed (and in all probability it is likely to remain unnoticed) that the real transgression was a compromise of the privacy of your personal data.

By the way, if you didn’t ask this question earlier, then it would be a good indicator that you too have chosen to remain blissfully unaware of “what’s out there”.

The Ugly Truth

SOCIAL NETWORKS AREN’T RESPONSIBLE FOR YOUR PRIVACY – YOU ARE. What most people (individuals who use social media regularly and extensively) fail to realise is that you are parting with some very vital and sensitive personal information right from the time you open an account with these social networking sites. It’s pretty standard to give information such as your full name, where you live, what you do, what you like (or dislike) and your date of birth. You post pictures of yourself and your family, your precious possessions, your triumphs, etc. And to top it all, you literally “strive” to keep this information updated every day (and in some cases, every waking moment). You take solace (my view – choose to remain blissfully unaware) in thinking that:

• this information is with the site;

• it’s secure, behind layers of security;

• they have a privacy policy, they can’t share it with anyone;

• only my friends and connections can see it;

• It’s harmless banter (yeah!!!, really!!! Do make it a point to tell it to the mob when they come visiting);

• I will delete it after some time.

But as they say, “ideal” and “real” are two completely different and mutually exclusive things. Some open secrets that you must know:

Default Settings:

When you sign up, the social networking site sets your privacy controls to “default settings”. I am sure there would be several instances where you have accepted the prompt that the settings are at default, without really checking or understanding what “default settings” really means. In some cases, default means that everyone can read your posts and access all the information that you give the site.

Changes in Privacy Policies

While some people are wise enough to check what the default setting is, they sometimes fail to keep track of changes in the privacy policy of the social networking site. What people do not account for is that Privacy Policies can change. In some cases, these sites notify you, but in many cases, by continuing to access the site or use the service, you “by default” agree to the revised Privacy Policy. How is that possible, you ask; I have a right to be informed, they have to tell me!!!

Don’t they?????? All these questions are the types you ask after reality knocks you down. The truth is that it all boils down to the terms and conditions of service. YESSSSS, the ones where you click “AGREED” without even bothering to read what they say, let alone understanding what the implications are.

Somewhere in the fine print, there are terms which say that the service provider is at liberty to alter the terms of the Privacy Policy and that it is your obligation to look them up on a regular basis. Further, if you continue to use the site, it will be presumed that you have read the Privacy Policy and agreed to the revised terms.

Here is a question for you. Google and Facebook have both revised the terms of their Privacy Policies (mainly their policy on what data will be collected and how they intend to use it). They were “kind” enough to send a mail/notification about the change and the date from which the policy would become effective. How many of you saw this mail in your inbox, or the notification when you visited the site? More importantly, how many of you made an attempt to see “broadly” what changes were likely to take place? If you haven’t done it as yet, then rest assured that you will have no one but yourself to blame.

Apps and Games

If you think that you have covered all the bases by reading the Privacy Policy, understanding the terms, agreeing to them and acting very cautiously, even then one could say you have left yourself exposed. Sure, you read the Policy for the hosting site, but what about the apps/games that are made available on the site? More often than not, if your friend has been using one or recommends it, you too sign up, because you want to be with the gang and cannot fall behind. Well, if that is the case, I would say you covered the pin holes but left the manholes wide open. It is quite possible that these apps/games/utilities may have a policy which is quite different from the hosting site’s, and it might not be very protective.

Difference Between Free and Freemium

Just because the service is free or doesn’t cost you anything doesn’t mean that there is no cost attached. It only means that the cost of providing the service to you is being borne/subsidised by someone else, i.e., what is offered to you for free is sold to someone else for a premium (hence the word freemium). Everybody, and I mean everybody (there may be a few exceptions, like the Khan Academy), who is providing some free service to you is selling the data that you generate, in one way or the other, to somebody else. You may not believe it, but every time you say you “like” something, this data is collected, collated and analysed for future sale. Every comment about a product, a service, a brand, etc., be it good or bad, is tracked and stored for future sale. Not only this, if you like a brand, there is a very high probability that the very same social networking site (if not this one, then some other) will help the brand to sell “what you like” to your friends.

Paradox of Social Networking and Privacy:

It’s a paradox because you are posting your personal and private data on a medium whose reason for existence is to promote “openness”. So, on one hand you want the data to be in the public domain, and at the same time you don’t want anyone to see it. Funny, isn’t it!!!! Reminds me of the famous quote from Shakespeare’s Hamlet: “To be or not to be, that is the question”.

There are several issues that still need to be dealt with, but woh kissa phir kabhie (that’s a story for another time).

The next part of this series will focus on some tips on dos and don’ts while posting on social networking sites.

Disclaimer: The information/issues discussed in the above write-up are based on several news reports, articles, etc., available in the public domain. The purpose of this write-up is not to promote or malign any person, company or entity. The purpose is merely to create awareness and share knowledge that is already available in the public domain.
    

Social Networking – Privacy Settings in Facebook

About this write-up:
This is the third and concluding part of the three-part series dealing with security-related issues faced when using popular social networking sites. This write-up deals with some of the settings and describes how and when these settings should be activated. While the suggested changes in the security settings may not guarantee that your personal information is not divulged to unknown persons, they would, however, act as a simple barrier against unwanted prying eyes.

Background

The previous two write-ups briefly highlighted how social networking sites are a boon as well as a bane. A boon, because they help you reach out to your friends, contacts, etc., and connect with like-minded people. However, what most people don’t realise is that you may be parting with a lot of personal information, more than you bargained for and, as a matter of fact, more than you even know or comprehend. It is a known fact that unscrupulous people can use this information for their own gains. Notwithstanding this, whenever disaster strikes, the people who are affected, more often than not, realise that they were sitting ducks.

Need for privacy

Social media sites, as we all know, permit us to meet/connect with other people on the net. Initially, we start off with close friends and relatives, whom we look up on Facebook almost as soon as we open an account. It’s quite likely that they may have asked you, “Are you on Facebook?? Why aren’t you on Facebook???” You know….. giving you the feeling that everybody had boarded the bus to paradise and you were the only person left behind. So, first of all, you connect with them. You also put in all the small personal details about yourself, such as which school/college/university you attended, your date of birth, the locality where you stay/work, your chosen profession, likes and dislikes (yes, that too), etc. All this information is carefully and meticulously ‘harvested’ into humongous databases (read my write-up on Big Data).

The next step in the process of ‘networking’ is to ‘connect’ with like-minded people on Facebook. Suddenly, you will start getting prompts suggesting that such and such person has a similar trait and therefore you may connect. What you don’t know is that when you started punching in personal information, an intelligent algorithm was working behind the scenes, putting all the pieces together. If not that, it was creating a ‘footprint’ for others to ‘find’ you.

While this seems convenient and intuitive, what most people don’t realise is that this very information can be used to ‘target’ you for something nefarious. It is in your interest that you don’t expose yourself to such risks. In order to do that, you need to review your privacy settings and tweak them in a manner that permits you to connect with ease, while protecting you from the villains lurking in the shadows.

Privacy settings

Activating or deactivating privacy settings can be described as drawing a line (something like the proverbial Laxman rekha, one might say), a line beyond which you want to keep intruders out. Conversely, one may say that you draw the line to create a boundary beyond which your personal stuff doesn’t go. Mind you, just as in Indian mythology, the villains will try every trick in the book to lure you; it is for you to realise what’s in your own interest.

Very briefly, the security setting (on Facebook) can be used to:

• Manage how you connect with others
• Select the audience with whom you want to share your personal stuff, and
• Manage how others connect with you (mainly photo tagging)

STEP 1: Manage how you connect with people

In order for you to manage, you first need to know:

• Where to find your privacy settings (a bit obvious, I know, but just in case you didn’t know)
• Privacy shortcuts
• Controlling who can send you friend requests
• Changing the filter preferences for your messages
• Who can see your profile pictures (reminded me of a scene from Shah Rukh Khan Juhi Chawla starrer….where apro SRK says KKKKKKiran….)

So, first things first:

Where are my privacy settings?

To view and adjust your privacy settings:

1. Click in the upper-right corner of any Facebook page
2. Select Privacy Settings from the dropdown menu
3. Click on a setting (ex: Who can see your future posts?) to edit it, or use the left column to view your other settings

What are my privacy shortcuts?

Your privacy shortcuts give you quick access to some of the most widely used privacy settings and tools. Click at the top right of any Facebook page to see shortcuts that help you manage:

• Who can see my stuff?
• Who can contact me?
• How do I stop someone from bothering me?

This is also where you’ll find the latest privacy updates and other helpful tools. The shortcuts you find here may change over time to reflect the settings and tools that are most relevant.

Controlling who can send you friend requests

By default, anyone on Facebook can send you a friend request. If you’d like to change who can send you friend requests:

1. Click at the top of the page.
2. Click Who can contact me?
3. Choose an option from the dropdown menu below Who can send me friend requests?

Changing the filter preferences for your messages

You can change your filter preferences right from your inbox:
1. Go to your Other Inbox
2. Click Edit Preferences
3. Select Basic or Strict filtering
4. Click Save

Messages that are filtered out of your inbox will appear in your Other folder. If a message you’re not interested in gets delivered to your inbox, select Move to Other from the Actions menu. Keep in mind, anyone on Facebook can send you a message, and anyone can email you at your Facebook email address.

Who can see your profile pictures

When you add a new profile picture, here’s what happens:

• The photo is added to your timeline and appears in your Profile Pictures album.
• A thumbnail version of the photo is made and appears next to your name around Facebook. This helps friends identify your posts and comments on Facebook.
• Your current profile picture is public. You can change who can see likes or comments on the photo.

Step 2: Select the audience with whom you want to share your personal stuff

This includes:

• When I share something, how do I choose who can see it?
• How can I use lists to share to a specific group of people?
• Can I change the audience for something I share after I share it?
• How do I control who can see what’s on my timeline?
• What is my activity log?

When I share something, how do I choose who can see it?

You’ll find an audience selector tool most places you share status updates, photos and other stuff. Just click the tool and select who you want to share something with.

The tool remembers the audience you shared with the last time you posted something, and uses the same audience when you share again unless you change it. For example, if you choose Public for a post, your next post will also be Public unless you change the audience when you post. This one tool appears in multiple places, such as your privacy shortcuts and privacy settings. When you make a change to the audience selector tool in one place, the change updates the tool everywhere it appears.

The audience selector also appears alongside things you’ve already shared, so it’s clear who can see each post. If you want to change the audience of a post after you’ve shared it, just click the audience selector and select a new audience.

Bear in mind, when you post to another person’s timeline, that person controls what audience can view the post. Also, anyone who gets tagged in a post may see it, along with their friends.

How can I use lists to share to a specific group of people?

Lists give you an optional way to share with a specific audience. When writing a post or sharing a photo or other content, use the audience selector to pick the list you want to share it with.

Can I change the audience for something I share after I share it?

Yes, you can use the audience selector to change who can see stuff you share on your timeline after you share it. Keep in mind, when you share something on someone else’s timeline, they control the audience for the post.

How do I control who can see what’s on my timeline?

•    You can share basic information like your hometown or birthday when you edit your timeline. Click Update Info (under your cover photo) and then click the Edit button next to the box you want to edit. Use the audience selector next to each piece of information to choose who can see that info.

•    Anyone can see your public information, which includes your name, profile picture, cover photo, gender, username, user ID (account number), and networks.

•    Only you and your friends can post to your timeline. When you post something, you can control who sees it by using the audience selector. When other people post on your timeline, you can control who sees it by choosing the audience of the Who can see what others post on your timeline setting.

•    As you edit your info, you can control who sees what by using the audience selector.

•    Before photos, posts and app activities that you’re tagged in appear on your timeline, you can approve or dismiss them by turning on timeline review. Keep in mind, you can still be tagged, and the tagged content (ex: photo, post) is shared with the audience the person who posted it selected other places on Facebook (ex: News Feed and search).

•    Set an audience for who can see posts you’ve been tagged in on your timeline.

•    To see what your timeline looks like to other people, use the View As tool.

What is my activity log?

Your activity log is a tool that lets you review and manage what you share on Facebook. Only you can see your activity log.

Step 3: Manage how others connect with you— mainly photo tagging

This includes

•    How do I remove a tag from a photo or post I’m tagged in?
•    What is timeline review? How do I turn timeline review on?
•    How do I review tags that people add to my posts before they appear?
•    How do I control who sees posts and photos that I’m tagged in on my timeline?
•    How can I turn off tag suggestions for photos of me?

How do I remove a tag from a photo or post I’m tagged in?

Hover over the story, click and select Report/Remove Tag from the dropdown menu. You can then choose to remove the tag or ask the person who posted it to take it down.

You can also remove tags from multiple photos at once:

1.    Go to your activity log
2.    Click Photos in the left-hand column
3.    Select the photos you’d like to remove a tag from
4.    Click Report/Remove Tags at the top of the page
5.    Click Untag Photos to confirm

Remember, when you remove a tag, that tag will no longer appear on the post or photo, but that post or photo is still visible to the audience it’s shared with other places on Facebook, such as in News Feed and search.

What is timeline review? How do I turn timeline review on?

Posts you’re tagged in can appear in News Feed, search and other places on Facebook. Timeline review is part of your activity log and lets you choose whether these posts also appear on your timeline.

When people you’re not friends with tag you in a post, it automatically goes to timeline review. If you would also like to review tags by friends, you can turn on timeline review for tags from anyone:

1.    Click at the top right of any Facebook page and select Account Settings

2.    In the left-hand column, click Timeline and Tagging

3.    Look for the setting Review posts friends tag you in before they appear on your timeline? and click Edit to the far right

4.    Select Enabled from the dropdown menu

How do I review tags that people add to my posts before they appear?

Tag review is an option that lets you approve or dismiss tags that people add to your posts. When you turn it on, then anytime someone tags a photo or post you made, that tag won’t appear until you approve it. To turn on tag review:

1.    Click at the top right of any Facebook page and select Account Settings

2.    In the left-hand column, click Timeline and Tagging

3.    Look for the setting Review tags friends add to your own posts on Facebook? and click Edit to the far right

4.    Select Enabled from the dropdown menu

When tag review is on, you’ll get a notification when you have a post to review. You can approve or ignore the tag request by going to the content itself.

It’s important to highlight that when you approve a tag, the person tagged and their friends may see your post. If you don’t want your post to be visible to the friends of the person tagged, you can adjust this setting. Simply click on the audience selector next to the story, select Custom, and uncheck the Friends of those tagged and event guests box.

How do I control who sees posts and photos that I’m tagged in on my timeline?

To choose who can see posts you’ve been tagged in after they appear on your timeline:

1.    Click at the top right of any Facebook page and select Account Settings

2.    In the left-hand column, click Timeline and Tagging

3.    Look for the setting Who can see posts you’ve been tagged in on your timeline? and click Edit to the far right

4. Choose an audience from the dropdown menu

You can review photos and posts you’re tagged in before they appear on your timeline by turning on timeline review. Keep in mind, photos and posts you hide from your timeline are visible to the audience they’re shared with other places on Facebook, such as in News Feed and search.

How can I turn off tag suggestions for photos of me?

To choose who sees suggestions to tag you in photos:

1.    Click at the top right of any Facebook page and choose Account Settings

2.    Click Timeline and Tagging from the left-hand column

3.    Under the How can I manage tags people add and tagging suggestions? section, click Who sees tag suggestions when photos that look like you are uploaded?

4. Select your preference from the dropdown menu

When you turn off tag suggestions, Facebook won’t suggest that people tag you when photos look like you. The template that we created to enable the tag suggestions feature will also be deleted. Note that friends will still be able to tag photos of you.

Well, these were the basics.

If you want to learn more, visit http://www.facebook.com/help/privacy; alternatively, you can do a Google search and you will find several useful links to help you on this issue (not only for Facebook).

Disclaimer: The purpose of this write-up is to spread awareness, promote ethical and safe computing practices and share knowledge. This write-up does not seek to discredit or malign any particular person, corporation or business in any manner whatsoever.

Mobile Payments — the future trend

This write-up discusses some of the prevailing trends and products available for making payment by using a mobile phone. While there is a lot of similarity in the payment process, there are subtle differences in technologies used and accompanying advantages/ disadvantages. This write-up seeks to highlight some of the differences.

To say that the advent of mobile telephony in India has changed the lives of countless millions would be stating the obvious. Today, mobile phones are not just a means of communication; they are much more. I am sure that neither Alexander Graham Bell (who invented the telephone in 1876) nor Dr. Martin Cooper (who is credited with designing the first practical mobile phone back in 1973) ever imagined that one day their invention would be used to:

  • Flash1 one’s status (funky, snooty, VFM)
  • Collect memories (photos)
  • Stay connected (Facebook & Twitter)
  • Keep updated (news, alerts)
  • Entertain (music, video)
  • Transact (m-commerce)
  • Influence people (Obama’s election campaign) 

Be that as it may, today mobile phones are an integral part of our day-to-day environment and (at the cost of repeating myself2), their importance and our dependence on this marvel of technology are growing by the day. The phone has become the hub for all our activities, from e-mailing and browsing to paying bills and transferring money. In fact, mobile phones are fast replacing credit/debit/ATM cards (plastic money) as a convenient mode of transacting. For the uninitiated, please watch the recent ads put up by Airtel and IndusInd Bank. There are several active players3, and they offer the same or similar services, for a charge (of course). Here it is important to understand what is on offer, and then pare down expectations accordingly.

How does a mobile banking/wallet work?

Mobile banking (not to be confused with phone banking) allows you to conduct financial transactions on your phone just as you would at a bank branch or through Net banking. Banks are now evolving this facility as they launch innovative products (this sometimes entails installing an app on your phone). In the mobile banking segment, all telecom companies have tie-ups with different banks that allow you to avail of banking services.

The process is pretty simple, and the steps could be something like:

  • Register with the service provider: open an account with the concerned bank or telecom company.
  •  In case of a bank — register for Net banking.
  • Use a Java-based phone4.
  • Activate GPRS services on your connection, so that you can access the Net5.
  • Install the banks phone app.

To transfer funds, you will have to:

  • Log in using the bank’s app menu and input the mobile phone number or bank account number of the beneficiary.
  •  Message the PIN you receive from the bank to the beneficiary who will also receive a secret number.
  • The recipient will have to log in with both PINs at the ATM to withdraw the money.
  • If the funds are being transferred to a bank account, it will take about four working days.

Practical applications:

IndusInd Bank’s cash-to-mobile service enables customers to transfer money to anybody, including those who do not have an IndusInd Bank account. A bank customer is required to download the bank’s app on his phone, and then put in the phone number of the person to whom he wants to send the money, along with the transaction amount. The bank sends a message to the remitter and the beneficiary, along with different PINs to each. The remitter is required to message his PIN to the beneficiary, who can then use both PINs and his mobile number to withdraw cash from an IndusInd Bank ATM. The service is free, but operator charges would apply. Also, the sender will need a Java-enabled handset. Airtel Money has a different offering.
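Purely as an illustration, the two-PIN flow described above can be sketched in a few lines of Python. Everything here (function names, four-digit codes, the exact checks) is an assumption made for the sketch; the bank’s actual protocol is not public.

```python
import secrets

def initiate_transfer(amount_rs):
    """Bank side (assumed): issue one PIN to the remitter and a
    separate secret number to the beneficiary, four digits each."""
    return {
        "amount_rs": amount_rs,
        "remitter_pin": f"{secrets.randbelow(10_000):04d}",
        "beneficiary_secret": f"{secrets.randbelow(10_000):04d}",
    }

def atm_withdraw(txn, entered_pin, entered_secret):
    """ATM side (assumed): cash is released only when BOTH codes match --
    the PIN the remitter messaged on, and the beneficiary's own secret."""
    return (entered_pin == txn["remitter_pin"]
            and entered_secret == txn["beneficiary_secret"])

txn = initiate_transfer(1_000)
# The remitter SMSes txn["remitter_pin"] to the beneficiary,
# who then keys in both codes at the ATM:
assert atm_withdraw(txn, txn["remitter_pin"], txn["beneficiary_secret"])
```

The point of splitting the secret across two channels is that neither the remitter’s SMS nor the bank’s message to the beneficiary is, on its own, enough to withdraw the cash.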

Airtel Money can be used on any mobile phone, and you can register for it by dialling *404# or at an authorised Airtel Money retailer. There are two types of accounts. The first one is an express account, wherein you can load Rs.10,000 and use it to pay utility bills or to book rail/flight tickets on travel portals. The upgraded version is called a power account, which can be loaded with amounts up to Rs.50,000. This can be done through Net banking or an Airtel Money retailer.

Charges?

There is a minimum fee for each transaction. For instance, a transfer of up to Rs.500 will cost Rs.5, while higher transactions, up to Rs.10,000, will entail a fee of Rs.10. Under mobile banking, apart from the transaction charge, one also pays Internet charges and SMS charges to the service provider.
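Treating the slabs quoted above purely as an illustration (actual tariffs vary by provider and change over time), the fee schedule can be written out as a small Python function:

```python
def transfer_fee(amount_rs):
    """Illustrative wallet transfer fee, per the slabs quoted in the
    text: up to Rs.500 costs Rs.5; above that, up to the Rs.10,000
    per-transaction cap, costs Rs.10."""
    if amount_rs <= 0:
        raise ValueError("amount must be positive")
    if amount_rs <= 500:
        return 5
    if amount_rs <= 10_000:
        return 10
    raise ValueError("exceeds the Rs.10,000 per-transaction cap")

print(transfer_fee(500))    # 5
print(transfer_fee(2_000))  # 10
```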

Other considerations:

The Reserve Bank of India (RBI) has capped the transaction limit at Rs.10,000 for all essential services like ticketing, utility bill payments, etc. For non-essential transactions, the limit is set at Rs.5,000. There is also a ceiling of Rs.50,000 for loading the wallet.

While online banking has picked up pace, mobile banking is currently subdued. One reason for this is that whenever a new technology is introduced in the market, it takes time for people to familiarise themselves with it, which is why the growth is slow. Phone technology is another problem area, as there are different platforms of mobile banking for different phones. Also, let us not forget the whole business of bandwidth — all these applications need secure and good connections.

 Presently, most banks have decided to take one step at a time. They are not pushing hardcore banking services, but only presenting mobile banking as an enquiry tool to entice customers to carry out transactions. For example, SMS alerts for bill payment may tempt you to pay the bill through the phone itself.

What’s in store for the future?

Notwithstanding the above, the advent of smart phones has definitely spelt good news for the mobile banking segment. Why? For starters, the younger generation today prefers to use mobiles more than PCs. Secondly, statistics7 suggest that there are approximately 13 million Internet users in the country, as against 911 million mobile phone users. Obviously, the numbers would justify future trends and investments.

This decade belongs to mobile telephony, and the use of phones (smart or otherwise) is going to be the trend of the future. Until then, bonne chance.

1    On April 3, 1973, Dr. Martin Cooper showed off to his rival Joel Engel, head of research at AT&T’s Bell Labs, by placing a call to him while walking the streets of New York City, talking on the first Motorola DynaTAC prototype.

2    Refer to this feature in the BCAJ March 2010.

3    Airtel, Oxicash, Paymate, ICICI, Citi, IndusInd, etc.

4    Not required for Airtel Money.

5    Not required for Airtel Money.

6    This is based on the information available in the public domain; there may be other charges/conditions. Readers are expected to do their own due diligence before subscribing to the service.

7    Released by TRAI in February 2012.



Google Hangout – III

About this write-up: This write-up is the third and final part of the series of articles on Google Hangout. It focuses mainly on some of the more popular instant messaging apps, briefly describes some of their features and highlights how Hangout appears to have an edge over its peers. The first write-up dealt with the telecom ecosystem, the different messaging apps/options available to users and the rise and fall of these apps/options over time. The second instalment dealt mainly with services like SMS and BBM and why they are losing momentum. In this write-up, we will briefly look at the current favourites in the instant messaging space and how they compare with Google Hangout (or vice versa, for that matter).

Why instant messaging became popular

The previous write-ups have dealt briefly with why instant messaging apps became popular. Some of the key factors were:

Cost factor: Short Messaging Service (i.e., SMS) became a rage during the period when the cost of voice calls was sky high. Its popularity started declining when the telecom service providers started reducing voice call rates. As a matter of fact, the general perception today is that it is cheaper to call than to send an SMS: at 1p per second, a minute of talking costs just 60 paise, as against Re. 1 for a single SMS of 160 characters.
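Taking the quoted rates at face value (1p per second for a call, Re. 1, i.e., 100 paise, for one SMS), the back-of-the-envelope arithmetic looks like this:

```python
# Illustrative tariffs from the paragraph above; real rates vary by operator.
CALL_PAISE_PER_SECOND = 1
SMS_PAISE = 100  # Re. 1 per message

one_minute_call_paise = CALL_PAISE_PER_SECOND * 60
print(one_minute_call_paise)              # 60
# A full minute of talking costs less than a single SMS:
print(one_minute_call_paise < SMS_PAISE)  # True
```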

Instant communication: The fact that the message would be delivered instantly – almost anywhere in the world – to the person’s phone was a huge advantage over emails. This was true before the Blackberry boys came in and before smart phones joined the race. Even today, a good majority of the population prefers instant messaging to emails. To be candid, I can’t even recall the last time I shared a joke or a personal message with my friends or dear ones over email. As a matter of fact, not a day goes by without one of my colleagues or friends remarking that WhatsApp and the like have made it so much easier to connect with family members.

Ease of use: This is perhaps one of the most important factors, especially where seniors are concerned. The younger generation has always been known to be tech savvy and has the uncanny ability to adapt to the latest technological developments; one could say it thrives on change. As against this, seniors find change unnerving; they prefer the security of the old, tried and tested. The hurdle is even bigger when they have to take a number of steps to achieve the same goal. Instant messaging has changed that significantly. To give you a simple illustration, if you are using WhatsApp and you create groups that include your parents, it gives them an opportunity to know what’s going on. There is a small illustration later in this write-up on this.

Informal communication: Another reason why instant messaging is very popular is that email has generally been associated with formal communication; as against this, instant messaging is perceived to be less formal and mostly casual.

Mass reach: Compare instant messages with voice calls for routine alerts – charges on your debit card, reminders for utility payments, etc. Which would you prefer? My vote would certainly go to instant messages – they are far less intrusive. Imagine receiving a telephone call every time a charge was made on your card or a utility payment was due – one more voice to nag you….
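The cost arithmetic behind the first factor above can be put into a few lines of code. This is a rough, illustrative sketch using assumed tariffs (1 paisa per second for voice, Re. 1 per SMS, 160 characters per SMS part), not actual operator rates:

```python
# Illustrative tariffs only (assumptions, not real operator rates).
VOICE_PAISE_PER_SECOND = 1    # assumed voice tariff: 1 paisa per second
SMS_PAISE_PER_MESSAGE = 100   # assumed SMS tariff: Re. 1 (100 paise) per part
SMS_CHAR_LIMIT = 160          # typical single-SMS character limit (GSM 7-bit)

def voice_cost_paise(seconds: int) -> int:
    """Cost of a voice call of the given duration, in paise."""
    return seconds * VOICE_PAISE_PER_SECOND

def sms_cost_paise(message: str) -> int:
    """Cost of a text message, charging once per 160-character part."""
    parts = -(-len(message) // SMS_CHAR_LIMIT)  # ceiling division
    return parts * SMS_PAISE_PER_MESSAGE

print(voice_cost_paise(60))       # a one-minute call: 60 paise
print(sms_cost_paise("x" * 160))  # one full-length SMS: 100 paise (Re. 1)
print(sms_cost_paise("x" * 200))  # 200 characters need two parts: 200 paise
```

On these assumed rates, a full minute of talking costs less than a single SMS, which is exactly the perception described above.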

That being said, let’s move on to the apps which are popular:

Popular instant messaging apps:

Whatsapp:
This one is my favourite. In fact, I wrote an article recommending this app in the BCAJ. It is one of the few apps (out of 75 on my phone) for which I have paid money – it's free now; of my 75 apps, only 5 are paid and 70 are free.

This app is quite efficient. Apart from allowing you to send text messages, it also lets you send photos, videos and sound files (a feature added after WeChat came on the scene). This app will help you save a lot of money on your phone bill (especially if you have an unlimited data plan). Some of the other useful features include group messaging, location sharing and time stamps. What I particularly like about WhatsApp is that

• it works on a simple GPRS connection as well as Wi-Fi (no need for a data plan)
• I don’t need to add contacts separately (unlike BBM)
• Even if I change my phone, new messages will come to the new phone, even if I don’t have anyone’s PIN
• It works on all popular devices/operating systems

WeChat:
This app is fast gaining popularity, and several ads are being aired on almost all channels. The biggest plus is that apart from texting (and the features described above), users can also send voice messages.

To be honest, I don’t have much comment or experience in using this app. There were a couple of turnoffs however:

• One needs to register an account with WeChat
• Why bother sending a voice message – just call
• Concerns about snooping, given the app's Chinese origins…

Skype:
Skype has been around for several years now and was recently bought out by Microsoft. It is quite popular even today and is available on the desktop as well as on the phone. It became popular because it gave users the ability to hold a real-time voice conference (one to one, one to many or many to many). Many seniors use it to talk to their dear ones living around the world. Once again, I don't have much comment on or experience in using this app. There were a couple of turnoffs, however:

• The app is very resource hungry – takes a lot of space and RAM when in operation

• Voice quality is decent but the video is often grainy and jerky (could be a bandwidth or a hardware issue on either side – I did not face this as much in Google Hangout, though)
• Need to register an account. You can also call regular phones, but (I think) you have to pay charges for this facility

Viber:
This app is also quite popular. The biggest plus is that it allows real-time voice calls. I have tried it from my phone; there is some time lag, but the voice clarity is pretty good (even on GPRS). The app gives you the convenience of group chatting and alerts you as and when your contacts download and activate it on their phones (WhatsApp doesn't give such an alert). It is fairly popular and in many ways scores over Skype on ease of use and speed. Unlike Skype, it doesn't offer video chat. Recently, they have started offering a desktop version.

Google Hangout:
Google has taken its time testing this app, moving from Google Chat to Google Talk and now Hangout. The app works on most smart phones (on the desktop, it's already linked to your Gmail account). The pluses are that it allows you to send text messages and hold video conferences. I have tried it a couple of times, and compared to Skype and FaceTime (iPhone/iPad specific), the video quality is somewhere in between – better than Skype but still miles away from FaceTime. Just last week, I was trying to get on a video chat with someone located in Canada, and after 10 minutes of Skyping he said, “Why don't we switch to Google Hangout? It's so much better.” I think that more or less summed it up for me.

The next write-up will focus on what Google Hangout has to offer and what the future may have in store for users. It will also be the concluding part of this series. Do look forward to it.

Disclaimer: The purpose of this article is not to promote any particular site or person or software. Further, comments about various products and services are based on the user-experience-related information available in the public domain. There is no intention to malign any product or service in any manner whatsoever. The sole intention is to create awareness and to bring into the limelight some thought-provoking content.

Google Hangout – II

At the outset, I would like to mention that when I started penning the series titled Google Hangout, I had intended to cover more than just the features of the Google Hangout app. The intention was to build a bit of background and create awareness of where we were and where we are heading. While there is a growing perception that Google Hangout may be a game-changer, there is very little awareness of the dynamics involved. In this series, I have endeavoured to bring out some of the facts which I thought would put things in perspective. The concluding write-up of this series will cover some of the features which suggest that Google Hangout will change the way we communicate with each other.

About this write-up
: This is the 2nd part of the series of articles on Google Hangout. This write-up focuses mainly on the events before Google Hangout was put up in the public domain and how these events directly or indirectly will influence the things to come—one of which is the success of Google Hangout as an instant messaging app.

The previous article briefly discussed the events related to the development of the app-based ecosystem and the rise and decline of various players in the arena of instant messaging apps. In this write-up, we will discuss some of the popular instant messaging apps, which are relevant today, but in the foreseeable future may become ‘also rans’.

The previous write-up briefly touched upon the early events leading to the advent of the instant messaging ecosystem and the developments that followed, and also discussed why the Short Messaging Service (SMS) became popular. Thereafter, BlackBerry Messenger (BBM) came into the picture and shook up the world. Today, the scene has changed; the tide has turned for BBM, which is fast losing ground to newer and more nimble players in the market. The following paragraphs set out some of the facts which will help readers understand the key factors at play.

The rise and fall of SMS

As discussed in the previous write-up, between the years 2000–2005, SMS was a popular means of communication, the (prohibitive) minimum cost of a voice call being the primary reason. But as time went by, technological advancements, easy and cheap access to mobile telephony, the drop in the minimum cost per call, etc., led to the decline in the popularity of SMS as a means of communication. Once voice call tariffs started to fall, users started realising that there were disadvantages to using SMS beyond the cost difference between an SMS and a voice call: the limit on the number of characters per SMS, the limitations of the phone keypad, the perceived need for rich media compared to plain vanilla text, the abuse of the SMS system by mass advertisers, and so on.

One may say that the role of SMS as an enabler of instant communication reached its peak when it became the de facto means of mass communication. At its peak, SMS was used to exchange greetings during festivals like Diwali, Christmas and Id; banks and other service providers sent updates on transactions related to money transfers, credit card use and bank balances; and users exchanged daily SMSes (containing jokes, positive thoughts, etc.) on a mass scale. A whole new ecosystem had sprung up around mass SMS-ing.

While mass SMS-ing capability was the bright side, there was a dark side too. Mass advertisers started targeting large numbers of mobile users for mass messaging. Consumers across the country started receiving (mostly unwanted) messages. These would range from offers for the services of a plumber or AC repairman to insurance sales, stock trading tips, and so on and so forth. Somehow this development seemed inevitable. Mass marketers had found that their calls to mobile phone users (for selling various products and services) were being ignored due to the caller ID facility. They realised that while a call could be ignored, there was no way to stop someone from sending an SMS. The abuse became so rampant that the Telecom Regulatory Authority of India (TRAI) was forced to clamp down hard. TRAI imposed several restrictions, such as the National Do Not Call Registry, the requirement for mass advertisers to register, and a limit on the number of messages per phone number per day, among other diktats.

Thus, the fate of SMS as a means of instant communication was more or less sealed. Today, savvy users consider SMS-ing not only an expensive option but also a limiting one when they want to reach out to their ever-expanding network of friends.

The rise and (imminent) fall of BBM
During 2005–2010, i.e., right about the time that the reign of SMS was nearing its end, BBM started gaining ground as the de facto means of instant messaging and communication. The BlackBerry (BB) device was already quite popular as a smart device for official communication (i.e., emailing). With a growing number of users and easy access to BB data services, BBM started covering the ground lost by SMS. By 2008-09, BBM was already accepted by the corporate world as a reliable instant messaging service. In the BB world, email was the official means of corporate communication and BBM was the unofficial yet cool means of communication. Users formed groups and used BBM to exchange jokes, positive messages, etc., just as they had used SMS in the past. The advantage that BBM offered was zero cost. As pointed out in my last write-up, while the BBM service itself was free, one would need to purchase a BB device (approx. cost Rs. 18k plus) and pay for the data charges. Another relevant point is that right up to 2007-08, users would restrict their BB subscription to mail and data services; voice calls were used sparingly, apparently because affordable BB voice plans were not yet permitted and the cost of a voice call was far too high. Soon thereafter, various Indian telecom players started offering BB services (data and voice) at an affordable cost. Even then, a user would have to purchase a BB device, the cost of which was quite steep for the common man.

Post 2008, a series of changes took place:

• Increase in the penetration of mobile technology— widespread usage across the country
• Advent of global players in the telecom sector
• Falling rentals for voice calls
• Introduction of 3G technology
• Easy access to Internet, through smart phones
• Introduction of better quality of smart phones
• Rise of the iPhone

While one could argue both ways on all of the above factors, it remains an accepted fact that easy access to the Internet and the availability of cheaper technology – both hardware and software – were the key factors in the upheaval that was to come. By 2008, Nokia had already started losing ground to BB devices; a Nokia phone was no longer a “status symbol”. At that time, users (and to a great extent Nokia too) started realising the perils of not keeping up with the changing times. Users realised that Nokia's Symbian-based phones could not match up to increasing end-user expectations, mainly related to emailing and access to the Internet. Also, BB was uniquely positioned because it offered a full QWERTY keyboard. While other device makers did try to play ‘catch-up', they had already missed the boat.

BB’s troubles really began to surface after the introduction of the iPhone 4. It was slick, user-friendly and, as many industry watchers would say, ‘path-breaking’. There were several reasons for and against the iPhone, some of which are:

• It was expensive but users felt it was worth it.

• While users were restricted to the iOS and iTunes environment, that environment itself provided so much that one did not feel the need to look beyond it. As a matter of fact, the feeling was that none of the other players provided as much.

• There was a paradigm shift in the user interface. While a standard number pad was considered a serious limitation, even the famed QWERTY keyboard and track ball/pad seemed laborious compared to the iPhone’s touch-based interface.

The dynamics were completely stacked against the QWERTY keyboard when Apple introduced Siri, its much-touted voice-based interface.

• The ease of Internet access gave much-needed succour to Internet-dependent apps like Google Talk and WhatsApp. (Here I would confess that I started using an iPhone just about then, around December 2009, and at the instance of my mentor, installed WhatsApp. To be candid, I was more than happy to see my phone bill go down due to the lower number of SMSes.)

• The Apple App Store made sure that more and more apps (free as well as paid) kept cropping up. Users were spoilt for choice. It was only a matter of time before they realised the limitations of BBM’s offering.

• The increasing popularity of the apps marketplace and the introduction of iPhone clones were strong signs that the days of BB, and as a natural consequence BBM, were numbered.

• Many IT players saw the growing popularity of instant messaging apps and felt that there was a gap between what BBM had to offer and what consumers at large were expecting. Thus, WhatsApp, WeChat, etc., came into the market. Text-only messaging was destined to be a thing of the past; people were already expecting more. Between WhatsApp and WeChat, they got to send voice messages, videos, pictures, map locations, etc. It seemed that BBM was already gasping for breath.

While there are several other facts which one would want to consider, I think it would be sufficient to say that players like Nokia (which recently decided to sell out to Microsoft) and BB (taking losses, likely to cut approx. 5,000 jobs, considering a sellout, and recently announcing that it would offer BBM on Android phones) are feeling the heat or, as one can say, are dropping out of the race.

The next write-up will be about the popularity of apps like Skype, WhatsApp, WeChat, etc., and how these apps (like their predecessors) are likely to face competition from Google Hangout—the new kid on the block.

I wish all the readers the best of luck with the tax audit season.

Disclaimer: The purpose of this article is not to promote any particular site or person or software. Further comments about various products and services are based on the user experience related information available in the public domain. There is no intention to malign any product or service in any manner whatsoever. The sole intention is to create awareness and to bring into the limelight some thought-provoking content.

Google Hangout – I

About this write-up:
Mobile phones have pervaded almost every aspect of our life, be it in the personal space or in the work environment. This is true in so many ways. For instance, most people shudder at the very thought of what would happen if their mobile phone stopped working or was not with them, even for a single hour or day. There are several reasons for this, and mobile apps have made a sizeable contribution in this regard.

While there are several apps capable of a variety of functions, such as downloading, storing and sharing information, music and video, one of the most notable categories of apps that have really improved the user experience is instant messaging apps. These apps have changed the landscape of mobile telephony and messaging. Google Hangout is the latest entrant in this arena.

This write-up briefly describes some of the features/capabilities of this app and how it would be useful to the readers of this magazine.

Introduction:
Mobile phones have pervaded almost every aspect of our life, be it in the personal space or in the work environment. So much so that most people find it difficult to imagine what would happen if their mobile phone stopped working or was not with them, even for a single day. There are several reasons for this, and mobile apps have made a sizeable contribution in this regard. There are several apps capable of a variety of functions, such as downloading, storing and sharing information, music and video. However, one of the most notable categories among these apps, which has really improved the user experience, is the one related to instant messaging. These apps have changed the landscape of mobile telephony and messaging.

Instant messaging apps started off with a basic text option, gradually moved on to audio and have now finally started offering video options as well. This write-up briefly describes some of these apps and highlights the features of the latest entrant on the scene, i.e., Google Hangout.

Background:
Some of you may recall that just about a decade ago (around 2000–2003), the closest thing we had to instant messaging was ICQ chat, Yahoo Messenger or AOL Messenger. These were quite popular and hip. But when you think about it in hindsight… there was a catch… all of these applications were built for desktops/laptops. Ergo, these apps were instant only when you were in front of a PC. But that’s how technology was back then, and most people found it useful. As a matter of fact, there are still remnants of those days, i.e., Google Chat and Yahoo Messenger are still in use (I am not saying popular). In most cases, they have been merged with the email account.

At that time, mobile apps were non-existent. This was partly because owning a mobile phone was a luxury for many Indians; mobile technology was in its nascent stages and quite expensive. The closest thing available to instant messaging back then was the Short Messaging Service or, as it was popularly called, SMS. But those days were different. Back then, SMSes were either free or cost a pittance (at least compared to the cost of a voice call). But like all good things, like the telegram service and before that the pager service, SMSes too are fast becoming a redundant mode of communication. While this may seem abrupt to many, it isn’t so. Read on to know why.

The beginning of the end of text messaging:

One of the first nails in the coffin was put in by the BlackBerry Messenger service (“BBM”). Back in 2006, BlackBerry devices (“BB”) were a rage. Then, in around 2007-08, the BBM service was launched. The instant messaging landscape changed completely soon thereafter. By 2010, the popularity of BB and BBM had scaled new heights. And rightly so: after all, it was easy to use, instant and, most importantly, free of cost (i.e., not counting the cost of the BB and the data plan).

At that time, BBM had no competitors. There was a huge void between BB and all other devices (mainly Nokia, HTC, Sony and Motorola). BB was riding high. However, there was one downside (at least for the users): you needed to own a BlackBerry device. That itself was no small catch, given that each BB device would cost near about Rs. 18k plus, which was a major limitation.

Near about that time, Google Talk made its advent. While there were early adopters, reports in the public domain suggest that Google Talk didn’t really dent BBM’s hold on the market. There were several reasons for this, some of which could be listed as under:

• Available smart phones (not very smart, really speaking)

• Supporting operating system

• (most importantly) Availability of bandwidth (i.e. ability to access internet through the phone).

I know there was Wi-Fi, but come on… really… users would be able to access Wi-Fi only at limited places… is that really mobile?

Near about that time, a series of product/service launches was announced. Some of the notable ones were:

• Launch of the iPhone 3, 4 and 4S

• Use of 3G & 4G technology

• iTunes and the app market created around the iPhone ecosystem

• The QWERTY keyboard lost its de facto status as the standard interface to the touch-based interface (no stylus required, as in the case of the Palm and i-mate JAMin)

• Apple announced Siri – the revolutionary new voice-based interface.

While these changes happened over a period of 3-4 years, during this time BB slowly and steadily started losing its grip on the smart phone market. With it, BBM started losing its relevance as an instant messaging app.

iOS and Android ecosystem:
With the launch of the iPhone (iOS) and the Samsung S series (Android OS), there were two basic customer expectations: easy Internet connectivity and newer offerings in the form of apps and utilities. BB and Nokia had taken their positions for granted and failed to innovate. What they missed was capitalised upon by Apple and then by Samsung. Their phones and operating systems started behaving like hosts capable of doing a lot more than a simple phone, camera, music player, email or games offering. The phones offered far more interactivity and options to share.

Instant messaging:
Instant messaging was a part of the mobile telephone ecosystem from the early 2000s. It was a hit back then, mainly on account of the pricing differential and the convenience it offered. But as they say, time and tide wait for no one, and the only thing permanent is change. With newer technologies such as 3G, 4G, WiMAX, LTE, etc., users had the chance to use media with richer features/content, like images, short audio files and video – the type of files which in the past were not used because of the time taken to upload and download them. The need of the hour was the development of apps that would piggyback on the cheaper Internet technology (whilst avoiding the more expensive telephone option) and give users a similar (in many cases better) experience. In the initial phases, developers focussed on apps which would allow users to send SMSes via the Internet. While these did catch on, they didn’t really become mass products or a rage, as there were several limitations. Users were already habituated to using software like Skype and Google Talk for online chats (audio as well as video), and with developments like the iOS and Android ecosystems, stripped-down versions of these instant messaging software packages started entering the market.

Even these did not really achieve the lofty position of becoming the de facto standard (Skype did have a hold, but…). Part of the reason was that these software packages (not apps) were resource hungry and demanding. Add to this, they needed heavy bandwidth.

I did try using Skype on my i-mate JAMin (2006-09) but was terribly disappointed. I was forced to uninstall Skype after two attempts to use it (and several attempts to stop my phone OS from hanging).

What this meant for an ordinary user was that you needed not only a very high-end phone but also a robust operating system and a broadband network for effective usage (similar to a desktop environment). That’s when apps like WhatsApp, Viber, etc., entered the market. These apps were game changers.

My next write-up will carry more information on why these apps became game changers and the reasons for the same.

Until then…. cheers

Disclaimer: The purpose of this article is not to promote any particular site or person or software. Further, comments about various products and services are based on the user-experience-related information available in the public domain. There is no intention to malign any product or service in any manner whatsoever. The sole intention is to create awareness and to bring into the limelight some thought-provoking content.