
Anti Money Laundering for Securities – Interview of Mr. Jayesh Shah, MD & CEO, Prism Cybersoft Private Limited

Anti Money Laundering in Securities - Jayesh Shah - April 2015

 

What is Anti Money Laundering?

Money Laundering (ML) is the act of converting illegitimate, ill-gotten money, such as proceeds from smuggling, extortion, drugs and terrorism, into legitimate-looking money that evades suspicion and scrutiny. Money laundering typically involves three stages –

  1. Placement – introducing ill-gotten money into the banking/securities system
  2. Layering – merging ill-gotten money with legitimate money so that all of it looks legitimate, and
  3. Integration – integrating both legitimate and illegitimate money into the regular flow of money, making the whole stream look legitimate.

Anti Money Laundering (AML) describes the activities that financial institutions actively undertake to detect money laundering and prevent it.

AML in banks and financial institutions has been practised for a long time and enjoys very sophisticated detection and prevention practices. However, AML in the securities domain is fairly new and is still challenging.

 

Why is Anti Money Laundering Important?

AML is important to curb the financing of the illegal activities discussed earlier. Ill-gotten money must be checked and forfeited, and must not be allowed to enter the mainstream flow of money disguised as legitimate money.

Who lays down the guidelines for AML in India?

The legislative framework for AML was laid down by the Prevention of Money Laundering Act, 2002, after which substantial progress has been made in increasing awareness and the robustness of Anti Money Laundering guidelines. The Financial Action Task Force (FATF), an independent inter-governmental body, is responsible for setting standards in this area. In 2010, after a tough evaluation, India was admitted as the 34th member of the FATF. FEMA has spelt out clear AML obligations to be complied with. Both RBI and SEBI have issued the necessary guidelines on the monitoring mechanism and on institutions' obligations around Suspicious Transaction Reporting. It is now incumbent upon institutions and service providers to follow these guidelines in letter and spirit.

Whose responsibility is it to monitor for AML activities?

Every financial organization is expected to study its clients and their transactions and file a ‘Suspicious Transaction Report’ (STR) with the country’s Financial Intelligence Unit (FIU). The FIU studies the transactions reported and, if it finds anything suspicious, reports the transaction and the client to the country’s law enforcement agencies.

 

How does AML in securities stack up against AML in banking or finance?

AML in the securities industry is tricky because most jurisdictions and markets don’t accept cash for securities transactions, whereas cash is what is traditionally used in ML and terrorism-funding activities. ML in securities is also very lucrative for launderers because the securities markets not only help them launder money but also help them generate more of it. Systems and processes have to be robust and intelligent enough to capture such ML activities in the securities space.

What is the scope of AML in securities? Is insurance also covered?

Generally, AML in securities refers to AML in wholesale markets, wealth management, investment funds and processing, unregulated funds like hedge funds, bearer securities, bills of exchange etc. Depending on the jurisdiction, trading in securities is not limited to securities dealers and brokers but also touches upon banking and insurance. Insurance may thus be covered under securities, and AML guidelines cover it as well.

How does ML take place in Insurance?

Money launderers use insurance policies and the insurance industry to convert their black money or cash into legitimate money. Popular methods in less advanced countries include buying long-term single-premium plans using cash and then surrendering the policy within the free-look period. The insurance company refunds the premium in full by cheque, and when the money launderer deposits this cheque in his bank account, no suspicion is raised because the source of the money appears legitimate (it comes from an insurance company). Additional indicators could be a customer cancelling a policy and asking for the refund to be sent to a third party, a customer interested mainly in a product’s early surrender value, or a customer purchasing an insurance policy using instruments like traveller’s cheques, bearer cheques or cash.

What are the common patterns of laundering seen in securities industry that a market participant should be careful about? How can it be detected by use of technology?

Several common patterns of money laundering are seen in the securities markets, and AML strategies should be formulated accordingly.

Some patterns pertain to converting illegitimate money into legitimate money, and some pertain to generating more money.

For converting illegitimate money into legitimate money, launderers may use simple tricks like –

  • providing misleading information to intermediaries while opening accounts
  • making many small cash deposits and buying securities when the amount becomes large
  • using brokerage accounts to hold funds for the long term, and similarly using the broker’s pool account or the broker’s beneficiary account to hold shares for the long term
  • transactions where one party is seen to be deliberately taking a loss, thereby transferring money to the other
  • purchase of long-term investments followed by a sudden liquidation regardless of fees and penalties
  • a customer engaging in extremely complex transactions inconsistent with his profile
  • engaging in boiler room operations etc.

Sometimes money-laundering clients may bring cheques from another reputed financial institution to open an account or for transactions, making the intermediary lower its KYC standards because it is biased into believing that the originating financial institution has already conducted its own KYC investigation before issuing the cheque, which may not be the case.

Sophisticated tricks may involve selling deep in-the-money options at a throwaway price, or generally at inferior terms, allowing the counterparty to exercise the option and receive money (so that the payment looks legitimate).

Generating more money involves tricks such as manipulating low-priced securities, using shell companies for reverse mergers, insider trading and other kinds of fraud.

Technology plays a very crucial role in detecting ML activities. For example, all client lists must be scrubbed daily against the debarred-entities list; a client may be acceptable when registered but get blacklisted later, hence the daily check. The case of undervalued options can be checked by having a good options pricing tool in place and software validation that disallows in-the-money options being sold cheap or at no cost. Similarly, trading systems must continuously scan for clients whose trading positions and strategies are not consistent with their risk profiles. A sketch of both checks follows.
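The snippet below is a minimal illustrative sketch of the two checks just described: a daily scrub of the client master against a debarred-entities list, and a validation flagging in-the-money options sold well below their intrinsic value. All names, thresholds and data structures are assumptions for illustration, not any specific vendor’s API.

```python
from dataclasses import dataclass

def flag_debarred_clients(client_pans: set, debarred_pans: set) -> set:
    """Return clients that appear on the regulator's debarred list; meant to run daily."""
    return client_pans & debarred_pans

@dataclass
class OptionSellOrder:
    option_type: str   # "CALL" or "PUT"
    strike: float      # strike price of the option
    spot: float        # current price of the underlying
    premium: float     # price at which the option is being sold

def is_suspicious_option_sale(order: OptionSellOrder, tolerance: float = 0.90) -> bool:
    """Flag a sale where the premium is far below the option's intrinsic value."""
    if order.option_type == "CALL":
        intrinsic = max(order.spot - order.strike, 0.0)
    else:
        intrinsic = max(order.strike - order.spot, 0.0)
    # Selling an in-the-money option for much less than its intrinsic value effectively
    # gifts money to the counterparty, so such an order should be blocked or reviewed.
    return intrinsic > 0 and order.premium < tolerance * intrinsic

# Usage: a client on the debarred list, and a call with strike 100 on a stock at 150 sold for 5.
print(flag_debarred_clients({"AAAPA1111A", "BBBPB2222B"}, {"BBBPB2222B"}))
print(is_suspicious_option_sale(OptionSellOrder("CALL", 100.0, 150.0, 5.0)))  # True
```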

Do employees also play a role in money laundering? Should they also be monitored?

Certainly. Participants think AML is only about screening people such as Politically Exposed Persons or those linked to drugs and terrorism, and about suspicious transactions, but it is also about intermediaries, distribution channels, products, payment methods and, most importantly, employees of the organization who may be in collusion with the money-laundering client.

In fact, each organization must look for vital clues within the organization to check any such activity. Some indicators of employee involvement could be –

  • an employee reluctant to take leave
  • an employee whose lifestyle is lavish and inconsistent with his earnings
  • an employee whose job demands and targets are so intense that he compromises on KYC guidelines
  • an employee located in a different country from his supervisor
  • a management culture that rewards numbers more than compliance with requirements
  • employees bringing in supporting documentation for clients which is inadequate

The AML monitoring department must constantly draw up policies to monitor its employees against this list. Software products must also have the necessary checks in place. For example, it has been seen that employees delete transactions or amend the terms of transactions in ways that benefit clients undesirably. Software should not allow any deletions. Any changes that employees may want to make should be through the passing of journal entries or reversing journal entries, never by deleting, because auditors may not go through deleted transactions in the normal course of their work.
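Below is a minimal sketch, on assumed data structures, of the append-only approach described above: the journal exposes no update or delete operation, and a wrong posting is corrected by a reversing entry so the full trail remains visible to auditors. The class and field names are illustrative, not any particular back-office product’s schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass(frozen=True)
class JournalEntry:
    entry_id: int
    account: str
    amount: float                   # assumed convention: positive = debit, negative = credit
    narration: str
    posted_at: datetime
    reverses: Optional[int] = None  # id of the entry this one reverses, if any

class Journal:
    """Append-only journal: entries can be added, never edited or deleted."""

    def __init__(self) -> None:
        self._entries = []

    def post(self, account: str, amount: float, narration: str,
             reverses: Optional[int] = None) -> JournalEntry:
        entry = JournalEntry(len(self._entries) + 1, account, amount,
                             narration, datetime.now(), reverses)
        self._entries.append(entry)   # no update or delete method exists anywhere
        return entry

    def reverse(self, entry_id: int, narration: str) -> JournalEntry:
        """Correct a wrong posting by adding an equal and opposite entry."""
        original = self._entries[entry_id - 1]
        return self.post(original.account, -original.amount, narration, reverses=entry_id)

# Usage: a wrong credit of 10,000 to a client ledger is reversed, not deleted.
journal = Journal()
wrong = journal.post("CLIENT-001", -10_000.0, "Sale proceeds credited in error")
journal.reverse(wrong.entry_id, "Reversal of erroneous credit")
```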

What basic care should a brokerage take to make sure it doesn’t become an agent for money laundering?

Some basic steps are

  • strictly adhering to KYC guidelines
  • disallowing cash withdrawals and the use of cash for the purchase of securities
  • not allowing withdrawals or sale proceeds to be paid to a third party
  • not allowing changes to a financial product after the transaction that enable payments to be received from or paid to third parties

What are the main challenges that institutions and regulators are facing in AML compliance in the securities sector?

There are several challenges in this area, starting with inconsistency in the definition of ‘securities’ itself. Reporting requirements for securities have been introduced only very recently in many jurisdictions. Many institutions find it difficult to file STRs on time because transactions in the securities market are very fast paced. Some institutions may not understand the STR requirements themselves. Institutions should also train employees rigorously on AML and the implications of ML, because institutions that don’t comply will attract heavy penalties, suffer reputation loss and may even see imprisonment of their directors or those responsible.

 

Disasters Happen: Is your Business Continuity Plan Ready? – Interview with Mr. Jayesh Shah, MD & CEO, Prism Cybersoft Private Limited

Disasters Happen - Is your Business Continuity Plan Ready - Jayesh Shah - March 2015

What is BCP and Disaster Recovery?

Institutions like banks, brokerages, exchanges and depositories face continuity threats from incidents like fire, flooding, terrorism and other natural and man-made calamities. Business must go on even when such adversities strike; services need to be rendered and data needs to be protected. A Business Continuity Plan (BCP) and Disaster Recovery (DR) plan is any plan to continue operations if a place of business is affected by the disasters mentioned above. Such a plan spells out how the business will restart its operations, how quickly, and how it will recover its lost data or move all operations to another location. For example, if a fire destroys a building from which an institution was running its operations, how will the institution resume operations from somewhere else with minimal loss of time, effort and continuity? Businesses in developed countries have placed a lot of emphasis on BCP and DR in the post-9/11 scenario. The concept by itself is not new: in olden days, kings used to involve their teenage sons in the affairs of the kingdom so that if the king was killed untimely in battle, the son could take over without much loss to the kingdom.

What is the importance of Business Continuity Planning and Disaster Recovery?

Like many other businesses, the financial services business is sensitive. Enormous wealth is made or lost in seconds. If an institution is an intermediary, like a stock broker, it shoulders a large responsibility: transactions worth millions are done through it by its clients, such as investors and traders. Any disruption in its service could result in losses worth millions to its clients, and in turn to itself, if it has not planned to meet such events effectively.

Why should an institution plan for DR?

An institution such as an intermediary executes millions of transactions on a daily basis. Even a minute’s disruption in service due to an act of god or a man-made event like flood, fire or malicious software could lead to losses for itself and its clients. Apart from monetary loss, reputation loss and data loss could be fatal; once a client’s trust is lost, it is extremely difficult to regain. BCP and DR are fast becoming a must-have for critical businesses like financial institutions.

Is there any area that is ignored in a DR plan?

It’s a myth that a DR setup is very expensive. Technologies like virtualization and cloud make it very easy and cost effective to set up a DR site.

DR is like an emergency service. One prays that it never needs to be activated, but once it is triggered, it needs to work seamlessly, as expected.

The problem with most DR plans is that while a lot of planning and care goes into implementing them, virtually no effort goes into testing whether they run as expected. Lack of testing may disappoint when the DR facility is actually needed. Ideally, once a quarter, the institution should invoke the DR facility without giving any notice and work for one full day on that facility to ensure that operations can run on DR when invoked.

Also, care needs to be taken that the same disaster doesn’t hit both the primary and the DR sites. For example, having a DR site in Pune for operations in Mumbai is not a good idea because both Mumbai and Pune fall in the same seismic zone.

Another important point relates to the myths people hold about the sources of disruption. Institutions plan very carefully for natural disasters, yet disruptions due to natural disasters account for only about 3% of cases. More than 75% of outages are caused by hardware malfunctions, human error or corrupted software, including computer viruses.

Is there any guideline or regulation for brokerages around the need to have DR?

Yes. The regulator has laid down guidelines for BCP and DR and has provided subsequent guidance. However, most of the guidelines are for exchanges, depositories and clearing houses.

What is the difference between BCP and disaster recovery?

Most people think they are one and the same thing. However, DR is a part of Business Continuity Planning. BCP is a much larger plan that covers failures of systems, processes or people; infrastructure and system failure is only a part of it.

How does one need to plan for DR? Is a real time DR needed?

Normally, a Business Impact Analysis (BIA) is conducted, in which business processes are separated into critical and non-critical. For example, a brokerage must analyse very carefully all aspects of its transaction value chain. It must also analyse the criticality of each function and the business tolerance for each function if it were to go down. For example, an institutional client’s DMA business could be said to be extremely critical, with zero tolerance for downtime; the same could be said for dealing and real-time risk management. Back-office operations are important but not mission critical, in the sense that an hour of delay in the back office can be managed and won’t prove a show stopper. Once such a criticality map is drawn up, brokerages need to draw up a DR plan accordingly. Since real-time DR can be resource hungry, not all parts of the transaction value chain need real-time DR. Functions like the back office can go into delayed DR: available for use, though perhaps not instantly, with an hour or two of delay being tolerable. The sketch below illustrates such a criticality map.
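Below is a minimal illustrative sketch of the criticality map a BIA might produce. The function names, tolerance figures and the five-minute cut-off are assumptions chosen for illustration; a real brokerage would derive them from its own impact analysis.

```python
from dataclasses import dataclass

@dataclass
class FunctionCriticality:
    name: str
    max_tolerable_downtime_minutes: int   # business tolerance if this function goes down

    @property
    def dr_mode(self) -> str:
        # Functions with near-zero downtime tolerance need a real-time (hot) DR site;
        # the rest can live with a delayed (warm) DR site.
        return "real-time DR" if self.max_tolerable_downtime_minutes <= 5 else "delayed DR"

criticality_map = [
    FunctionCriticality("Institutional DMA", 0),
    FunctionCriticality("Dealing", 0),
    FunctionCriticality("Real-time risk management", 0),
    FunctionCriticality("Back-office settlement processing", 120),
]

for function in criticality_map:
    print(f"{function.name}: {function.dr_mode}")
```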

One broker I knew placed their servers in a third-party data centre which itself had a strong real-time DR facility. Without spending a single additional rupee, the brokerage moved to real-time DR.

To what level is this planning necessary?

The business impact analysis discussed above has to be detailed, and finally two critical numbers have to be arrived at: the Recovery Point Objective (RPO) and the Recovery Time Objective (RTO). RPO is the accepted latency of data that will not be recovered. For example, if there is a fire in the office and one has backups up to yesterday, today’s work will be lost; in many cases this will be acceptable. RTO is the defined acceptable time it should take to restore all functions. Suppose a critical function like trading is halted because of a server crash, and it takes 10 minutes for the backup server to start and for all trading to migrate to it; in that case, the RTO is said to be 10 minutes. Obviously, the lower the desired RPO and RTO, the more an institution will need to spend on redundancy. Current guidelines say that exchanges and depositories should have a Recovery Time Objective of 4 hours and a Recovery Point Objective of 30 minutes. However, most institutions will try to do better than this.
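The sketch below, using assumed timestamps purely for illustration, shows how these definitions translate into simple checks: the replication gap at the moment of the disaster is the RPO achieved, and the time until service is restored is the RTO achieved, each compared against a target.

```python
from datetime import datetime, timedelta

RPO_TARGET = timedelta(minutes=30)   # maximum tolerable data-loss window
RTO_TARGET = timedelta(hours=4)      # maximum tolerable time to restore service

def rpo_achieved(last_replicated_at: datetime, disaster_at: datetime) -> timedelta:
    """Data written after the last replication/backup is lost; that gap is the RPO achieved."""
    return disaster_at - last_replicated_at

def rto_achieved(disaster_at: datetime, service_restored_at: datetime) -> timedelta:
    """Time between the outage starting and full service being restored."""
    return service_restored_at - disaster_at

disaster = datetime(2015, 3, 2, 11, 0)
rpo = rpo_achieved(last_replicated_at=datetime(2015, 3, 2, 10, 50), disaster_at=disaster)
rto = rto_achieved(disaster_at=disaster, service_restored_at=datetime(2015, 3, 2, 11, 10))

print(f"RPO achieved: {rpo} (target {RPO_TARGET}) -> {'OK' if rpo <= RPO_TARGET else 'MISS'}")
print(f"RTO achieved: {rto} (target {RTO_TARGET}) -> {'OK' if rto <= RTO_TARGET else 'MISS'}")
```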

How fast can a DR facility be activated?

It actually depends on the overall business and technical architecture of the DR site. Activation could take anywhere from a few milliseconds to a few minutes, depending upon a variety of factors like hardware redundancy, bandwidth and the scalability of the DR site.

One popular exchange has its DR facility in Chennai. From time to time, it tests this facility by switching off the Mumbai facility during the live market. All traders are then shifted to the Chennai-based DR site for subsequent trading within milliseconds, and the shift is so seamless that traders don’t even come to know.

Fast activation naturally needs more money in terms of hardware and support.

What is the state of BCP and DR in Indian Capital Markets?

Readiness on BCP and DR today varies from one institution to another and typically depends upon the IT sophistication of the institution. However, one common theme that cuts across all institutions is that significantly more planning and investment is needed, especially in the bottom three quartiles of institutions.

Do you think Financial Institutions must step up their efforts in this area?

Yes, of course. Much better understanding and financial investment are needed. Institutions don’t lack the money to put such processes and infrastructure in place; most have the resources and do millions worth of transactions on a daily basis. What they lack is the IT awareness and expertise to put this in place.

Are there any people issues that one needs to keep in mind?

Yes. It is important to keep a couple of very high quality people at the BCP or DR site too, to take over operations when needed if the institution’s regular office staff fail to reach office. If done intelligently, a lot of cost optimization is also possible. For example, if a flood prevents Mumbai staff from reaching office, the institution can have a simple failover plan to start the DR server and a mechanism for critical people to connect their home PCs to it. Hardware redundancy has to be backed by a working plan, and there must be people to run operations along with the technology. If a couple of trained people are not there to manage the DR setup and the main office staff is stranded, the applications will start but there won’t be anyone to run them.

Is it all about Hardware Redundancy and Process Planning?

No. A detailed Threat and Risk Analysis needs to be conducted, in which the institution properly analyses every potential threat it faces, such as earthquake, fire, electricity outage, flood, and cyber attacks like viruses. Many threats are purely human and need solutions involving human beings rather than hardware. If the institution outsources some of its processes, it must ensure that the vendor doing the outsourced job also has a proper BCP and DR in place, else this may prove to be a weak link.

Institutions must realize that BCP and DR are no longer just a requirement; they are a necessity. They are like insurance for your business. An institution doesn’t realize the impact of not having them until disaster strikes.

Can a Good Back Office Software be a Competitive Advantage for Brokerages?- Interview of Samir Jayaswal, SVP & Head of Operations, Prism Cybersoft Private Limited

Can Good Back Office Software - Samir Jayaswal - Feb 2015

Software for managing the back office has advanced in the past two decades. It has become intelligent, highly integrated and much more sophisticated. So much so that our guest, Samir Jayaswal, argues that a good back office can really be a competitive advantage for brokerages. Here are some excerpts from our interview –

How has the role of back office software evolved?

Back office software applications have passed through many distinct phases. Two decades ago, it started with routing traders to the correct counters in a trading ring and recording transactions on the trading floor. A decade and a half ago, the role of back-office software was limited to book keeping and automating manual jobs like in-warding physical shares, checking them against the fake and forged database, handling objections, managing and reconciling bank entries, and receiving and delivering shares and funds towards settlements.

Then NSDL and CDSL started operations, and coupled with some smart changes by stock exchanges and clearing entities, the market was able to take some very rapid and bold moves towards becoming efficient. For example, the move to dematerialization was itself a huge success story, and the Indian capital market experience has become a showcase to the world. Over time, back offices became intelligent, settlement cycles shortened, and a whole new way of managing risk emerged through the handshake between the trading application and the back office. Technology is now at a stage where back offices are enabling brokerages to manage multiple lines of business like Mutual Funds, IPOs, Securities Lending and Borrowing etc., and the level of automation is very high, freeing up key resources to manage other value-added assignments.

How do you see this role evolving further?

The day is not far when back offices will be so evolved that there will be very little back office at the broker’s end! Brokerages are realizing that the processes they are running on huge applications at their premises, with hundreds of people managing them, are becoming redundant; these people could be engaged in much better areas internally. Processes themselves are improving through the collective efforts of regulators, buy-side/sell-side institutions and technology solution providers. Electronic Data Interchange in the form of Straight Through Processing has been enabled by process standardization and interoperability, and a real handshake between these participants is happening at an unprecedented level. Banking payments have been eased by the introduction of RTGS and NEFT. Our culture of being risk takers and early adopters has also paid off. The day is not far when brokerages will simply dictate Service Level Agreements and outsource their entire back-office operations to third-party service providers, subject to SEBI guidelines. I envisage that in future the government itself will encourage such outsourcing, given that the government already outsources some of its critical services, like passport issuance, to third-party agencies. I also envisage exchanges, clearing corporations and depositories taking some critical compliance and control roles away from brokerages and intermediaries onto themselves. For example, exchanges could start sending contract notes directly to end clients, and depositories could start sending statements of holdings and transactions to the DPs’ end clients. This will serve their interests better.

In future, brokerages will become lean and thin and will focus only on core activities like Client Acquisition, Marketing & Brand Building, and providing more sophisticated trading products, strategies and services to their clients.

So you are saying it is not only software vendors but the market as a whole that is advancing?

Absolutely yes, and this is where the opportunities are arising from. The capital market has seen a phase of rapid advancement and will continue to see this in future as well. Fifteen years ago, clearing houses took 7 days to process 600 crores of settlements; now they don’t blink at 70,000 crores a day. Massive capacity addition has taken place and the investment continues. Apart from the sell-side advancements mentioned earlier, government steps like maintaining details of debarred entities, validating PAN details, KRA, and uniquely identifying customers first through PAN and UCC and now through Aadhaar provide impetus to market development. This, coupled with advancements in technology like mobile applications, makes for exciting times ahead. Imagine being able to meet your margin and settlement obligations by clicking a few buttons on your mobile phone. It will be a paradigm shift.

How much is the role of technology vendors in the success of their clients?

I would say substantial. The client is virtually blind without adequate technology. My belief is that if the application vendor and the client are in harmony, a good back office application becomes a substantial and sustained competitive advantage. Because the market moves in cycles, if a brokerage rides on a good vendor in a market that supports it, this can really shape the brokerage’s fortunes. This is subject to the brokerage choosing a technology partner that is agile, adaptable to new technologies, listens to the client and understands market changes. If you specifically ask about us, we have been real partners to some of our clients, who have trusted us and expanded their reach and offerings on the basis of the software and services we have provided to them. The industry must believe in the adage that if your clients like you, they will talk to you, but if they trust you, they will do business with you.

What advancement do you see happening on the Back office application front in future?

Back-office software will become much smarter and leaner, and its functions will be hived off selectively to third-party service providers. The role of back office software will also increasingly transform from book keeping to assisting brokerages in more value-added areas like client servicing, dealing with intermediaries and helping open new lines of business. Also, the roles of the front office trading application and the back office will merge and become much fuzzier than they are today. A lot of functions like reporting, book keeping and accounting will either be outsourced or shifted to the cloud.

Do technology vendors compete with In-house IT teams of large brokerages? Some brokerages are known to have teams of 100+ developers developing applications for them.

The build vs buy debate is perhaps as old as the technology industry itself. No, technology vendors do not compete with brokerages’ in-house teams. Technology companies, especially product companies, invest substantially in Intellectual Property creation; for example, our products could typically reflect 200 man-years of investment in development. A typical brokerage house could also make this kind of investment, but there is no long-term value proposition in it, so it should concentrate on broking. Product companies invest a lot in skill upgradation and are generally very well aware of new technologies and how and where they can be leveraged. They also score better than in-house teams in areas like employee engagement, retention, software development processes, domain knowledge (by virtue of working with multiple global institutions) and overall lower costs. Brokerages will increasingly realize that it is best to buy from product vendors rather than develop in house; it is in their interest even if the initial cost is higher. Typically, the ROI initially looks better when they go ahead and develop in house, but over a larger time frame of, say, three to five years, the total cost of ownership turns out to be much higher than what they would otherwise have paid by buying from a product vendor. A typical vendor like us will also have partnerships with other technology companies like Microsoft, Oracle, VMWare, Seagate, Intel, AMD, Red Hat, Symantec etc., which in turn enables it to understand the latest technologies better.

 

How do you see investment of brokerages in technology going forward?

Indian brokerages need to explore and exploit technology more than they are doing currently, because software applications are still considered day-to-day work tools rather than business enablers. Of late, with the advent of automated trading, co-location and the like, brokerages are actively seeking better technology and are starting to see technology as a competitive advantage. Some brokerages embark on a customized development strategy; their contention is that if the entire market is using the same application, how could it be a strategic advantage for them? Such a decision proves tough in the long run when continuous application maintenance is required, whereas in the case of technology product vendors these costs get shared among multiple clients.

What are the other areas where substantial technology investment is expected in brokerages?

Rather than newer areas, brokerages will invest much more in existing areas. The same clients are being pursued by several brokerages, and only the most intelligent of them will manage to retain the clients and their attention. This means a lot of investment in intelligent CRMs and client analytics.

Automated trading will see a fresh round of investment to increase trading sophistication and so will newer trading initiatives like High Frequency Trading. Regulators will themselves become more sophisticated and will continue to demand more compliance, leading to escalated investment in better compliance products and anti money laundering applications. Finally, in these uncertain times, much more investment will be diverted to business continuity planning and disaster recovery.

Adoption of Cloud in Financial Services Industry – Interview of Mr. Jayesh Shah, MD & CEO, Prism Cybersoft

Adoption of Cloud in Financial Services Industry - Jayesh Shah - Jan 2015

There is so much of discussion around Cloud. What is the reality on ground?

The reality is that a lot of businesses have adopted the Cloud for various activities. In one survey, 82% of companies reported that they saved money by moving to the Cloud.

Initially, there was a wave in which non-critical functions of an organization, like attendance monitoring and project planning, were moved to the Cloud while critical functions remained self-managed. But now we see important and critical functions being moved too. We increasingly get queries from the financial community asking whether we have applications for trading, risk and settlements on the Cloud. I think it’s a fundamental shift in the thought process and in adoption.

Several platforms like Salesforce.com and Netflix have become market leaders by being on the Cloud. Internally, we also use Cloud-based software extensively for managing client interactions, our own engineers’ timesheets and so on, and we experience the benefits of the Cloud on a day-to-day basis.

 

What are the various categories of services offered on Cloud?

It all started with Software as a Service (SaaS), but now we see several extensions beyond software, like Platform as a Service from Google in the form of Google App Engine and from Microsoft in the form of Azure, which, by the way, even we are leveraging for an application of our own that we are putting on the Cloud. Then there is Infrastructure as a Service, like Amazon Web Services and the NYSE Euronext infrastructure cloud, and also a host of services for individuals. So what started as Software as a Service on the Cloud is rapidly extending into other areas, and customers’ acceptance of and willingness to subscribe to such services is also increasing dramatically.

How do you see Cloud penetrating Financial Services as a business?

The financial services industry is normally a late adopter of any new path-breaking technology like the Cloud. Participants here want complete clarity on the regulatory and service aspects, and continually assess the advantages versus the risks involved. But once they are convinced, they go all the way in embracing such technologies.

My estimate is that Indian financial services firms, especially the trading community we operate in, are yet to adopt the Cloud as a platform because vendors are yet to roll out large, organization-impacting platforms on the Cloud. However, we are confident that when such services are offered, they will be adopted rapidly, because these same firms have experienced the benefits of the Cloud in other areas.

As a company, do you have offerings on Cloud for Financial Services business?

Yes, we are rolling out a personal portfolio management application on the Cloud for individuals, because we feel individuals have experienced the benefits of the web through success stories like flipkart.com, irctc.co.in and several mailing applications. This solution is not the run-of-the-mill PMS offered by several financial services portals; it is a very comprehensive and automated application offered as SaaS which will help individuals right up to providing data for filing their tax returns.

There are some other areas which we will soon offer on the Cloud and which will certainly be industry firsts. However, at this point of time we can’t disclose them because of confidentiality.

What will be the challenges in hosting applications in Financial Services Industry in India?

Where there are opportunities, there are bound to be challenges. For example, there is not much clarity on whether a bank can sign a watertight agreement and start hosting its data exclusively outside India. Most bankers won’t do it, but can they host such data non-exclusively to avail value-added services like analytics? If yes, under what conditions?

Then there is this whole issue of trust. Will a trading house host their proprietary trading strategies and data with the same hosting solution provider who is also hosting for their competitors?

Third, there is the issue of legal enforcement. How fast can a financial institution get a remedy in case of a data security breach, or loss of data because the service provider decided to close down its service? Agreements could be made tight, but the capitalization of small service providers could itself be an issue.

What is the future of Cloud?

The future of course looks bright. It is estimated that by 2015, spending on the Cloud could be in excess of USD 180 billion. Innovation will happen on multiple fronts. From a financial services perspective, we will see dramatic changes in Big Data and Analytics. Each firm need no longer make a big investment in sophisticated analytics tools; there will be smaller, boutique firms specializing in ultra-sophisticated analytics tools which will provide services to banks and financial firms over the Cloud. This will be a big leveller, especially for small financial services firms. Suddenly a small bank can have the same analytics capabilities as more sophisticated counterparts like Citibank.

Then, we will see a dramatic change in how content is delivered and accessed. The Amazon Web Services content delivery network is again a great leveller for smaller financial services portals, enabling them to distribute content in a low-latency environment just like larger portals.

Thirdly, it will also change the manner in which applications and data currently on the Cloud are accessed. Currently, clients are dependent on their vendors to build APIs to access these applications.

Once a lot of applications and services migrate to the Cloud, which is already happening by the way, we will see a lot of innovation around how these applications and their data are accessed. For example, the Railways could build standard APIs and open up access to data from their irctc.co.in platform to private operators to analyze and build value-added services.

The growth of Cloud will also happen because Mobile Apps are growing rapidly. Their growth will feed each other.

I also expect a lot of E-Governance Applications to be put on Cloud by the existing government to bring about service delivery and transparency over the web. In a decade, I expect governments all over the world to be the biggest consumers and also providers of Software as a Service over the Cloud.

Personally do you use Cloud on a day to day basis?

Yes, extensively. I have personally been one of the earliest adopters of Cloud and have experienced its benefits.

I use several apps that are provided over the Cloud. Very recently I signed up with a Cloud storage based company to keep a back up of all my personal documents.

Importance of Business Intelligence, Pricing Optimization & Demand Analytics for Financial Services Firms – Interview of Samir Jayaswal, SVP & Head of Operations, Prism Cybersoft


Please throw some light on business intelligence.

Business intelligence (BI) is a broad term describing the tools and techniques used to transform raw business data into meaningful business insights. BI tools are capable of sifting through and reading large amounts of data and producing meaningful information from it. Good business intelligence can generate several business opportunities and insights which can be used for better pricing, better customer engagement, better customer retention and, eventually, better profitability.

What is Price Optimization?

Price optimization is one example of business intelligence. It is the science of predicting the price sensitivity for a product or service so that customers buy at that price and the firm meets its profitability or other defined objectives. Differential pricing for different segments of customers is a very common strategy adopted by financial services firms in order to attract or retain them. However, a firm cannot just lower its price in an effort to sign up more customers, because lower prices will lower profitability. Similarly, raising prices to increase profit doesn’t help either, because higher prices mean fewer customers will sign up. Thus, for every segment of customers there exists an optimal price at which the firm gets enough customers as well as profitability. Finding this right price is price optimization.

Please illustrate with an example.

Let us take the example of a home loan. If a bank prices its loan higher than the competition, it will attract fewer clients, impacting its business volumes and in turn its profitability. When it prices the loan low, it becomes attractive for customers, and while they may sign up in large numbers, the profitability for the bank may be low because the margin is thinner. The bank needs to strike a balance and arrive at the right price for optimum profitability. Another example where price elasticity of demand plays a crucial role is when two customers start negotiating with the bank for better rates. Assume the bank has offered both customers a loan at 9.5% per annum. How does the bank know how much discount to offer each of these customers so that they sign up with the bank? Most salespeople would offer the maximum allowable discount and sign up the customer; star salespeople, however, manage to offer the lowest discounts and still sign up customers. If the bank could scientifically predict this rate and prompt the salesperson to offer the rate at which the customer’s propensity to sign up is highest, every salesperson would become a star salesperson. A few basis points saved could also mean huge money for the bank. The sketch below illustrates the basic trade-off.
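The following is a minimal sketch of the trade-off just described, using an assumed linear demand curve and purely illustrative numbers: price too high and few customers sign up, price too low and the margin vanishes, so the optimizer searches for the rate that maximizes expected profit.

```python
def expected_signups(rate_pct: float) -> float:
    """Assumed demand curve: fewer home-loan customers sign up as the rate rises."""
    return max(0.0, 10_000 - 6_000 * (rate_pct - 9.0))   # illustrative only

def expected_profit(rate_pct: float, cost_of_funds_pct: float = 8.5,
                    avg_loan_amount: float = 2_000_000) -> float:
    """Margin per customer times the number of customers expected at this rate."""
    margin = (rate_pct - cost_of_funds_pct) / 100.0
    return expected_signups(rate_pct) * avg_loan_amount * margin

# Grid-search candidate rates between 8.75% and 10.50% and pick the most profitable one.
candidates = [8.75 + 0.05 * i for i in range(36)]
best_rate = max(candidates, key=expected_profit)
print(f"Optimal rate = {best_rate:.2f}%, expected profit = {expected_profit(best_rate):,.0f}")
```

In practice the demand curve itself is estimated from historical quote and acceptance data rather than assumed, and the optimization runs separately for each customer segment and under the constraints discussed below.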

Is it always price vs profits?

Price versus profit is the usual organizational goal. However, organizations work under several constraints, for example competition, regulatory diktat, and the firm’s internal beliefs and policies. The process of price optimization keeps these constraints in mind while arriving at the right prices. Obviously, the more constraints there are, the more difficult it is to find the optimal price.

What are the benefits for the financial institution?

Without this kind of demand analytics at different price points, financial institutions are blind while pricing their products. Take the example of financial institutions running campaigns during festival seasons. How does an institution know for sure whether, by giving a 10 basis point discount, it will be more profitable because of the increase in business or will make a loss because of the lower margins? Demand analytics produces insights that can deliver increased profitability, increased volumes, an increased number of customers and better customer retention. In fact, such analytics can help management spell out their key goals and formulate strategy so that they can achieve these goals.

We need to make sure that customers who are price sensitive, and hence elastic, get rates which make them sign up with the institution rather than go to the competition, and that customers who are price inelastic pay in a way where the institution receives maximum value. Price-inelastic customers normally subsidize price-elastic customers, while price-elastic customers bring in additional volumes. Such additional business and margin can improve the profitability of the business to a great extent. This fine-tuning of rates is called ‘price optimization’. How much additional profit can such a strategy bring in? That depends on a variety of factors; typically, it could be between 5 and 20 basis points. This means that if the institution is operating on a profit margin of 1%, it can improve its bottom line by 5% to 20% directly, just by optimizing prices, as the quick arithmetic below shows. Imagine how much effort the institution would otherwise have to put in, and how many more on-the-ground sales staff it would need to deploy, to achieve this increased profitability without price optimization.
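A quick arithmetic check of that claim, with an assumed revenue figure purely for illustration: on a 1% margin, an extra 5 to 20 basis points of margin is a 5% to 20% improvement in profit with volumes unchanged.

```python
revenue = 100_000_000    # assumed annual revenue, for illustration only
base_margin = 0.01       # 1% profit margin
base_profit = revenue * base_margin

for uplift_bps in (5, 20):
    new_profit = revenue * (base_margin + uplift_bps / 10_000)
    improvement = (new_profit - base_profit) / base_profit
    print(f"{uplift_bps} bps uplift -> profit {new_profit:,.0f} ({improvement:.0%} higher)")
```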

Is this a new science?

The science of price optimization is about a decade old, but financial institutions are increasingly discovering newer areas for its application. The earliest adopters were non-life insurance companies, then came banks, and now NBFCs and brokerages are also running pilots to implement the concept. It can be applied in different areas like retail, channel optimization, brokerage and commission optimization etc.

What is the future of demand analytics?

The future of demand analytics is very promising. As competition increases, financial institutions are under more and more pressure to come up with innovative mechanisms for increasing their profitability. Adding branches and salespeople is something every institution does; this may improve volumes but not necessarily profitability. They need to do something additional that is innovative and scientific. Profitability is directly linked to margins, which in turn are linked to pricing; hence pricing optimization is important. Institutions need to study demand analytics for various segments of customers at different price points. Currently, institutions don’t record what happens when a customer receives a price. To understand demand analytics, however, institutions must record what happened at that price: did the customer accept it, reject it immediately, or bargain? If they bargained, how many times? This data can be put to good downstream use to calculate the price sensitivity of customers and the price elasticity of demand. The sketch below shows the kind of record involved.
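Below is a minimal sketch, with assumed field names chosen purely for illustration, of the quote-event record suggested above: every price shown to a customer is logged along with what happened, so that price sensitivity and elasticity can later be estimated from the history.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class QuoteEvent:
    customer_id: str
    product: str              # e.g. "HOME_LOAN"
    quoted_rate_pct: float    # rate shown to the customer
    outcome: str              # "ACCEPTED", "REJECTED" or "BARGAINED"
    bargain_round: int        # 0 for the first quote, 1+ for each renegotiation
    quoted_at: datetime

quote_log = [
    QuoteEvent("C-1001", "HOME_LOAN", 9.50, "BARGAINED", 0, datetime(2015, 1, 5, 11, 0)),
    QuoteEvent("C-1001", "HOME_LOAN", 9.35, "ACCEPTED", 1, datetime(2015, 1, 6, 16, 30)),
    QuoteEvent("C-1002", "HOME_LOAN", 9.50, "REJECTED", 0, datetime(2015, 1, 7, 10, 15)),
]

# Downstream, acceptance rates at each quoted rate feed the demand and elasticity models.
accepted = sum(1 for q in quote_log if q.outcome == "ACCEPTED")
print(f"Acceptance rate in this sample: {accepted / len(quote_log):.0%}")
```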

What is price testing?

Before rolling out the new optimized prices, the institution would like to check how these prices are received by clients. Typically, a small sample of customers is chosen, or perhaps all customers in a particular city, and the new prices are rolled out to them to gauge their reaction. If they respond as expected, the new prices are rolled out to all clients; otherwise, the business intelligence analysts go back to the lab to fine tune their models and prices.

What about segmentation?

Market segmentation is a marketing strategy in which customers are grouped into subsets when they are perceived to have similar properties and buying behaviour. Institutions currently segment their customers intuitively; however, more scientific methods need to be applied. Institutions segment customers to make marketing easier. However, when detailed behavioural analysis techniques are applied, every customer can be treated as a segment and specifically catered to, dramatically increasing his stickiness with the institution. That is the power of demand analytics.

What if the market is very competitive?

Price optimization as a science, and demand analytics combined with price optimization, are meant for highly competitive marketplaces. In monopolistic and oligopolistic markets, customers would anyway take whatever rate you offer them. In hyper-competitive markets, it is very important to understand who your price-elastic customers are and to retain them. Such business intelligence helps you bring in additional volumes by attracting and retaining these price-elastic customers while improving profitability from price-inelastic customers. It will also help you retain profitable customers and let go of customers who cause you losses.

How does demand analytics and price optimization help in customer retention?

Once a client signs up and, some time down the line, is not happy with the service or the prices, he will start taking quotes from competing institutions. Since he is a ready customer, competing institutions will be more than happy to oblige him with better rates. In such a scenario, the institution normally responds with a retention offer in order to retain the client. Since this is the only chance the institution has, it must revert with an intelligent rate at which the propensity of the client to stay back is very high. Building intelligence into such rates is a function of how well the institution understands and implements business intelligence tools such as demand analytics and price optimization. Typically, after such implementations, an institution can expect to retain about 10% more clients than it was retaining earlier. This is a good tool to retain clients who are leaving only because of price.

What is the kind of technology that goes into such analytics?

Demand analytics is more of a business problem than a technical one. From a technology standpoint, the analytics tool needs to be comfortable working with and mining large amounts of data, and it should be capable of statistical analysis of that data. The trick lies in defining the problem correctly, building the correct statistical models and interpreting the results from those models well. A simple sketch of one such model follows.
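As one concrete illustration of such a statistical model, the sketch below fits a logistic regression that estimates a customer’s propensity to accept a quoted rate. The data is synthetic and the single-feature setup is an assumption for brevity; a real implementation would train on a history of quote events (like the log sketched earlier) with many more features, and this sketch presumes numpy and scikit-learn are available.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Synthetic training data: one feature (quoted rate in %), label = accepted (1) or not (0).
rates = rng.uniform(8.75, 10.5, size=500)
accept_prob = 1 / (1 + np.exp(3.0 * (rates - 9.5)))        # assumed "true" behaviour
accepted = (rng.uniform(size=500) < accept_prob).astype(int)

model = LogisticRegression()
model.fit(rates.reshape(-1, 1), accepted)

# Score candidate retention offers: the estimated propensity to accept at each quoted rate.
for rate in (9.25, 9.50, 9.75):
    propensity = model.predict_proba([[rate]])[0, 1]
    print(f"Quoted rate {rate:.2f}% -> estimated acceptance propensity {propensity:.0%}")
```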