Monday, June 4, 2018

Risky Data 3 – Planning & Strategy as GDPR goes live!

By Bill Moran and Rich Ptak

GDPR has gone into effect, unleashing a flood of commentary, proposed solutions, and advice, some of it very good, some not so good, and some outright bad. It is time to discuss planning and strategy for enterprises moving forward.
Image courtesy of European Commission 
As stated earlier, the status of GDPR in non-EU jurisdictions is unclear. Still, it is important to understand it and consider its potential to impact your operations. This is vital because GDPR-type regulations are highly likely to proliferate. There have been too many violations of people’s private data for the current laissez-faire approach to handling personal data to continue. It is doubtful this will happen tomorrow. It is, however, unrealistic to think it will never happen, or that it will occur only in some distant, vague future.

In fact, many companies are already taking action as they anticipate some version of GDPR being at least partially enacted, if not fully imposed, in developed countries including the US. These may initially be presented as recommendations before taking the form of federal regulations or laws, state laws, or some mixture of the two. Current actions include limiting or even completely eliminating EU consumers’ access to services, publications or products.

In any case, hacker-driven incidents will continue. The risks, full costs and fines of Facebook-type occurrences are far from settled, and similar infractions are distinctly possible. Consumers and organized consumer interest groups can be expected to drive regulatory action by pressuring governments “to do something”.

How to Prepare
The question is: what should a small or medium enterprise (SME) without a physical presence in the EU do? The first step is to determine which, if any, enterprise activities will potentially be affected by a GDPR-like regulation. Then, develop a strategy. Here we discuss general steps that all enterprises should take. Our next report will offer more specifics. None of the following should be considered to be, or a substitute for, professional legal advice. It is intended for guidance and information purposes only.

Enterprises need to examine their internal processes to consider how they could be changed or improved to align with GDPR principles. In some cases, this will mean incurring additional significant costs. Therefore, management oversight is critical. Pro-active activities are prudent. Waiting until there is external compulsion usually results in ballooning costs. Planning for necessary changes in advance means work can be done in a non-crisis, phased mode.

Initial action - Security
The first area to address is security. Given the level of criminal attacks, it is common sense to have ongoing efforts in this area. Evidence indicates that most companies have failed to take the threat of criminal hacking seriously enough. Virtually any company would be damaged and thrown into management turmoil if hackers penetrated its systems. Critical payroll data, personal data, and sensitive customer information are all at risk. Consider what happened to Sony when hackers penetrated their email system. That attack was attributed to North Korean hackers, but the results could have been worse had criminal hackers been involved. The North Koreans’ apparent incentive was to disclose email contents to embarrass and punish Sony for making a movie that mocked their leader. Criminal hackers would not necessarily disclose the penetration. Instead, they could monetize the information for identity theft or other costly criminal purposes.

The prudent course is to begin with a security audit. In some cases, involving an outside consultant will be necessary. In many cases, however, it can be performed by internal auditors at relatively low cost. For example, investigations reveal that many systems operate with default ids and passwords; critical systems installed years ago with these exposures were never corrected. Such security risks can be uncovered and fixed without expensive auditors by using someone with authorized access to the system. Another common problem occurs when the ids and accounts of ex-employees are not deleted. Numerous other such security violations are well documented. The point is to review and assure that proper policies have been implemented to fix such problems and prevent their recurrence.
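The two checks just described lend themselves to simple automation. The sketch below is purely illustrative, assuming a hypothetical account inventory and HR roster; real audits would pull this data from directory services and HR systems.

```python
# Hypothetical internal account audit: flag default credentials and accounts
# whose owners have left the company. All records below are invented examples.
DEFAULT_CREDENTIALS = {("admin", "admin"), ("root", "changeme"), ("guest", "guest")}

accounts = [
    {"id": "admin", "password": "admin", "owner": "system"},
    {"id": "jdoe", "password": "s3cure!", "owner": "jdoe"},
    {"id": "asmith", "password": "x9!kQ2", "owner": "asmith"},
]
current_employees = {"jdoe"}  # asmith has left the company

def audit(accounts, employees):
    """Return (account id, issue) pairs for the two common exposures."""
    findings = []
    for acct in accounts:
        if (acct["id"], acct["password"]) in DEFAULT_CREDENTIALS:
            findings.append((acct["id"], "default credentials still in place"))
        if acct["owner"] != "system" and acct["owner"] not in employees:
            findings.append((acct["id"], "owner no longer employed"))
    return findings

for acct_id, issue in audit(accounts, current_employees):
    print(f"{acct_id}: {issue}")
```

Even a modest script like this, run regularly by someone with authorized access, can surface the exposures the text describes without engaging expensive outside auditors.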

There is another class of problems that demands more work to detect and fix. For instance, handy tools installed by IT to make their jobs easier might be applied to a criminal purpose in the hands of a hacker. Policies must be developed to avoid this situation. Sometimes the solution is simple, i.e., removing tools from the system when not in use. In other cases, the tool might be critical for production; then ongoing code audits may be needed to verify the tool is doing only what it should. Anytime new software is installed on a system, it should be verified and checked to avoid introducing rogue code or viruses.

In the Equifax penetration, improperly maintained open source software caused the problem. A maintenance audit can uncover this kind of exposure. Instituting a rigorously enforced policy of careful maintenance for operating system, open source, and all vendor-supplied software will help avoid the problem. When a vendor announces a flaw in their system (with or without a fix), one can guarantee that hackers are aware of the situation and will begin probing for systems without the fix installed.
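The core of such a maintenance audit is comparing what is installed against what vendors have announced as fixed. Here is a minimal sketch of that comparison; the component names, versions, and advisory list are invented for illustration, not drawn from any real advisory feed.

```python
# Hypothetical maintenance audit: flag installed components running a version
# older than the first fixed release named in a vendor advisory.
ADVISORIES = {
    # component: first fixed version, as a tuple for simple ordering
    "struts": (2, 3, 32),
    "openssl": (1, 0, 2),
}

installed = {"struts": (2, 3, 5), "openssl": (1, 0, 2)}

def unpatched(installed, advisories):
    """Return components whose installed version predates the fix."""
    return [name for name, ver in installed.items()
            if name in advisories and ver < advisories[name]]

print(unpatched(installed, ADVISORIES))  # struts 2.3.5 predates the 2.3.32 fix
```

A real implementation would pull the installed inventory from package managers and the advisory data from vendor feeds, but the principle — audit continuously, not once — is the same.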

Despite taking all reasonable precautions, an installation might still be penetrated. Studies have shown that companies are very slow to detect such events. Reasonable ways to improve this response exist and should be standard practice. Clearly, once a penetration is detected, corrective action should be taken immediately to limit the damage.

If rapid detection is not feasible, full or partial encryption of data can be an alternative safeguard. The cost and overhead associated with encryption have dropped dramatically in recent years. It may not always be practical or financially feasible, but it is worth investigating. As an aside, IBM provides pervasive encryption on mainframe Linux systems. Encryption options need to be evaluated in other environments.

In summary, most IT installations need to tighten their security. GDPR imposes rather severe penalties for disclosing confidential and personal information. It is good practice to take practical steps now. Let’s look at another area of enterprise risk that is not as obvious but needs attention: personal data.

Personal data protection 
GDPR privacy legislation intends to give citizens ownership and control of their personal data. This includes: 1) knowledge of what personal data is in a system, 2) the ability to correct any errors, 3) the ability to remove data, 4) notification when a data breach occurs and of what was exposed, and 5) the ability to review data stored in the past upon request. Such past data might be important in tax, criminal or judicial matters, or in contract disputes. Consideration has to be given to how the data is protected, stored, and for how long it must be retained. All are a normal part of data storage and archival. GDPR sets restrictive requirements on how quickly data must be made available and notifications sent. And the penalties for non-compliance are high.
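The first three rights in that list — access, rectification, and erasure — map naturally onto operations a data store must support. The sketch below is a deliberately simplified, hypothetical registry showing the shape of those operations; real systems must also handle backups, shared copies, and retention rules.

```python
# Minimal, hypothetical sketch of the data-subject rights described above.
# Subject ids and field names are illustrative only.
class PersonalDataRegistry:
    def __init__(self):
        self._records = {}  # subject id -> dict of personal data

    def store(self, subject_id, data):
        self._records[subject_id] = dict(data)

    def access(self, subject_id):
        """Right of access: return everything held on the subject."""
        return dict(self._records.get(subject_id, {}))

    def rectify(self, subject_id, field, value):
        """Right to rectification: correct an erroneous field."""
        self._records[subject_id][field] = value

    def erase(self, subject_id):
        """Right to erasure: remove the subject's data entirely."""
        self._records.pop(subject_id, None)

reg = PersonalDataRegistry()
reg.store("subject-1", {"email": "a@example.com", "city": "Pariss"})
reg.rectify("subject-1", "city", "Paris")
print(reg.access("subject-1"))
reg.erase("subject-1")
print(reg.access("subject-1"))  # {} — nothing remains after erasure
```

The hard part in practice is not these operations themselves but guaranteeing they reach every copy of the data — including the backups discussed next — within GDPR’s deadlines.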

This raises the question of what happens when data retrieval from the current system isn’t possible, for instance because the data has been corrupted in some way. System backups will then need to be accessed. For historical data, the storage medium is typically tape.

Here is a cautionary tale from real life. Several years ago, a colleague of ours started a company to update backup tapes, converting old backup tapes to CD or DVD format. The processed tapes came from a variety of companies and government agencies. He found that about 25% of the tapes were bad: there were spots on the old, open-reel tapes that were unreadable.

Unfortunately, the situation was actually somewhat worse. His process could only detect unreadable spots. In addition, there were readable records that were still wrong because they had been corrupted.

This story demonstrates the need to examine the process for controlling backups. That should not surprise anyone; most of us have had the experience of trying to use a PC backup only to discover that it does not work. Failing to test a backup process means its failure is revealed when it is most damaging. Most organizations have a backup process that periodically ships tapes offsite, then forgets them. GDPR-type regulations make it wise to take steps to verify that backups work and contain valid information. Addressing these issues will improve current operations while preparing for their critical role when some form of GDPR arrives.
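Note that the colleague’s second problem — readable-but-corrupted records — is exactly what checksums catch and raw read tests miss. A minimal sketch of that verification idea, with illustrative file names and an in-memory "backup":

```python
# Sketch of routine backup verification: record a checksum when the backup is
# made, then re-verify before the backup is ever needed. Paths are illustrative.
import hashlib
import tempfile
from pathlib import Path

def sha256_of(path):
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

with tempfile.TemporaryDirectory() as tmp:
    original = Path(tmp) / "payroll.dat"
    backup = Path(tmp) / "payroll.bak"
    original.write_bytes(b"critical payroll records")

    # At backup time: copy the data and record its checksum.
    backup.write_bytes(original.read_bytes())
    recorded = sha256_of(backup)

    # Later, on a scheduled verification pass: re-read and compare.
    print("backup intact:", sha256_of(backup) == recorded)  # True

    # Simulate silent media corruption: the file still reads, but is wrong.
    backup.write_bytes(b"critical payroll recorXs")
    print("backup intact:", sha256_of(backup) == recorded)  # False
```

Periodically running such a pass over archived media converts "the backup exists" into "the backup is provably the data we stored", which is what a GDPR-style retrieval obligation actually requires.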

We recognize neither of these issues were covered in great detail. Our goal has been to make the point that these and other areas need to be carefully examined along with privacy policies, data movement, network issues etc. There is a great deal of work to do here.

The likely arrival of GDPR-like regulations ought to make companies review and reconsider their policies for acquiring, storing, using and protecting customer data. All of these will be impacted by such regulations. It is foolish to wait for regulations that force mandatory change within a limited time period. Such a delay will likely raise the costs of review and remediation, as well as risk costly fines for missing deadlines if a breach is experienced. Of course, some flexibility is needed, since the exact details of such regulation are not currently known.

Many vendors, including Compuware, IBM, Microsoft, HPE, BMC and others, are offering services and solutions (partial or comprehensive) that include process review, definition, evaluation and planning services. Most recognize the need for implementation flexibility and openness to allow for advances in technology and regulatory changes. Be sure to verify this if you decide to employ a partner in your effort. Whatever you do, remember that regulatory details will change and you must be able to adapt.

By starting today, enterprises will have adequate time to study the issue and determine the best way forward. Finally, we are convinced there is no reasonable excuse to delay, or to wait for regulations, before taking steps to strengthen existing security. For most, there is much work to do. The best thing is to get started now.

In the next edition of this report, we will discuss specific steps that companies without a physical presence in the European Union need to take to steer clear of entanglement in the GDPR web.

Publication Date: June 4, 2018
This document is subject to copyright.  No part of this publication may be reproduced by any method whatsoever without the prior written consent of Ptak Associates LLC.  

To obtain reprint rights contact 

All trademarks are the property of their respective owners.

While every care has been taken during the preparation of this document to ensure accurate information, the publishers cannot accept responsibility for any errors or omissions.  Hyperlinks included in this paper were available at publication time.

Monday, May 21, 2018

Risky Data 2: GDPR outside the EU

Image courtesy of the European Commission
This is the second in our series examining the impact of GDPR outside the European Union. GDPR (the General Data Protection Regulation) is the new privacy law enacted by the European Union that becomes effective May 25, 2018.

The law attempts to enforce an individual’s ownership rights of their personal data. It includes provisions to protect the use of any individual’s data that is collected by an enterprise and/or shared with business partners, etc.

It includes significant control over and restrictions on what can be done with such data without specific permission of the owner. In addition, because of the risk of exposure of private data by ‘bad actors’, it imposes very tight deadlines on reporting exposure of such data, along with severe penalties for violating GDPR provisions. 

As a result, the details of the act become very important. As with any very large, broadly targeted, comprehensive law created by a large bureaucracy, there are certain to be unintended consequences alongside the intended ones. GDPR covers procedures for obtaining permissions for data use. There are deadlines set for reporting data theft, data breaches, loss of control, etc. In this piece, we examine some of those details and the risks they entail.

Areas of Uncertainty

Given the size of the task, it is not surprising that many areas of the GDPR requirements are unclear, lacking in detail, or remain undecided. For example, there is no clear explanation of how the regulations will function in practice. Also lacking are any hints of what operational changes will have to be implemented during the first several years as the regulation takes effect. It is normal to provide some timeline and specifics to help guide and facilitate implementation efforts.

As an example: in many enterprises, while management and audit groups set policy, it is IT operations that has direct responsibility for the implementation details and activities involved in data collection, storage and management. Therefore, GDPR-related implementation will have a profound effect on IT operations. Operations managers should be aware of the areas of concern.

As mentioned in the first article, explicit permission is required for the collection and use of data. For minors, either a parent or a legal guardian must consent. That requirement alone can pose severe implementation problems, both practical and legal. What will be the process for contacting parents for consent? If you rely on the child to involve the parents, will they tell the truth? Will they identify someone else who they know will give permission? What restrictions exist on the data that can be requested?

Another area of concern is the GDPR-set deadlines for reporting and responding to violations. For example, the time limits for responding to queries for access to personal data, or for alerting and acting on data breaches, appear unrealistic[1]. They will have to be adjusted as companies fail to meet them. We know from experience that planners seldom anticipate the full consequences of their dictates, nor are they good at estimating the cost and time required to comply. Only experience reveals the unintended results. It is reasonable to assume many of the GDPR provisions will be revised, radically altered, or even eliminated as actual experience applying the rules accumulates.

However, it is not clear how significantly, nor how quickly, any such adjustments will be made. Nor is there any guarantee of how infractions will be treated in the interim.

Each country within the EU will have its own GDPR authority. This raises a host of questions. Germany, for example, has historically been the strictest enforcer and protector of data privacy, applying restrictions and punishing violations much more vigorously than other countries. We don’t expect any change in its position.

Additionally, will large companies be able to shop around the EU to identify the country with the laxest enforcement policies? This is exactly what happened with corporate tax legislation and enforcement. Companies arranged business accounting, manufacturing and delivery processes to minimize tax liabilities; by implementing complex transaction processes, they were able to greatly reduce the taxes paid. As would be expected, enterprises carefully considered each country’s taxation policies when making large-scale investment and job-creation decisions. Will it be possible to do the same with GDPR?

The way actual fines will be determined is not specified. Will the countries differ in calculation formulas? For example, how would the fine be calculated if the data on 500 people is stolen? Does that count as one infringement, or 500?
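While the per-infringement question is indeed unspecified, GDPR does fix an upper bound for the most serious violations: EUR 20 million or 4% of worldwide annual turnover, whichever is higher (Article 83). A simple sketch of that cap, with illustrative turnover figures, shows how differently it bites depending on company size:

```python
# GDPR Article 83 upper bound for the most serious violations:
# EUR 20 million or 4% of worldwide annual turnover, whichever is higher.
# Turnover figures below are illustrative only.
def max_fine_eur(annual_turnover_eur):
    return max(20_000_000, 0.04 * annual_turnover_eur)

print(max_fine_eur(100_000_000))    # smaller firm: the flat EUR 20M cap dominates
print(max_fine_eur(2_000_000_000))  # large firm: 4% of turnover -> EUR 80M
```

How regulators will count infringements within that cap — one fine per breach, or one per affected person — remains exactly the open question the text raises.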

Strict reading of GDPR means that American companies, including those that have no physical presence in the EU could be subject to the EU’s worldwide scope if they have personal data on any EU citizen or resident in their system. Presumably, the EU would need the local courts to agree to enforce penalties on these companies. How will that work? Will enterprises have to wait for such a case to reach the US Supreme Court to find out the answer? Or, will it become an issue in trade negotiations? To date, there have been no public announcements, or, as far as we know, no discussions. Nevertheless, it is reasonable to assume the US will enter any such negotiation with its own interests in mind. 

What will be the effect of Brexit on GDPR? Since the UK is leaving the EU, it would seem that the EU mechanism for enforcing GDPR will not apply to the UK. Will the UK decide to make GDPR part of its own law? If so, will the UK make changes in the version it adopts? If it does, how will it differ? In scope? In fines? In restrictions? Will UK enforcement be similar to or radically different from enforcement in the EU? Presumably, at least some of these questions will be answered as the UK prepares its exit from the EU.

Finally, GDPR may be tied up in the European courts for some undetermined period as soon as some of the rules are enforced, and it is likely to be challenged in this way. The same may happen in the UK if Britain leaves the commercial trading jurisdiction of the EU as it exits.

In sum, there are numerous areas of uncertainty surrounding GDPR. Only experience and time will provide definitive answers. In the meantime, it is wise to determine GDPR’s potential to impact your operations. If it is significant, you will need a strategy to prepare for it. Our next installment will examine that potential, as well as what should be considered in developing such a strategy.


[1] Studies show that current response times are on the order of weeks rather than the days required by GDPR rules. Of course, that might not be relevant if response times can be adjusted downward under the pressure of the new rules.

Tuesday, April 17, 2018

Compuware continues to lead in Agile DevOps for the mainframe

By Rich Ptak

Image courtesy of Compuware, Inc.

Compuware continues to add to and extend its mainframe solutions as it advances its campaign to mainstream the mainframe, this time with two major innovations that help customers preserve, advance and protect their mainframe investments.

Before we get into the innovations, we want to mention Electric Cloud, a new partner, who proactively integrated their service through the Compuware open API. This is the latest example of how Compuware takes an open borders approach where they integrate with a variety of solutions to help customers build out their DevOps toolchains. 

Now, on to the announcements. First, a new product: Compuware zAdviser. It leverages machine learning and intelligent analysis for continuous mainframe DevOps improvement. It provides development managers with multi-level analysis of tool usage and performance data, focused on the critical DevOps KPIs (key performance indicators) of application quality, development team efficiency, and velocity. All are also key to agile development. Even better, the product is free to Compuware customers.

Second is a new GUI for Compuware’s ThruPut Manager, which provides intuitive, actionable insight into how batch jobs are initiated and executed, as well as their impact on cost. Users can see graphical visualizations of batch jobs that are waiting to execute and when they might run. In-depth detail on why a job has been waiting is also easily obtained.

zAdviser + KPIs + Measurement = Success
Mainframe KPIs are a must if organizations want to compete successfully in the digital age. After all, you can’t improve what you can’t measure, and if you’re not continuously improving, you are wasting your time and, worse, your customers’ time. Teams must also be able to prioritize and measure the KPIs that will directly impact development and business outcomes.

A Forrester Consulting study conducted on behalf of Compuware found that over 70% of firms responding had critical customer-facing services reliant on mainframe operations. Providing the customer with an exceptional experience, not simply good, clean code, has become the new measure of operational success.
The same study found that enterprises are doing a good job of tracking application quality but are considerably less attentive to efficiency and velocity. To modernize their application development strategies and keep pace with changing market conditions, firms must place as much focus on velocity and efficiency as they do on quality.

Compuware zAdviser uses machine learning to identify patterns that impact the quality, velocity and efficiency of mainframe development by exposing correlations between a customer’s Compuware product usage and the KPIs. Equipped with this empirical data, IT leadership can identify which tool capabilities developers can exploit to improve. With machine learning, the days of simply beating the drum to go faster are long gone.

ThruPut Manager: Visualization for Batch Execution
Compuware’s ThruPut Manager brought automated optimization to batch processing. It automates resource-allocation decisions by balancing the needs of multiple interested parties. This involves cost-benefit tradeoffs between risks and costs, such as risking SLA (service level agreement) violations of timely service delivery to avoid a costly increase in software MLC (monthly license charge) costs.

Compuware reports that batch processing jobs account for about 50% of mainframe workloads!

Today’s complex environments compound the problem with a bewildering number of choices, combinations and alternatives to consider in making these decisions. The amount of data, the competing interests and the number of options mean it takes years of experience to achieve even a reasonable level of competence at this task. Further, a lack of such seasoned staff means these operationally critical decisions are now being left to new-to-the-mainframe staff lacking that experience.

ThruPut Manager’s new web interface gives operations staff a clear visual representation of the cost/benefit tradeoffs as they work to optimize workload timing and resource performance.

In combination with Compuware Strobe, ops staff can more easily identify potential issues. They can manage and balance competing metrics relating to cost, resource allocation, service policies and customer interests to make the best decisions for optimizing the workloads, as well as application performance.

A big part of ThruPut Manager’s advantage is the multiple drill-down views it provides. Starting with an overview displaying data about the General Services and Production Services queues, users can drill down to a detailed view of specific job data and job history, as well as where work is being selected. The GUI also collects and displays R4HA (rolling four-hour average) information for the last eight hours. And if the Automated Capacity Management feature is constraining less important workload to mitigate the R4HA, this is displayed on the graph.

The Final Word
Mainframe workloads continue to increase even as experts steadily leave the workforce and responsibilities shift to mainframe-inexperienced staff. Organizations must constantly work to modernize mainframe environments and remove impediments to innovation to not only increase their business agility, but also attract a new generation of staff to the platform.

Compuware zAdviser provides concrete data that allows mainframe staff to link the results of actions taken to improve performance based on KPI measurements. DevOps management and staff have access to intelligible, visual information on the impact of those changes in detail. 

Compuware ThruPut Manager provides much needed clarity and insight to fine-tune batch execution for optimal value easing budget stresses while fulfilling business imperatives.

These products provide strong evidence of Compuware’s ability to create innovative ways to identify and resolve challenges in mainframe development, management and operations that have long been barriers to its wider use. The entire team deserves a salute for their 14th consecutive quarter of very agile delivery of solutions that are driving the mainframe more and more into the mainstream of 21st century computing. Congratulations once again for your efforts.