
Tuesday, April 18, 2017

Compuware’s newest solution curbs insider threats against mainframe systems!

By Rich Ptak

Compuware just delivered its 10th consecutive quarter of new capabilities aimed at “Mainstreaming the Mainframe.” 

This time, the news comes with a security twist. Having focused on enhancing its Topaz solution suite with DevOps-focused products over the last few quarters, Compuware’s latest release addresses new challenges in security enablement.
No, Compuware does not plan to become an all-encompassing security firm. Nor will it be offering security consulting. Both areas are already heavy with talent, product and service options. Compuware is doing what it does best – removing the idiosyncrasies of the mainframe so that non-mainframe staff can access and work with mainframe data in the same manner as they access data from other platforms.
For its entire lifetime, the mainframe has been the “gold standard” for platform security. It remains so today. However, threats evolve in new directions over time, demanding adaptive responses and new capabilities.
With the announcement of Compuware Application Audit, directly capturing rich, complete start-to-finish user session activity data in real-time is now faster, easier and more comprehensive than ever. This is critical to increasing mainframe cybersecurity and assuring compliance with security protocols and mandates.
A web-based interface and the ability to integrate easily with data from across the enterprise dramatically add to the potential benefit of this product.
We quickly review these threats and their costs. We then discuss Compuware’s newest contribution aimed at resolving them.

An expensive and growing challenge

IBM’s 2016 X-Force® Research report[1] on cyber security stated that organizations experienced a 5% increase in data breaches between 2014 and 2015, mostly (60%) the result of insider (employee, trusted partner) activities. The just-released 2017 X-Force Threat Intelligence Index[2] reveals the number of leaked records increased by “a historic 566 percent in 2016 from 600 million to more than 4 billion records”!

Other research[3] conducted in EMEA revealed that it took an average of 469 days to detect such activities, versus a global average of 146 days. A 2016 global study[4] by the Association of Certified Fraud Examiners covering breaches in more than 144 countries found that the cost of such breaches averaged $2.7M (some as high as $4M). They also discovered that detection efforts, such as active monitoring, internal/external audits, fiscal reviews, etc., can significantly lower the cost/loss and duration of a breach.

We agree that the mainframe is inherently secure from outside attacks. However, in today’s world, the risk and danger of exposure of sensitive data increasingly come from privileged users. Profiles and motives vary. The user may be unauthorized or authorized. The intent may be malicious or completely unintended, e.g. a simple mistaken file transfer or mis-keyed command. Whatever the cause, the result can be a breach that exposes confidential internal data or illegally exposes personal client information. In the end, the risk of an extremely costly breach not only exists but is demonstrably growing over time.

The safeguards offered today are proving insufficient: awkward to implement and frequently inadequate to detect, let alone prevent, sophisticated or even naive user penetrations. Worse, even when a problem is recognized and attempts are made to address it, those responsible for prevention, monitoring or protection may themselves be the perpetrators.

As would be expected, the response has centered on a proliferation of mandates in the form of compliance rules, regulations, audits, inspections, reporting, etc. by governments and watchdog groups. These are layered on top of existing, internally generated and imposed mandates. The result is an increasing risk of non-compliance, along with penalties on top of the damage done to clients, customers, employees, relationships, etc.

Existing traditional solutions, e.g. SMF data, log scans, SIEM tools, RACF, CA ACF2, CA Top Secret, etc., all effectively deliver their promised, designed-in functionality and capabilities. Unfortunately, none is capable of directly addressing the need to specifically track and store user behavior in real-time. None collects the data necessary to determine what a user is actually doing with an application and with the data. Thus, none can report on who is doing what with which applications and data for how long. Hence, the danger continues and risk escalates. This is the problem that Compuware Application Audit is designed to address for mainframes. Note that while the weakness exists for all systems and platforms, Application Audit focuses exclusively on the mainframe.
Compuware Application Audit captures user behavior in real time
The most interesting aspect of Application Audit lies in its unique ability to collect, and provide access to, ALL user interactions with any application on the mainframe. This is done in real time, for as long as the user is using an app, even if the interaction is interrupted and spread over time. It provides a comprehensive view of exactly what happens from the user’s perspective. It works for privileged and non-privileged users alike. It tracks all activities that occur from a user perspective, whether CICS transactions or 3270-based interactions – any interaction (data I/O, moves, changes, etc.) that takes place in and with any application.
Activity tracking is completely transparent to both the user and application. There is no call to Application Audit by the user. No changes are made to any application. All data regarding user interaction with applications is collected in real-time by Application Audit. The data is recorded on the mainframe by the Application Audit Global Record and stored locally.
Data captured by Application Audit can be sent directly to Splunk for analysis, or written out as SMF records for CorreLog or Syncsort, which store, transport and format the data before delivering it to Splunk or, in the case of CorreLog, to popular SIEM or other analytics engines such as Hadoop. Customers can combine data from across the enterprise within the SIEM tools, where it can be analyzed and correlated for security and compliance. See Figure 1 for an implementation example leveraging CorreLog.
Figure 1: Example of Audit Data & Process Flow (courtesy of Compuware Corp.)
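To make the downstream flow concrete, here is a minimal sketch of the final hop: pushing one captured session event into Splunk’s HTTP Event Collector for correlation. The endpoint, token and field names are invented for illustration; Application Audit’s actual record layout and transport are as described above, not this code.

```python
# Hypothetical illustration: forwarding a captured user-session event to
# Splunk's HTTP Event Collector (HEC). Field names are placeholders.
import json
import urllib.request

SPLUNK_HEC = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder token

def send_audit_event(user, application, action, resource):
    """Forward one session event so the SIEM can answer: who did what,
    with which application and data."""
    payload = {
        "sourcetype": "mainframe:application_audit",  # assumed sourcetype
        "event": {
            "user": user,                # who
            "application": application,  # with which application
            "action": action,            # is doing what
            "resource": resource,        # with which data
        },
    }
    req = urllib.request.Request(
        SPLUNK_HEC,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": f"Splunk {HEC_TOKEN}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 200 on success

# Example: record a CICS user reading a customer file
# send_audit_event("JSMITH", "CICSPROD", "READ", "CUSTOMER.VSAM")
```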

Compuware Application Audit has been designed and implemented as a standalone solution; it does not require the purchase of additional Compuware products. It includes a web interface with basic data display and full data access, along with an out-of-the-box, customizable Splunk-based dashboard.

Compuware consulted with security experts and auditors on multiple aspects of the product’s design. These include such areas as the web interface, menus for reports, data collection and reporting, presentation, and alerts, as well as the alert mechanisms and visualizations built into the application.
User experiences prove the value
Compuware described the experiences of two major banks and one healthcare insurance company using Compuware Application Audit to solve security problems. One bank needed to monitor privileged users and collect auditable evidence on user activities. Application Audit data fed to Splunk enabled the bank to identify a privileged user engaging in improper activities.
Another bank required comprehensive insight into mainframe application usage after an exposure of credit card information. Application user behavior data collected by Application Audit and analyzed in Splunk revealed an outsourced contractor abusing their privileges. Application Audit also provided the information needed to show auditors the bank was operating in compliance with regulations that govern access to sensitive data. The bank is now meeting GDPR compliance requirements with ongoing monitoring.
The healthcare insurance firm needed to assure compliance with HIPAA mandates and to track the viewing of sensitive, personal data records so those records could be searched. Once again, Application Audit user data monitoring and collection, combined with Splunk analysis of user behavior, resolved the problem.
The Final Word
Compuware continues to maintain an accelerated pace providing new and enhanced products and solution extensions to improve the mainframe environment and ecosystem. Not content with optimizing processes and tasks ranging from development to infrastructure, operations and service management, Compuware found a way to ease the task of detecting and preventing user-driven security problems by automating real-time data collection about user-behavior.

Ten quarters ago (2½ years), Compuware made a commitment and issued a challenge: quarterly delivery of products and enhancements that would “Mainstream the Mainframe.”

Quarter by quarter, Compuware has lived up to that commitment. They are delivering new and enhanced products and services that have made life easier, simpler, and more interesting for developers, systems administrators, operations and, now, security staff. They have introduced tools that removed long-standing barriers that discouraged or intimidated non-mainframe IT staff from using or learning about the mainframe. They eased access to the latest in IT tools and technologies once used only in distributed environments. They automated tasks that were onerous, time-consuming and error-prone.

Compuware has not been alone in these efforts. Competitors have risen to the challenge, albeit with their own strategic twist on what needs to be done and how to do it. But no one else has matched Compuware’s pace of delivery, which, by all the evidence we have been shown, they fully intend to continue.

Congratulations to Compuware on this release, which, by the way, also includes some enhancements to other products. Again, we recommend that anyone with a mainframe in their shop invite these folks to discuss what they can do with you. You’ll find that by working with Compuware, “You won’t get tired of winning!”


Tuesday, April 11, 2017

INTERCONNECT 2017 – IBM Cloud, Watson & Blockchain on the rise in enterprises, large and small!

By Rich Ptak


IBM again delivered more than promised at Interconnect 2017. Attendance was up by at least 10%, with more significant product, partnership and initiative announcements, covered by the largest attendance of press and analysts to date.

There were plenty of testimonials and presentations by clients, customers and IBM employees. IBM went all out to make the registration and attendance planning process easier. They even provided a Watson-leveraging app to help attendees do everything from connecting with fellow attendees to getting session recommendations to creating a personal agenda with reminders of sessions to attend.

IBM further added to the event’s cachet by having Ginni Rometty, IBM Chairman, President and CEO host Tuesday’s keynote session. She arrived from China after completing a contract[1] (on Sunday) that partners IBM and a Dalian Wanda Group subsidiary to make select IBM cloud infrastructure-as-a-service and platform-as-a-service (IaaS and PaaS) technologies available in China.

Chairman’s Address

Ginni provided a fast-paced overview of IBM’s strategy and plans, with heavy focus on the technologies where IBM is breaking new ground or pushing its own limits on what can be done. She featured five enterprise executives, each of whom spoke to their own successful experiences using IBM Watson, Cloud and other technologies, products and services.

The executives included Randall Stephenson, Chairman and CEO of AT&T, describing how the on-going association with IBM helps improve customer services as data consumption in mobile media explodes. Marc Benioff, Chairman and CEO of Salesforce, spoke to how partnering with IBM and leveraging Watson, cognitive computing and predictive analytics allows Salesforce to create new customer-focused services. Bill Cobb, President and CEO of H&R Block, described how a June 2016 discussion with IBM Senior Vice President Mike Rhodin about Watson technology allowed them, eight months later, to advertise the availability of Watson-based tax consultancy services in a Super Bowl ad. Bruce Ross, Group Head, Technology & Operations, Royal Bank of Canada, spoke to how IBM enables transformation in development and operations in their efforts to economically address a market of one. Reshma Saujani, Founder and CEO of Girls Who Code, discussed the impact of IBM’s support on her goal of teaching 1 million girls to code by 2020.

It is clear that Cloud-based efforts are focused on delivering enterprise-ready solutions and services – a traditional IBM sweet spot. Ginni described how a strategy based on technological prowess, acquisition of partners able to deliver cutting-edge solutions, and management of an expanding ecosystem leads to differentiated enterprise Cloud services solutions. It is not an aspirational strategy. It is a well-planned, managed and documented pathway intended to drive client and customer successes. Interconnect set out to demonstrate both the strength of that technological foundation and its success.

Others may try, but IBM intends to lead the market with their ability to identify and meet the specific needs of the enterprise/organization. They aim to dominate whether it is in assisting transformation efforts, planning adoption of the latest in Cloud technology and services, implementing a private Cloud or outsourcing some/all Cloud functions.

Ginni’s speech and IBM’s litany of offerings hit it all: standalone Cloud, on-premise Cloud, hosted Cloud, off-premise Cloud or Cloud services, private Cloud, hybrid Cloud, DIY Cloud, Bluemix Cloud – IBM has plans, presence, expertise and offerings in every one.

AND, they back these up with an unparalleled infrastructure and architecture that is able to deliver the security, technology, reliability, expertise, solutions and services that exceed anything offered elsewhere, or are uniquely available from IBM.

IBM has clearly learned from its early experiences with Watson, and applied those lessons as it works to insert boundary-pushing technologies, such as blockchain, cognitive, packaged Watson, even quantum computing, into its customer base. Far from going it alone, IBM is partnering and aligning itself with clients, customers, competitors, industry and academics to build an ecosystem for each of these technologies. IBM is implementing this strategy as if its future depends upon it – we think it does. Here’s more on why we think so.

Expanding solutions and infrastructure access 

Reflecting IBM’s view of its customers’ concerns, the show floor had dedicated zones: Cloud, Cognitive, Watson, Internet of Things (IoT), Industry and Dev Zone. IBM executives and staff presented details of new functionality available stand-alone, as part of, or baked into Cloud services. These include enhancements in performance, operations and management of infrastructure platforms (Power, OpenPOWER, z Systems, LinuxONE, storage and networking). Also showcased was a wide range of applications marrying the capabilities of Watson processing with cognitive operations and blockchain services.

There were also numerous new services discussed. One example is the very-high-speed, protected, secure blockchain-as-a-service, now available for use by enterprises that don’t require, or don’t want to undertake, the effort required for a private implementation. IBM’s Blockchain for Hyperledger Fabric v1.0 (scalable to 10K transactions/second) is based on the Hyperledger consortium’s just-released Fabric platform, extended with IBM infrastructure performance benefits. This benefits not just very large enterprises, but also smaller, grass-roots organizations, such as Plastic Bank, as we discuss a little later.

IBM positions itself as especially able to deliver products, solutions and services with designed-in features specifically intended to address the most pressing existing and emerging challenges facing enterprises today. This includes security, verification, transaction handling, speeds and volumes, certification, reliability, processing power, data processing and storage, plus much more.

One example is IBM’s approach to improving the already famous security of mainframe systems. Two long-held design tenets in security hold that most successful penetrations are internal and that no system is impenetrable. IBM’s latest cyber security offerings combine Watson and cognitive computing to analyze events across the operating environment to more rapidly identify, even anticipate, attempts at penetration.

A major part of any event such as Interconnect is the demonstrations and stories of partner successes built on the vendor’s solutions and services, along with innovative efforts and collaborations involving a mix of partners, clients and customers. The object of these is to provide first-hand evidence and documentation that the promoted technological capabilities and benefits are indeed being realized. These range from cutting-edge leveraging of the latest technologies (blockchain, cognitive computing) and platforms (Bluemix, Watson, IBM Cloud) and, of course, the basic infrastructure (Power, z Systems, network, storage), to straightforward solutions by industry, sector and application (financial, medical, security, transaction management and handling).

More recently, IBM has been making it easier, and cheaper (in fact, free), for users to experience its wares in ‘try before you buy’ situations. As an example, in addition to presenting details of its efforts in Quantum Computing, IBM offers free access to developers, researchers and students interested in and able to “play” with Quantum technology in IBM’s cloud. The demo at the show, as well as the technical staff supporting it, were extremely helpful in answering my questions. Shortly after the show, IBM announced the QISKit open source project[2], which allows developers to conduct explorations on IBM’s Quantum Experience using a Python interface. See the IBM Tech Talk introduction here[3].
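For a flavor of what that Python interface makes possible, here is a minimal circuit sketch using QISKit’s QuantumCircuit API (which has evolved since the 2017-era release); it builds the canonical two-qubit Bell state.

```python
# Minimal QISKit sketch: a two-qubit Bell-state circuit.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # read both qubits out

print(qc.draw())  # ASCII diagram of the circuit
# Running on real hardware requires an IBM Quantum account/provider.
```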

For those interested in mainframes, IBM provides easy, free access to mainframe technology with its IBM z Systems Trial Program[4]. You can try the latest in z Systems software. Environments are pre-configured with no set-up needed, and are available now for IBM z/OS Connect Enterprise Edition and Apache Spark on z/OS. Coming soon are Application Discovery and Delivery Intelligence, Information Management System (IMS), CICS Transaction Server, and IBM OMEGAMON for JVM on z/OS, with more to follow.
No formal statements about the next generation of mainframes were made during Interconnect. However, the current model was introduced three years ago. We suspect that within 12 months we’ll hear about a new-generation mainframe. That, plus the fit between mainframe strengths and computing needs, leads us to conclude that starting to learn more about mainframes now would be smart.

“Social Plastic”, LinuxONE, Mainframe, Blockchain => Social Change

For a couple of years, a major theme of Interconnect has been the ability of technology and its users to “change the world”. Examples abound, such as behavior-changing products for photo/video sharing (Snapchat, Instagram) and social contact (Tumblr, FaceTime), gaming, as well as projects aimed at solving the world’s largest problems: cancer treatment, genome modeling, pollution control, etc.

Last year, we met with and wrote about Dr. Piers Nash of the University of Chicago’s Center for Data Intensive Science (CDIS).  CDIS uses IBM Cloud Object Storage with Cleversafe technology to centrally store and manage vast amounts of genomic and clinical data thus accelerating discoveries.

This year, we had the opportunity to meet and speak with social-change partners Shaun Frankson, Co-founder, Chief Strategist and Growth Hacker of The Plastic Bank[5], and Ron Argent, CEO and Founder of IBM Business Partner Cognition Foundry[6]. Working with IBM Labs and ISV partner Conquex, the partnership uses blockchain and LinuxONE (mainframe) to create a supply-chain, provider and producer network in response to vendor and global consumer demand for products using “Social Plastic”. “Social Plastic” is made from recycled plastic containers that would otherwise be polluting oceans, rivers and the planet.

Manufacturers will respond to consumers demanding “Social Plastic” in their products. The partners also wanted to address global poverty with a sustainable business aimed at the most poverty-stricken areas. They needed an affordable structure that allows disadvantaged entrepreneurs to build a sustainable business in even the poorest countries with little or no initial capital. They found their solution in Social Plastic markets that buy, barter, trade or exchange services for plastic trash.

Social Plastic markets were initially set up in Haiti, and are underway in the Philippines, as fixed centers for collection, payment and recycling. These provide direct compensation to collectors in the form of cash or digital credit (such as bitcoin), items (sustainable cooking stoves, solar-powered chargers, sugar cane briquettes, etc.) or services. It is important that the “pedigree” of the plastic and the payments be easily recorded, as well as securely tracked and managed. That is where the technical capabilities of the partners and the infrastructure of blockchain implemented on a LinuxONE mainframe environment come into play.
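As a conceptual illustration only (the partners run Hyperledger on LinuxONE, not this code), a hash-chained record shows why a blockchain ledger makes the pedigree tamper-evident: each entry embeds the hash of its predecessor, so altering any past record breaks the chain. All names and values below are invented.

```python
# Conceptual sketch of a tamper-evident "pedigree" chain for plastic
# collections; each record is hash-linked to the previous one.
import hashlib
import json
import time

def make_entry(prev_hash, collector, kilos, compensation):
    record = {
        "timestamp": time.time(),
        "collector": collector,        # who turned in the plastic
        "kilos": kilos,                # how much was collected
        "compensation": compensation,  # cash, goods or services paid
        "prev_hash": prev_hash,        # link to the prior entry
    }
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    return record

genesis = make_entry("0" * 64, "center:port-au-prince", 0, None)
entry = make_entry(genesis["hash"], "collector:1042", 12.5, "solar charger")
# Altering any field in `genesis` would break the hash link in `entry`.
```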

Enterprise evolution continues

Interconnect demonstrates the evolution of the digitized enterprise, that is, IBM as-a-service. It is seen in the platforms (Cloud, storage, application, etc.) and streams of data (visual, financial, weather, location, spend, etc.) collected, sorted, analyzed and interpreted. It is seen in the fully automated processes in apps designed to address specific problems (travel, investment, workplace safety, etc.). IBM continues to expand and build ecosystems of solutions. The object is to leverage technology in innovative, solution-oriented ways that are available to as wide an audience as possible. Thinking about “speeds ‘n feeds” infrastructure is so outdated that it doesn’t even register with most attendees.

However, this doesn’t lessen the importance of the underlying infrastructure. It is the on-going evolution and advances in Power Systems, OpenPOWER, z Systems, etc. that make possible today’s computing. They drive the radical improvements in data collection, transformation and transfer rates. They accelerate processing and analysis to allow the real-time and near-real-time results used to control operations, develop predictions that drive problem avoidance, and minimize the need for corrective action. Standardization has its place in aiding cost management. However, it cannot substitute for or replace the powerful driving force of competition.

Watson is rapidly becoming a pervasive presence in computing – its applications range from preparation of tax forms to cutting-edge simulations in fields including medicine, traffic control, research and exploration, as well as design, video, data cleansing, and on and on. But, at its heart, Watson is a Power System.

IBM Cloud in all its multiple realizations is intimately tied to infrastructure. IBM-as-a-Service Cloud offerings can be delivered on z Systems or Power Systems; the choice depends on the Cloud service and application. Hardware infrastructure is critical to the delivery as well as the creation of services and applications. Cloud services that include analytics, blockchain, data cleansing, Hyperledger and whatever comes along next can and will make exacting demands on whatever platform they run on. Quantum computing may or may not be the next big “thing”. Whatever the results, IBM continues to evolve, extend and innovate to provide the best platforms possible to meet and exceed the expectations of today’s customers as they face the future.

The Final Word

Interconnect 2017 was time well spent. IBM communicated a detailed, focused and competitive vision of how, where and when it will compete, with multiple offerings.

It convincingly demonstrated that an enterprise-ready Cloud with distinct features and advantages critical to enterprise success is necessary. IBM, along with its customers, clients and partners, provided significant evidence that IBM has already identified and is addressing a sizable and varied enterprise Cloud market.

IBM demonstrated its understanding of and ability to leverage the potential of blockchain and Hyperledger. It is aggressively advancing the spread of blockchain technology as a valuable, effective tool already bringing change to multiple market segments.

This year’s Interconnect event demonstrated that IBM, partners, clients and customers have been able to effectively leverage a wide range of technologies (Cognitive Computing, Mobility, DevOps and Analytics) to resolve significant problems. IBM is also effectively addressing security issues, identified last year as an emerging focus. In her talk, Ginni identified the Cloud, and specifically the IBM Cloud, as the Platform of the Future because it is: 1) enterprise-strong, 2) built to optimize data operations, and 3) cognitive to the core. IBM gives every evidence of being able to compete successfully on that basis.

Challenges remain. IBM needs to grow market share with its Cloud offerings. It has demonstrated solid reasons to expect that to happen. Transitioning and transforming a company has always been tricky to accomplish. IBM has made significant strides in demonstrating both vision and the ability to execute. We expect that to continue. We will be at Interconnect 2018 to see if we are right.




[1] IBM will partner with Wanda Internet Technology Group to build, distribute and operate the IBM cloud platform.
[2] Here is the link to the information about and access to the QISKit: https://developer.ibm.com/open/openprojects/qiskit/
[4] See what is available at: ibm.biz/ibmztrial
[5] For more information, see: http://plasticbank.org/
[6] For more information, see: http://cognitionfoundry.com/


Monday, January 16, 2017

Compuware Topaz for Total Test automates COBOL code testing + acquisitions + product enhancements!


By Rich Ptak

In January 2017, Compuware marked yet another quarter of delivering on its promises to provide solutions and services to “Mainstream the Mainframe.” This time it includes automated COBOL code testing, 4 acquisitions in 12 months, plus other product enhancements. Let’s get started.
Billions of lines of COBOL-based programs are the operational heart of computer data centers worldwide. For well over 50 years, COBOL programs have continued in use for a variety of reasons. The primary reason is simply that they work. The adage “if it ain’t broke, don’t fix it” could have been written exclusively about these programs.
Web, mobile and distributed applications often leverage COBOL programs on the back end. As such, in today’s rapidly evolving, high-volume computing environment, companies must be able to rapidly implement COBOL code updates and changes to stay digitally competitive. Such changes, however, risk introducing serious errors and bugs which, even once discovered (itself a notoriously difficult task), can be even more difficult to correct. Testing is required to uncover or avoid introducing such errors.
Creating mainframe unit tests has been a labor- and time-intensive task, as they are manually designed, developed and custom-tailored to each program. Making things more difficult is a frequent lack of program documentation, even as those with expertise and deep program knowledge leave the workforce.
Changing and updating mainframe COBOL programs remains an intimidating bottleneck; a task to be avoided if at all possible. This is untenable in today’s digital enterprise, where speedy adaptation to changing circumstances is a fundamental requirement for the survival of computer-driven services, let alone their on-going success.
Until now, no vendor had attempted to comprehensively attack the challenge of mainframe unit test creation, let alone bring automated Java-like unit testing to the world of COBOL applications. But once again, Compuware steps up to provide an effective and solid solution in the form of Compuware Topaz for Total Test.

First, a little context

Over the last two years, Compuware has introduced solutions that address multiple long-standing application lifecycle challenges in mainframe operations. These include:
  1. Intuitive visual analysis of even extremely complex and poorly documented mainframe programs and data structures (Topaz for Program Analysis and Topaz for Enterprise Data). 
  2. Real-time quality control and error detection of mainframe coding syntax (Topaz integration with SonarSource).
  3. Agile cross-platform source code management and release automation (ISPW and integration with XebiaLabs).

Compuware’s newest offering will resolve some important issues currently handicapping unit testing of mainframe code through comprehensive automation of critical tasks. Let’s review what they just introduced.

Topaz for Total Test = Automated Mainframe Unit Test Creation and Execution

By automating the processes of unit test creation, Compuware’s Topaz for Total Test transforms mainframe COBOL application development and testing. It does so without requiring any code changes to a COBOL program, while automatically creating and running tests on logical units of code. Developers at all skill levels can now perform unit testing of COBOL code similar to how it is done for other programming languages (Java, PHP, etc.).
Compuware goes beyond distributed tool capabilities by automating the collection of additional data that can be used in multiple ways. The data is preserved with the unit test and can be used to validate code changes. This approach allows the test data to travel with the test case, making it easier to execute test cases on different systems. Developers can collect and save data stubs of existing input data and edit them to test specific sections of code.
Topaz for Total Test, as part of the Topaz suite, can be used with other elements to provide a comprehensive solution for dev/test operations. Here is a closer look at how Topaz for Total Test automates many of the steps in unit test creation and execution:
  • Xpediter gathers test data (call parameters and program results),
  • Topaz for Total Test creates the complete test case (fully automated),
  • Topaz for Total Test generates data stubs and program stubs (fully automated),
  • The unit test uses the data stubs created by Topaz for Total Test (fully automated),
  • Topaz for Total Test allows easy on/off use of stubs, with no re-compilation required (fully automated),
  • Topaz for Total Test automatically cleans up after tests,
  • Topaz for Total Test adds unit tests into a test scenario (fully automated),
  • The continuous build process uses the CLI to run the test suite,
  • Topaz for Total Test executes the test suite (automatically).
Benefits realized by mainframe IT organizations include acceleration of development processes and reduced time, effort and resources needed to create and run tests, as it becomes easier to update and change mainframe code. Overall operations efficiency improves as well, because potential problems are identified and addressed at the earliest possible time in development.
Among Topaz for Total Test’s unique features and capabilities are program stubs, which allow the main program to be isolated from its sub-program calls; sub-programs may also be tested independently of the main program. Together, these capabilities enable developers to split the testing of a large program into testing a set of smaller programs.
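For readers who know distributed tooling, the following Python unittest sketch illustrates the xUnit pattern that Total Test brings to COBOL: a “program stub” replaces the sub-program call so the main logic can be tested in isolation. The module and functions are hypothetical; this is not Compuware-generated code.

```python
import unittest
from unittest import mock

# Hypothetical code under test: compute an invoice total, where
# get_tax_rate() stands in for a call to a separate sub-program.
def get_tax_rate(region):
    raise NotImplementedError("real sub-program call")

def invoice_total(amount, region):
    return round(amount * (1 + get_tax_rate(region)), 2)

class InvoiceTotalTest(unittest.TestCase):
    @mock.patch(f"{__name__}.get_tax_rate", return_value=0.07)
    def test_total_with_stubbed_subprogram(self, stub):
        # The "program stub" isolates invoice_total from its sub-program.
        self.assertEqual(invoice_total(100.00, "MA"), 107.00)
        stub.assert_called_once_with("MA")

if __name__ == "__main__":
    unittest.main()
```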
In effect, Topaz for Total Test reduces the complexity of doing good testing by focusing on small parts of the program. The solution is useful to developers at all skill levels. Its ease of use and significant automation improve efficiency (faster test-failure identification and resolution), speed execution and development times, and provide centralized control of testing.
There is much more to the product than we cover here. Compuware has plans for further enhancements, extensions and integrations to be delivered on a quarterly basis. Given their track record of performance, we expect they will delight their customers. If you have a significant amount of mainframe code in your shop, it makes good sense to check out Topaz for Total Test. 

Other items in the announcement

For their 4th acquisition in the last 12 months, Compuware acquired MVS Solutions with its popular ThruPut Manager, which automatically and intelligently optimizes the processing of batch jobs. ThruPut Manager:  
  • Provides immediate, intuitive insight into batch processing that even inexperienced operators can readily understand,
  • Makes it easy to prioritize batch processing based on business-based policies and goals,
  • Ensures proper batch execution by verifying that jobs have all the resources they need and proactively managing resource contention between jobs,
  • Dramatically reduces customers’ IBM Monthly License Charges (MLC) by minimizing the rolling four-hour average (R4HA) processing peaks without counter-productive “soft-capping” (a sketch of the R4HA computation follows below).
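To see why minimizing the R4HA matters, here is a rough Python sketch of the computation: MLC is driven by the month’s peak rolling four-hour average of MSU consumption, so the same batch work spread over time produces a lower peak than a concentrated spike. The MSU figures are invented for illustration.

```python
# Rough sketch of the rolling four-hour average (R4HA) behind IBM MLC:
# the bill keys off the peak of this average, which is why smoothing
# batch peaks (as ThruPut Manager does) saves money.
from collections import deque

def r4ha_peak(msu_by_interval, intervals_per_4h=48):
    """Peak rolling 4-hour average over 5-minute MSU interval samples."""
    window, total, peak = deque(), 0.0, 0.0
    for msu in msu_by_interval:
        window.append(msu)
        total += msu
        if len(window) > intervals_per_4h:
            total -= window.popleft()
        peak = max(peak, total / len(window))
    return peak

# A concentrated batch spike raises the peak R4HA (and the MLC bill)...
spiky  = [100] * 200 + [400] * 48 + [100] * 200
# ...while the same extra work spread out keeps the average down.
smooth = [100] * 200 + [175] * 192 + [100] * 56
print(r4ha_peak(spiky), r4ha_peak(smooth))  # 400.0 vs 175.0
```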
As part of their third acquisition in 2016, Compuware added Standardware’s COPE IMS virtualization technology to its portfolio. With COPE, enterprises can rapidly deploy multiple virtual IMS environments to as many different active projects as they require without having to create costly new IMS instances or engage professionals with specialized technical skill-sets. As a result, even less experienced mainframe staff can perform IMS-related Dev/Ops tasks faster and at a lower cost. In addition, integration with Compuware Xpediter permits debugging within COPE environments.
Finally, Compuware announced updates such as graphical visualization of IMS DBDs in Topaz Workbench. The tool presents the structure of IMS databases at a glance, eliminating the need to pore over IMS configuration files to find this information. In addition, a new Strobe Insight Report compares the last execution statistics with the average execution statistics. The data is visualized in an interactive scatter chart based on collected SMF 30 data. With such visualization, analysts can quickly identify jobs that have exceeded their norms by a user-specified percentage and then take the appropriate action. The tabular portion of the report compares and contrasts the average CPU, elapsed time and EXCP count with the last values collected.
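The comparison behind that report can be sketched simply: flag any job whose last run exceeded its historical average by more than a user-specified percentage. The job records below are fabricated examples, not actual SMF 30 data.

```python
# Sketch of the Strobe Insight-style comparison: last execution vs. the
# historical average, flagging exceedances beyond a threshold.
jobs = [
    # (job name, average CPU seconds, last-run CPU seconds)
    ("PAYROLL1", 120.0, 126.0),
    ("BILLING2", 300.0, 420.0),
    ("EXTRACT3",  45.0,  44.0),
]

def exceeders(history, threshold_pct=25.0):
    """Yield (job, percent over average) for jobs beyond the threshold."""
    for name, avg_cpu, last_cpu in history:
        pct_over = (last_cpu - avg_cpu) / avg_cpu * 100.0
        if pct_over > threshold_pct:
            yield name, round(pct_over, 1)

for job, pct in exceeders(jobs):
    print(f"{job} exceeded its CPU norm by {pct}%")  # BILLING2 ... 40.0%
```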

The Final Word

With the announcement of Compuware Topaz for Total Test, the company has provided a significant advance in mainstreaming the mainframe. The digital agility of any enterprise is constrained by its least agile code base. By eliminating a long-standing constraint to COBOL agility, Compuware provides enterprise IT the ability to deliver more digital capabilities to the business at greater speed and with less risk.
The January announcement marks Compuware’s 9th consecutive quarter of delivering significant solutions that solidify its mainstream positioning while benefiting mainframe development and operations. We’ve commented on virtually every announcement, and have to admit we have been impressed each time.
The steady stream of substantive improvements and additions has allowed Compuware to establish a strong market position for themselves. Their delivery of effective, innovative solutions provides solid enhancement to their reputation for successfully resolving significant problems that have hampered mainframe operations.
Congratulations to them. Check them out and see if you don’t agree with us. 

Wednesday, January 11, 2017

Dimension Data: Workspaces of the Future!

By Bill Moran and Rich Ptak


Dimension Data has been a successful international presence for a number of years, though they have less visibility in the US and North American markets. Founded in South Africa in 1983, the company was acquired by NTT in 2010. Their current revenue exceeds $7.5 billion, demonstrating strong, consistent growth that has continued after joining NTT.

We focus here on their end-user solutions covered in our briefing with them. We do encourage you to visit their website[1] to view their full range of offerings; this Wikipedia article details their history[2].

First, we must compliment the quality of Dimension Data’s marketing and advertising. Normally, we don’t comment on this aspect of operations. However, we found their recent advertisements in The Economist magazine noteworthy, and so include one here. Such creativity will help them capture the attention of the US and North American markets.
For several years post-acquisition, NTT wisely kept the management team in place. This, combined with the financial strength and presence resulting from the association with NTT, facilitated expanding their marketplace positioning. As part of a larger entity, Dimension Data gained more exposure and added to existing, proven customer confidence in their ability to deliver.

The Dimension Data Advantage

Dimension Data recently briefed us on their End-user Computing (EUC) strategy and offerings. Fully understanding those offerings requires knowledge of their vision of how digital transformation is changing companies, and its effect on various company stakeholders. So, we will first examine some of the relevant trends and resulting pressures.

Organizations are under significant pressure to cut costs. In response, some are reducing office space use by individual employees. In many US companies, this trend is implemented by encouraging employee home offices. This significantly affects the infrastructure and technology needed by the company. In other cases, companies are implementing changes to the working environment to attract and retain the best talent and remain competitive. Again, these changes will impact workspace design, communications, digital infrastructure, cloud (especially hybrid), data and information storage, (cyber)security and accessibility.  

Accompanying these macro trends are others specifically related to transformations that accompany the move to an increasingly Digital world. Some well-known, some not. Dimension Data has identified a number of these, which include:
  • Artificial Intelligence & Machine Learning
  • Internet of Things
  • Virtual and Augmented Reality
  • Robotics
  • Digital Technology Platforms
  • Cloud, specifically hybrid Cloud
  • Big Data & the tools to analyze it


For many customers, simply identifying and installing the correct technology is insufficient. Some are ill-equipped to cope with the new trends. Many risk being overwhelmed by the challenges facing them in the new digital environment. Others are incapable of, or uninterested in, managing the operating technology.
Figure 1: Workspace for Tomorrow
That is precisely the entry point Dimension Data has identified as their opportunity to stand out and outshine the competition. Dimension Data steps in with the ability to deliver a consultative workshop engagement specifically designed to help clients develop a plan to smoothly move their workspaces to the next level.
Dimension Data is able to provide both an overall architecture and a framework adaptable to fit the specific needs of any organization. Dimension Data is focused on enabling “Workspaces for Tomorrow” (Figure 1, at right). This provides the basis for implementation and delivery of a comprehensive suite of workspace services to design, implement, maintain and even manage workspaces.
Dimension Data has a unique offering consisting of a complete set of managed services to help customers design “digital workspaces to embrace the way employees live, work, and collaborate.” Further, they “help organisations seamlessly unify the physical and virtual world into a digital experience.”
Let’s look specifically at Microsoft technology services. Dimension Data builds its expertise in this area on recently acquired, Canada-based Ceryx, which specialized in helping customers install and manage email services. Under Dimension Data’s auspices, Ceryx is broadening its offerings to provide ‘Managed Cloud Services for Microsoft’, which include all of Office 365, Skype for Business and Microsoft Cloud, as well as other Microsoft products. As a North American company, Ceryx additionally benefits Dimension Data with increased visibility in the Canadian and US markets. The End-user Computing suite of Workspace services includes Workspace Mobility, Workspace Productivity Consulting Services and Software Services.

The Final Word

End-user Computing spearheads the “Workspaces for Tomorrow” effort within Dimension Data. Existing business units in networking, security, datacenter, collaboration, customer experience and service all support cross-selling. With strategic partnerships with both Microsoft and VMware, and the support of the NTT Group, Dimension Data is the engagement leader for outcome-based services to enable “Workspaces for Tomorrow”. All this combines to provide an impressive array of experience, expertise and product.
Dimension Data has an impressive reference list of worldwide customers for end-user computing, including well-known banks, oil & gas companies, automotive manufacturers, etc. We believe a significant opportunity for growth exists for them in the US and North American markets. We highly recommend investigating what they have to offer. We think there is a very good chance they might turn out to be your best partner for your modernization efforts.