Category: Research

  • Conference Whispers: CONEXPO-CON/AGG & IFPE 2023

    Analyst: Dr. Doreen Galli

    Photojournalist: Dr. Doreen Galli

    ABSTRACT

North America’s largest construction trade show, CONEXPO-CON/AGG and IFPE are held together every three years in Las Vegas. The conference featured over 3 million net square feet of exhibits, over 2,400 exhibitors, 200 educational courses, and 139,000 attendees from 133 countries. There were endless examples of the Industrial Intelligence of Things (IIoT) and early examples of digitally transformed construction sites. The digital transformation of the construction site is well underway, enabling multivendor views that optimize asset use and the total carbon footprint of a given job.

    The Conference

• The first CONEXPO was held in 1909 in Columbus, Ohio. The first CON/AGG was held in Detroit, Michigan, in 1928. CONEXPO and CON/AGG merged for the 1996 show held in Las Vegas [1]. The Association of Equipment Manufacturers (AEM) is responsible for putting on the show. Today, CONEXPO is one conference with three domains: CONEXPO, CON/AGG, and IFPE. Due to the size and complexity of bringing all the equipment together, it is held every three years at the Las Vegas Convention Center (LVCC), the only location that can physically accommodate the weight of the equipment.
• CONEXPO-CON/AGG and IFPE was held in Las Vegas, leveraging the entire expanded LVCC, including the West, Central, South, and North Halls and the festival lot. The conference drew 139,000 registered attendees, including international attendees from 133 countries. The exhibitions featured over 2,400 exhibitors in over 3 million net square feet of exhibits. There was a significant presence of women at the conference, not simply accompanying attendees.
• Attendees at CONEXPO-CON/AGG and IFPE 2023 held titles such as Job Site Director, Director of Heavy Equipment Rentals, Machine Operator, Logistics Manager, Electrical Engineer, Mechanical Engineer, Director of Product Management, Director of City Planning, and, new this time, Data Scientist.

    Highlights

• Sensors are standard on heavy equipment this time around. Today’s sensor solutions are more intelligent than ever. Predictive maintenance is a reality when information from telematics systems is integrated into maintenance systems (see the sketch after this list).
    • Digital transformation is arriving at construction job sites and heavy equipment rentals.
• While there are new government regulations, such as US bidding requirements to disclose a carbon footprint and a plan to reduce emissions, original equipment manufacturer (OEM) equipment itself does not provide this information. Additional work is required to understand one’s footprint.
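
What that telematics-to-maintenance integration can look like in practice is sketched below in generic SQL. The tables, columns, and thresholds are all hypothetical; a real deployment would use the telematics and maintenance vendors’ own schemas.

    -- Hypothetical schema: telematics_latest holds the newest reading per
    -- machine; maintenance_log holds engine hours at the last service.
    SELECT t.machine_id,
           t.engine_hours,
           t.coolant_temp_c,
           m.last_service_hours
    FROM   telematics_latest t
    JOIN   maintenance_log   m
           ON m.machine_id = t.machine_id
    WHERE  t.engine_hours - m.last_service_hours > 250  -- assumed service interval
       OR  t.coolant_temp_c > 110;                      -- assumed warning threshold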

    Cautions

• The industry would benefit from leveraging Six Sigma or other practices common in manufacturing to formally document the business benefit of its digital transformation.
• There are many lessons learned from IT data management and manufacturing transformations that can benefit the construction digital transformation. Hopefully, these lessons will be shared so maximum success can be achieved.

    Conference Vibe

The largest North American construction conference surpassed the attendance of its last edition, which was held as COVID-19 was declared a pandemic [2]. The conference requires an entire month to set up and is complex enough that it is only held every three years. The exhibits leveraged the entire LVCC, spanning the parking lots, South Hall, North Hall, Central Hall, the new West Hall, and the Las Vegas Festival Grounds. When attendees were asked whether they had met their business objectives, the answer was a resounding yes. Some indicated the expected deals came through, along with a few pleasant surprises. Once again, exhibitors singled out the quality of the attendees. Fortunately, the press room and many exhibits had coffee available, both regular and decaffeinated, and, this time around, hot water for tea.

    Digital Transformation of Commercial Fleets and Job Sites

According to IFM [3], 85% of vehicle telematics is used for vehicle tracking, with only 27% used to track fatigue management. While the 2020 conference had education modules on digital transformation, in 2023 there were competing modules: CONEXPO 2023 forced one to choose between ‘Top 10 Use Cases Using Advanced Machine Data’ [4] and ‘Predictive Maintenance: Plan the Work and Work the Plan’. In 2020, participants were told of the value of sharing one important piece of data [5]. In 2023, the conference offered a plethora of machine data transformational use cases.

Attendees saw a Plante Moran customer discuss how they learned to leverage data to better manage their fleets [6]. ICC provided a great session on crane operation safety using data analysis of standards and incidents. When all was finished, the audience understood that less than 23.8% of crane safety responsibility falls on the operator’s shoulders [7].

POLARIS Laboratories® demonstrated the advantages of simply connecting one’s fluid analysis to the maintenance management system [8]. In another session, participants learned how to use their machine data for 10 separate use cases, ranging from maximizing the ROI of their machines and saving money through remote investigation to minimizing the costs of batteries [9]. Another topic growing in popularity due to federal contracting requirements was understanding the carbon footprint of a job site. A complete solution set with various accuracy levels to measure fleet emissions was presented by Clue [10]. DOKA presented a platform that can estimate your carbon footprint; in addition, the DOKA platform can optimize most metrics for job sites across the company and across machine vendors [11]. Furthermore, DOKA offers an augmented reality solution to help customers and teams picture the job site before it is realized [12].
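
At the simplest accuracy level, a fleet emissions estimate multiplies fuel burned by a fixed emission factor. The generic SQL below is a minimal sketch of that idea; the telematics_daily table and its columns are hypothetical, and roughly 2.68 kg of CO2 per litre of diesel is the commonly used conversion factor.

    -- Hypothetical table; a first-order, fuel-based CO2 estimate per job site.
    SELECT job_site,
           SUM(fuel_used_litres) * 2.68 AS est_co2_kg  -- ~2.68 kg CO2 per litre of diesel
    FROM   telematics_daily
    WHERE  report_date BETWEEN DATE '2023-03-01' AND DATE '2023-03-31'
    GROUP  BY job_site;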

Finally, one cannot discuss all this technology and data without a great session on cybersecurity. To that end, CONEXPO did not disappoint, with a great panel revealing that ransomware and distributed denial-of-service (DDoS) attacks are top of mind [13].

In conclusion, it was exciting to see how far and how fast the construction and agricultural industries are progressing toward digitally transformed job sites and enterprises. Unfortunately, transformation stories frequently could not communicate their business impact. If I had one piece of advice to share with the industry, it would be to leverage Six Sigma or some business process impact/improvement (BPI) assessment during transformations. These types of processes capture the financial advantage of the actual transformation. Despite the lack of BPI assessment, construction benefits from the constant turnover of construction equipment within companies. Thus, I would not be surprised to see the construction industry successfully transform faster than manufacturing.

The Next Conference

    CONEXPO-CON/AGG is held every three years. The next CONEXPO-CON/AGG and IFPE will occur March 3-7, 2026, in Las Vegas, NV. One may find more information here: The Future of Construction on Display: CONEXPO-CON/AGG Exhibitors Take the Industry.

    Citations

    1. https://en.wikipedia.org/wiki/Conexpo-Con/Agg
    2. https://abcnews.go.com/Health/coronavirus-cases-surpass-1000-us-tsa-agents-test/story?id=69525688
    3. https://youtu.be/IbvC4mMd-l8
    4. https://youtu.be/_jZawl8zD0I
    5. https://youtu.be/btAlIcG9rEw
    6. https://youtu.be/dzjPV2WrXY8
    7. https://youtu.be/t1BLnp0iNNg
    8. https://youtu.be/MG2R9badaW4
    9. https://youtu.be/_jZawl8zD0I
    10. https://youtu.be/Xgl0z1xQZGc
    11. https://youtu.be/oI4phMkytB0
    12. https://youtu.be/QbbkZl4u-04
    13. https://youtu.be/EFx-SqwNDMw

    ©2019-2023 TBW Advisors LLC. All rights reserved. TBW, Conference Whispers, Technical Business Whispers, Whisper Reports, Whisper Studies, Whisper Ranking and Fact-based Research and Advisory are trademarks or registered trademarks of TBW Advisors LLC. This publication may not be reproduced or distributed in any form without TBW’s prior written permission. It consists of the opinions of TBW’s research organization which should not be construed as statements of fact. While the information contained in this publication has been obtained from sources believed to be reliable, TBW disclaims all warranties as to the accuracy, completeness, or adequacy of such information. TBW does not provide legal or investment advice and its research should not be construed or used as such. Your access and use of this publication are governed by the TBW Usage Policy. TBW research is produced independently by its research organization without influence or input from a third party. For further information, see Fact-based research publications on our website for more details.

  • Whisper Report: Business Agility Requires Modern Data Management

    ABSTRACT

Business agility is a must during these pandemic times. Business agility requires data-driven decisions. Data-driven decision making requires data agility and the modernization of data management to enable business-led analytics. The most common, successful, and scalable data management modernizations involve data virtualization, iPaaS, and data hub technologies to provide a data layer.

  • Whisper Report: Six Data Management Optimizations You Need

    ABSTRACT

Digital transformation requires business and execution agility. Modern data management solutions frequently provide six types of optimizations. Pushdown optimization, remote function execution, and functional compensation enable full leveraging of the environment. Automation, such as auto ETL, no hard-coding, and the ability to leverage autoscaling without significant configuration, enables agility and simplifies even the most complex global hybrid environments.

  • Whisper Report: Six Advantages of Copy Data Management

    Online Research Summary

    ABSTRACT

    Copy data management solutions are being added to architectures to address the demands of CCPA and GDPR. Copy data management is a capability available as a stand-alone data management tool, as well as provided within data virtualization platforms. Advantages of copy data management discussed in this research include the ability to stop orphan data copies, eliminate extra storage costs and keep copies up to date. Advantages discussed also include the ability for copies to be kept equal to each other, simplified governance and accelerated development.

  • Whisper Studies: Data Virtualization

    Online Research Summary

    ABSTRACT

These Whisper Studies demonstrate two initial use cases that prompted organizations to add data virtualization to their architecture. The first study involves a North American-based railroad’s evolution to Precision Railroading, which required entirely new operational metrics. To develop the metrics, the team required a replica of production without affecting production. The railroad leveraged Teradata QueryGrid. The second study is from a consumer electronics manufacturer that wished to reduce its cost basis for data storage and computation. The original application did not require any changes. The manufacturer leveraged Gluent on Oracle and Hadoop to achieve the desired result.

    The Studies

Railroad Accesses On-Premise Data

    • Scenario: Production data in Teradata – needed parallel dev environment 
    • Start State:  Production data in Teradata Active Enterprise Data Warehouse
    • End State: Teradata QueryGrid provides data access to dev environment
    • Data Size: 5 terabytes
    • Set-up Time: Half day to install, 6 months to optimize tuning
    • Interview(s): August 2019

Consumer Products Manufacturer Lowers Data Management Costs

• Scenario: Application leverages Oracle production database
    • Start State: Oracle storage growing too fast, too expensive
    • End State:  Leveraged Gluent to evolve data and computations to Hadoop
    • Data Size: 31 terabytes of application data in Oracle Database
• Set-up Time: One-hour installation, two months to implement
• Interview(s): March 2020

    Key Takeaways

    • Data virtualization is frequently brought into environments to provide access to production data without disturbing the production environment.
    • Data virtualization is frequently brought into environments to reduce the cost structure of an environment. Data virtualization is also useful in bringing legacy applications into a modern data management solution.  
• Set-up for data virtualization is often quick; initial set-up frequently takes less than a day, while most organizations become fluent in tuning the environment within 2-6 months.

    Data Virtualization Use-Case Studies

These Whisper Studies center on two cases leveraging a data virtualization platform. In both scenarios, it is the organization’s first data virtualization use case or the one that caused them to add it to their architecture. The first study involves an organization that needed to update its operational models and related analytics; it needed to leverage production data to develop and confirm new metrics without disrupting production. The organization in the second Whisper Study wished to reduce the cost profile of its production and analytics without a major architecture change. Both leveraged data virtualization to address their needs.

    A North American Railroad’s Parallel Production Data Environment

    • Scenario: Production data in Teradata – needed parallel dev environment 
    • Start State:  Production data in Teradata Active Enterprise Data Warehouse  
    • End State: Teradata QueryGrid provides data access to Dev   
    • Data Size: 5 terabytes
    • Set-up Time: Half day to install, 6 months to optimize tuning
    • Interview(s): August 2019

A North American-based railroad company needed to move to new operational analytics as part of moving toward Precision Scheduled Railroading [1]. To accomplish this, the railroad wanted to evaluate the new operational metrics in development before updating production.

    Precision Railroading Background

    To evaluate the new metrics, the organization required a parallel environment to that of production. This parallel environment required some 30 tables and millions of rows of data – all without interrupting or burdening production. The data was primarily transportation data with some finance data mixed in.

    To accomplish this, development needed an exact copy of the large volumes of production data. The copy needed to be scheduled, properly updated based on all dependent processes, and the development copy needed to be complete. In addition, to compare the new set of operational metrics to the current operational models, target tables of production were also required in the parallel environment. 

    Note that as a railroad is land rich with significant bandwidth available along the lines, the railroad owns and operates two of its own data centers. This also allows the organization to control the highly sensitive data regarding its operations that affect multiple industries, since they ship raw ingredients across the continent. As such, their entire solution is considered on-premise.

    Teradata QueryGrid Solution

Because a majority of the data was in an on-premise Teradata Active Enterprise Data Warehouse, it was natural to reach out to Teradata, which provided Teradata QueryGrid [2], a data virtualization solution. Additional research detailing QueryGrid’s capabilities can be found in “Whisper Report: Six Data Engineering Capabilities Provided by Modern Data Virtualization Platforms.”

By leveraging QueryGrid, the railroad had a perfect replica of production without the concern of interfering with production. When using a data virtualization platform, the platform provides a view of the data suited to your needs. This view is independent of the original form of the data and may or may not involve an additional complete physical copy of the data. More importantly, the data virtualization technology is able to maintain an up-to-date view of the data, as depicted in Figure 1.

    The Set-Up

    To leverage Teradata’s QueryGrid, the following steps were required.

    Connect the source: As with all data layers, the specific sources to be used by the platform must be connected. When connecting the sources, the majority of the time was spent tracking down and setting up the permissions to connect.

Configure the views: Data virtualization platforms such as QueryGrid operate by providing data views. The second step was creating the data views required for the Precision Railroading project.

To protect production, only official DBAs within IT could create views leveraging QueryGrid; the team did not want production data to be wrongly exploited. No major problems were incurred by the project.
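
As a minimal, generic-SQL sketch of this pattern (the actual QueryGrid DDL differs, and all names here are hypothetical): the DBAs publish read-only views over the connected production source, and developers then query them as if the data were local.

    -- Hypothetical names; prod_link stands in for the connected
    -- production source exposed by the data virtualization layer.
    CREATE VIEW dev.train_movements AS
    SELECT train_id,
           station,
           arrive_ts,
           depart_ts
    FROM   prod_link.train_movements;  -- remote production table, read-only

    -- Developers query the view like any local table; the platform
    -- keeps the underlying data current.
    SELECT COUNT(*) FROM dev.train_movements;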

    Figure 1. Develop with Production Data without Affecting Production

    The Results

With the exact replica of production data and related current operational metrics, the railroad was able to perform a side-by-side comparison with the incoming Precision Railroading metrics. It was critical for the business to get comfortable with the impact of the new metrics before they became the official operating metrics for the company. Accuracy was critical, as the railroad’s operational metrics are publicly released. Note that formal data validation platforms were not used to compare the data; rather, SQL scripts were leveraged (see Whisper Report: Decision Integrity Requires Data Validation for related research).
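
A minimal sketch of what such comparison scripts can look like, with hypothetical table names: in standard SQL (MINUS in the Teradata dialect), rows returned by either of the two queries below indicate a discrepancy between the current metrics and the new ones.

    -- Current production metrics missing from the new calculation.
    SELECT * FROM prod.current_metrics
    EXCEPT
    SELECT * FROM dev.precision_metrics;

    -- New Precision Railroading metrics absent from production.
    SELECT * FROM dev.precision_metrics
    EXCEPT
    SELECT * FROM prod.current_metrics;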

The new corporate reporting metrics tracked events such as how long a train took to go from Point A to Point B, as well as how long the train stayed or stopped at each of the stations between the two points. Overall, there was an assortment of metrics that are part of Precision Railroading that they wanted to realize. As a result of the new operational insights, they found numerous opportunities to improve the process. For example, visibility was given to instances where certain customers required multiple attempts to successfully deliver a load. With the waste identified, the organization could now address issues that negatively impacted its efficiency.
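
Metrics of that shape reduce to straightforward SQL over an event table. The schema below is hypothetical and purely illustrative of the calculation, not the railroad’s actual scripts.

    -- Hypothetical event table: one row per train per station visit.
    -- Transit time from origin 'A' to destination 'B' for each train.
    SELECT train_id,
           MAX(CASE WHEN station = 'B' THEN arrive_ts END)
         - MIN(CASE WHEN station = 'A' THEN depart_ts END) AS transit_time
    FROM   dev.station_events
    GROUP  BY train_id;

    -- Dwell time at each intermediate stop.
    SELECT train_id,
           station,
           depart_ts - arrive_ts AS dwell_time
    FROM   dev.station_events
    WHERE  station NOT IN ('A', 'B');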

This project was the railroad’s first Teradata QueryGrid project. With this success under its belt, the next project will expand the business’s ability to be more involved in self-service.

    Consumer Electronics Manufacturer Reducing Cost Profile

• Scenario: Application Leverages Oracle Production Database
    • Start State: Oracle Storage Growing Too Fast, Too Expensive
    • End State:  Leveraged Gluent to seamlessly evolve older data to Hadoop
• Data Size: 31 terabytes of application data in Oracle Database
    • Set-up Time:  One hour to install, two months to implement
    • Interview(s): March 2020

    Background on Reducing the Cost Profile

The second study involves a large electronics manufacturer seeking to reduce its cost profile. The manufacturer has a large amount of sensor data coming from its machines (a single data set is 10 terabytes). The data regarding the machines is stored in an Oracle database, which worked well at first but could not maintain the cost profile desired by the organization; continuing to leverage the application meant an annual expense for the additional Oracle storage it required. The organization wished to reduce the cost profile without rewriting the application.

    The Gluent Solution

The consumer electronics manufacturer decided to leverage the Gluent data virtualization solution [3]. Gluent was installed on the Oracle server and Hadoop. The application simply connected to Oracle without any changes whatsoever. Behind the scenes, the data and the work on the data were now spread between Oracle and Hadoop, significantly reducing the cost structure and eliminating the need for the organization to expand its Oracle footprint. The fact that the data was spread between Oracle and Hadoop was invisible to the application and its users, as depicted in Figure 2.

    The Set-Up

In order to leverage Gluent, the following steps were required.

Install Gluent: Gluent is installed on all data sources, in this case Hadoop and Oracle. When Oracle or Hadoop is called today, users are actually using the Gluent code installed on the server. The work can now be seamlessly offloaded to Hadoop as needed and is cost optimized. The install took less than one hour. Once again, it is critical to have access passwords available, and permissions must be set correctly.

Use the migration tool: Gluent has a built-in migration tool the consumer manufacturer was able to leverage to handle the initial set-up. This automatically migrated some of the Oracle data to Hadoop while maintaining a single view of the data.

Query Tune: This is a continual effort that gets easier over time. When optimizations turn out not to be optimal, Gluent allows “hints,” which are methods one can design to optimize specific scenarios.

Figure 2. Gluent is Used to Extend Oracle Data to Hadoop, Reducing Cost

    The Results

The Oracle application still calls on and uses Oracle. Behind the scenes, Gluent installed on Oracle is able to leverage Hadoop for storage and compute power. The application itself did not require any changes, and the cost profile for data storage and computations is now reduced. The plan is not to change the Oracle application at all but, rather, to simply continue reducing the actual data and computations conducted by Oracle. Fortunately, this also moved this application group in line with other internal groups that are using big data solutions on Hadoop. The Hadoop environment is familiar to the other teams, and through Hadoop, thanks to Gluent, those users can now leverage the Oracle application and related data without Oracle skills. This capability is due to two functionalities that are common in data virtualization.

Remote Function Execution: The ability of data virtualization to parse a query and have portions of it executed on another, remote system. In this instance, one can access Oracle and have the query run on Hadoop; likewise, one can access Hadoop and have a query run on Oracle. Where a query runs is subject to configuration and constraints such as cost and time.

Functional Compensation: The ability to perform an Oracle operation, specifically SQL, on your Hadoop data, even though Hadoop does not support SQL queries natively.
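
From the application’s side, the two capabilities can look like the sketch below (all object names are hypothetical): the statement is ordinary Oracle SQL, while the virtualization layer decides where each part executes.

    -- If sensor_history has been offloaded, the scan and aggregation
    -- run on Hadoop (remote function execution) with SQL semantics
    -- preserved (functional compensation); machine_dim can stay in Oracle.
    SELECT d.machine_model,
           AVG(s.reading) AS avg_reading
    FROM   sensor_history s
    JOIN   machine_dim    d
           ON d.machine_id = s.machine_id
    GROUP  BY d.machine_model;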

    Together, these two capabilities enable the manufacturer to leverage their Oracle experts without retraining. This benefit is in addition to reducing their storage and computational costs.

    TBW Advisors Recommended Reading

    Whisper Report: Digital Transformation Requires Modern Data Engineering

    Whisper Report: Six Data Engineering Capabilities Provided by Modern Data Virtualization Platforms

    Whisper Report: Six Use Cases Enabled by Data Virtualization Platforms

    Whisper Ranking: Data Virtualization Platforms Q1 2020

    Whisper Report: ETL Not Agile? Here’s 3 Alternatives

    Whisper Report: Decision Integrity Requires Data Validation

    Citations

    1. https://en.wikipedia.org/wiki/Precision_railroading
    2. https://www.teradata.com/Products/Ecosystem-Management/IntelliSphere/QueryGrid
    3. https://gluent.com/

    Corporate Headquarters

    2884 Grand Helios Way

    Henderson, NV 89052

    ©2019-2020 TBW Advisors LLC. All rights reserved. TBW, Technical Business Whispers, Fact-based research and Advisory, Conference Whispers, Whisper Reports, Whisper Studies, Whisper Ranking are trademarks or registered trademarks of TBW Advisors LLC. This publication may not be reproduced or distributed in any form without TBW’s prior written permission. It consists of the opinions of TBW’s research organization which should not be construed as statements of fact. While the information contained in this publication has been obtained from sources believed to be reliable, TBW disclaims all warranties as to the accuracy, completeness or adequacy of such information. TBW does not provide legal or investment advice and its research should not be construed or used as such. Your access and use of this publication are governed by the TBW Usage Policy. TBW research is produced independently by its research organization without influence or input from a third party. For further information, see Fact-based research publications on our website for more details.

  • Whisper Report: Five Quantifiable Advantages of Data Virtualization

    Online Research Summary

    ABSTRACT

Data virtualization platforms offer numerous benefits to organizations that add them to their architecture. This research examines five quantifiable advantages experienced by enterprises that adopt data virtualization. Quantifiable advantages include user access to data, copy data management, centralized governance, increased agility, and reduced infrastructure costs. Data virtualization platforms provide measurable advantages to the digitally transformed and significantly contribute to the return on investment (ROI) realized by the architecture.

  • Whisper Ranking: Data Validation Platforms Q2 2020

    Online Research Summary

    ABSTRACT

    Digitally transformed organizations expect reliable, data-driven decisions. Data validation platforms are able to test data ingestions and transformations within an enterprise. Many data validation platforms can test structured data, big data, BI Tools and ERP Systems, as well as non-standard data types, be it flat files or streaming. Data validation platforms can conduct regression tests and monitor production data. Likewise, data validation platforms are being used to support six different and critically important use cases. This research evaluates and ranks the various modern data validation platforms according to their architectural capabilities and ability to successfully meet popular use cases.

  • Whisper Report: Six Use Cases Enabled by Data Validation Platforms

    Online Research Summary

    ABSTRACT

When selecting technologies for your data architecture, it is important to understand common use cases enabled by the technology. This research examines six use cases enabled by data validation and the architecture capabilities used to support each. To this end, we examine the validation of ingestion and transformations, the data migration use case, and the cloud update use case. The use cases for production monitoring, completeness of data sets, the ability to compare BI tools’ values, and data DevOps are also evaluated.

  • Conference Whispers: CONEXPO-CON/AGG & IFPE 2020

    Online Summary

    ABSTRACT

    North America’s largest construction trade show, CONEXPO-CON/AGG and IFPE are held together every three years in Las Vegas. There were endless examples of Industrial Intelligence of Things (IIoT) and edge computing. The conference featured 2.7 million net square feet of exhibits, over 2,300 exhibitors, 150 educational courses, and 130,000 attendees from 150 countries. The intelligence enabled by sensors continues to expand as does the visibility of the results. The digital transformation of the construction site is underway with multiple vendors offering supply chain and job site integrated views. The show closed one day early due to COVID-19.

  • Whisper Report: ETL Not Agile? Here’s 3 Alternatives

    Online Summary

    ABSTRACT

TDWI Las Vegas was an educational and strategy event for 425 attendees, including 66 international attendees from 15 countries. Four major educational tracks featured over 50 full-day and half-day sessions, as well as exams available for credit on related courses to become a Certified Business Intelligence Professional (CBIP). The educational tracks included Modern Data Management, Platform and Architecture, Data Strategy and Leadership, and Analytics and Business Intelligence. The Strategy Summit featured 14 sessions, including many case studies and a special session on Design Thinking. The exhibit hall featured 20 exhibitors and 6 vendor demonstrations and hosted lunches and cocktail hours.