OPTIMIZE PERFORMANCE FOR YOUR ENTIRE BIG DATA STACK

The 451 Take on Cloud-Native: Truly Transformative for Enterprise IT

Helping to shape the modern software development and IT operations paradigms, cloud-native represents a significant shift in enterprise IT. In this report, we define cloud-native and offer some perspective on why it matters and what it means for the industry.

Elements of Big Data APM Success

Pepperdata delivers proven big data APM products, operational experience, and deep expertise.

Request a trial to see firsthand how Pepperdata big data solutions can help you achieve big data performance success. Pepperdata’s proven APM solutions provide a 360° view of both your platform and your applications, with real-time tuning, recommendations, and alerting. See how Pepperdata big data performance solutions help you quickly pinpoint and resolve big data performance bottlenecks, and see for yourself why Pepperdata’s big data APM solutions are used to manage performance on over 30,000 Hadoop production nodes.

Request Trial

Resources

Cloudwick Collaborates with Pepperdata to Ensure SLAs and Performance are Maintained for AWS Migration Service

Pepperdata Provides Pre- and Post-Migration Workload Analysis, Application Performance Assessment and SLA Validation for Cloudwick AWS Migration Customers

San Francisco — Strata Data Conference (Booth 926) — March 27, 2019 — Pepperdata, the leader in big data Application Performance Management (APM), and Cloudwick, a leading provider of digital business services and solutions to the Global 1000, today announced a collaborative offering for enterprises migrating their big data to Amazon Web Services (AWS). Pepperdata provides Cloudwick with a baseline of on-premises performance, maps workloads to optimal static and on-demand instances, diagnoses any issues that arise during migration, and assesses performance after the move to ensure the same or better performance and SLAs.

“The biggest challenge for enterprises migrating big data to the cloud is ensuring SLAs are maintained without having to devote resources to entirely re-engineer applications,” said Ash Munshi, Pepperdata CEO. “Cloudwick and Pepperdata ensure workloads are migrated successfully by analyzing and establishing a metrics-based performance baseline.”

“Migrating to the cloud without looking at the performance data first is risky for organizations and if a migration is not done right, the complaints from lines of business are unavoidable,” said Mark Schreiber, General Manager for Cloudwick. “Without Pepperdata’s metrics and analysis before and after the migration, there is no way to prove performance levels are maintained in the cloud.”

For Cloudwick’s AWS Migration Services, Pepperdata is installed on customers’ existing, on-premises clusters — it takes under 30 minutes — and automatically collects over 350 real-time operational metrics from applications and infrastructure resources, including CPU, RAM, disk I/O, and network usage metrics on every job, task, user, host, workflow, and queue. These metrics are used to analyze performance and SLAs, accurately map workloads to appropriate AWS instances, and provide cost projections. Once the AWS migration is complete, the same operational metrics from the cloud are collected and analyzed to assess performance results and validate migration success.
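As a rough illustration only (not Pepperdata’s actual tooling), the Python sketch below shows how pre- and post-migration job runtimes might be compared to validate that SLAs still hold; the job names, runtimes, and 10% tolerance are hypothetical.

    from statistics import median

    def sla_maintained(baseline_runs, cloud_runs, tolerance=0.10):
        """Return True if each job's median cloud runtime stays within
        `tolerance` of its on-premises baseline."""
        for job, before in baseline_runs.items():
            after = cloud_runs.get(job)
            if after is None:
                continue  # job not yet migrated
            if median(after) > median(before) * (1 + tolerance):
                return False
        return True

    # Runtimes in seconds per job, before and after migration (made-up data)
    baseline = {"daily_etl": [3600, 3550, 3700], "fraud_model": [1800, 1750]}
    cloud = {"daily_etl": [3500, 3620, 3580], "fraud_model": [1900, 1880]}
    print(sla_maintained(baseline, cloud))  # True: both jobs within 10%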

To learn more, stop by the Pepperdata booth (926) at Strata Data Conference March 25-28 at Moscone West in San Francisco.


About Pepperdata
Pepperdata (https://pepperdata.com) is the leader in big data Application Performance Management (APM) solutions and services, solving application and infrastructure issues throughout the stack for developers and operations managers. The company partners with its customers to provide proven products, operational experience, and deep expertise to deliver predictable performance, empowered users, managed costs and managed growth for their big data investments, both on-premise and in the cloud. Leading companies like Comcast, Philips Wellcentive and NBC Universal depend on Pepperdata to deliver big data success.

Founded in 2012 and headquartered in Cupertino, California, Pepperdata has attracted executive and engineering talent from Yahoo, Google, Microsoft and Netflix. Pepperdata investors include Citi Ventures, Costanoa Ventures, Signia Venture Partners, Silicon Valley Data Capital and Wing Venture Capital, along with leading high-profile individual investors. For more information, visit www.pepperdata.com.

About Cloudwick

Cloudwick is the leading provider of digital business services and solutions to the Global 1000. Its solutions include data migration, business intelligence modernization, data science, cybersecurity, IoT and mobile application development and more, enabling data-driven enterprises to gain competitive advantage from big data, cloud computing and advanced analytics. Learn more at www.cloudwick.com.

###

Contact:
Samantha Leggat
samantha@pepperdata.com

Pepperdata and the Pepperdata logo are registered trademarks of Pepperdata, Inc. Other names may be trademarks of their respective owners.

March 27, 2019

Pepperdata Announces Free Big Data Cloud Migration Cost Assessment to Automatically Select Optimal Instance Types and Provide Accurate Cost Projections

Pepperdata Eliminates Guesswork and Complexity Associated with Identifying Best Candidate Workloads Down to Queue, Job and User Level, for Moving to AWS, Azure, Google Cloud or IBM Cloud

CUPERTINO, Calif. — March 6, 2019 — Pepperdata, the leader in big data Application Performance Management (APM), today announced its new Big Data Cloud Migration Cost Assessment for enterprises looking to migrate their big data workloads to AWS, Azure, Google Cloud or IBM Cloud. By analyzing current workloads and service level agreements, the detailed, metrics-based Assessment enables enterprises to make informed decisions, helping minimize risk while ensuring SLAs are maintained after cloud migration.

The Pepperdata Big Data Cloud Migration Cost Assessment provides organizations with an accurate understanding of their network, compute, and storage needs to run their big data applications in the hybrid cloud. Analyzing memory, CPU, and I/O every five seconds for every task, Pepperdata maps the on-premises workloads to optimal static and on-demand instances on AWS, Azure, Google Cloud, and IBM Cloud. Pepperdata also identifies how many of each instance type will be needed and calculates cloud CPU and memory costs to achieve the same performance and SLAs as the existing on-prem infrastructure.
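As a simplified illustration of the idea (not Pepperdata’s actual methodology), the sketch below maps a workload’s measured peak CPU and memory to a single instance type and projects a monthly cost; the instance names, specs, and prices are placeholders rather than real cloud pricing.

    import math

    INSTANCE_TYPES = [
        # (name, vCPUs, memory_GB, hourly_price_USD) -- illustrative values only
        ("general.large", 4, 16, 0.20),
        ("general.xlarge", 8, 32, 0.40),
        ("memory.2xlarge", 16, 128, 1.10),
    ]

    def project_cost(peak_vcpus, peak_mem_gb, hours_per_month=730):
        """Pick the cheapest single instance type that covers the workload's
        peak CPU and memory, and return (type, count, monthly_cost_usd)."""
        best = None
        for name, vcpu, mem, price in INSTANCE_TYPES:
            count = max(math.ceil(peak_vcpus / vcpu), math.ceil(peak_mem_gb / mem))
            monthly = count * price * hours_per_month
            if best is None or monthly < best[2]:
                best = (name, count, monthly)
        return best

    # Example: a workload that peaks at 24 vCPUs and 200 GB of memory
    print(project_cost(24, 200))  # ('memory.2xlarge', 2, 1606.0)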

“When enterprises consider a hybrid cloud strategy, they estimate the cost of moving entire clusters, but that’s not the best approach,” said Ash Munshi, Pepperdata CEO. “It’s far better to identify specific workloads that can be moved to take full advantage of the pricing and elasticity of the cloud. Pepperdata collects and analyzes detailed, granular resource metrics to accurately identify optimal workloads for cloud migration while maintaining SLAs.”

The Big Data Cloud Migration Cost Assessment enables enterprises to:

  • Automatically analyze every workload in your cluster to accurately determine its projected cloud cost
  • Get cost projections and instance recommendations for workloads, queues, jobs, and users
  • Map big data workloads to various instance types including static and on-demand
  • Compare AWS, Azure, Google Cloud, and IBM Cloud

Availability

Pepperdata Big Data Cloud Migration Cost Assessment is available free at pepperdata.com/free-big-data-cloud-migration-cost-assessment. Pepperdata customers should email support@pepperdata.com for their free assessment.


About Pepperdata
Pepperdata (https://www.pepperdata.com) is the leader in big data Application Performance Management (APM) solutions and services, solving application and infrastructure issues throughout the stack for developers and operations managers. The company partners with its customers to provide proven products, operational experience, and deep expertise to deliver predictable performance, empowered users, managed costs and managed growth for their big data investments, both on-premise and in the cloud. Leading companies like Comcast, Philips Wellcentive and NBC Universal depend on Pepperdata to deliver big data success.

Founded in 2012 and headquartered in Cupertino, California, Pepperdata has attracted executive and engineering talent from Yahoo, Google, Microsoft and Netflix. Pepperdata investors include Citi Ventures, Costanoa Ventures, Signia Venture Partners, Silicon Valley Data Capital and Wing Venture Capital, along with leading high-profile individual investors. For more information, visit www.pepperdata.com.

###

Contact:
Samantha Leggat

925-447-5300
samantha@pepperdata.com

Pepperdata and the Pepperdata logo are registered trademarks of Pepperdata, Inc. Other names may be trademarks of their respective owners.

March 5, 2019

Pepperdata Unveils 360° Reports, Enabling Enterprises to Make More Informed Operational Decisions to Maximize Capacity and Improve Application Performance

360° Reports Empower Executives to Better Understand Financial Impacts of Operational Decisions

CUPERTINO, Calif. — February 19, 2019 — Pepperdata, the leader in big data Application Performance Management (APM), today announced the availability of 360° Reports for Platform Spotlight. Pepperdata 360° Reports leverage the vast amount of proprietary data collected and correlated by Pepperdata to give executives capacity utilization insights so they better understand the financial impacts of operational decisions.

“Pepperdata 360° Reports demonstrate the power of data and the valuable insights Pepperdata provides, enabling enterprises to make more informed and effective operational decisions,” said Ash Munshi, Pepperdata CEO. “Operators get a better understanding of what and where they’re spending, where waste can be reclaimed, and where policy and resource adjustments can be made to save money, maximize capacity and improve application performance.”

360° Reports for Pepperdata Platform Spotlight include:

  • Capacity Optimizer Report: This gives operators insight into memory and money saved by leveraging Pepperdata Capacity Optimizer to dynamically recapture wasted capacity.
  • Application Waste Report: This report compares memory requested with actual memory utilization so operators can optimize resources by changing resource reservation parameters (a simplified illustration of this calculation appears after this list).
  • Application Type Report: This gives operators insight into the technologies used across the cluster and the percentage of each (percentage of Spark jobs, etc.). These insights into technology trends help executives make more data-driven investment decisions.
  • Default Container Size Report: This report identifies jobs using default container size and where any waste occurred so operators can make default container size adjustments to save money.
  • Pepperdata Usage Report: This presents Pepperdata dashboard usage data, highlighting top users, days used, and more to give operators insights to maximize their investment. With this data, operators can identify activities to grow the user base, such as promoting features, scheduling onboarding sessions, and training on custom alarms.
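For illustration, the sketch below shows the basic calculation behind a waste report of this kind: memory requested versus memory actually used per application. The application names, field names, and figures are hypothetical.

    def memory_waste(apps):
        """apps: list of dicts with 'name', 'requested_gb', and 'peak_used_gb'."""
        report = []
        for app in apps:
            wasted = max(app["requested_gb"] - app["peak_used_gb"], 0)
            report.append((app["name"], wasted, wasted / app["requested_gb"]))
        return sorted(report, key=lambda row: row[1], reverse=True)

    apps = [
        {"name": "nightly_etl", "requested_gb": 512, "peak_used_gb": 180},
        {"name": "ad_hoc_sql", "requested_gb": 64, "peak_used_gb": 60},
    ]
    for name, wasted, fraction in memory_waste(apps):
        print(f"{name}: {wasted} GB over-reserved ({fraction:.0%})")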

Availability

Pepperdata 360° Reports are available immediately for Pepperdata Platform Spotlight customers. For a free trial of Pepperdata, visit https://www.pepperdata.com/trial.

About Pepperdata
Pepperdata (https://pepperdata.com) is the leader in big data Application Performance Management (APM) solutions and services, solving application and infrastructure issues throughout the stack for developers and operations managers. The company partners with its customers to provide proven products, operational experience, and deep expertise to deliver predictable performance, empowered users, managed costs and managed growth for their big data investments, both on-premise and in the cloud. Leading companies like Comcast, Philips Wellcentive and NBC Universal depend on Pepperdata to deliver big data success.

Founded in 2012 and headquartered in Cupertino, California, Pepperdata has attracted executive and engineering talent from Yahoo, Google, Microsoft and Netflix. Pepperdata investors include Citi Ventures, Costanoa Ventures, Signia Venture Partners, Silicon Valley Data Capital and Wing Venture Capital, along with leading high-profile individual investors. For more information, visit www.pepperdata.com.

###

Contact:
Samantha Leggat
samantha@pepperdata.com

Pepperdata and the Pepperdata logo are registered trademarks of Pepperdata, Inc. Other names may be trademarks of their respective owners.

Sample report attached.

Sample Capacity Optimizer Report – memory and money saved with Capacity Optimizer

February 19, 2019

Why Financial Services Needs Big Data APM

Financial services organizations operate in a challenging environment. They belong to one of the most heavily regulated industries in the world and are a constant target of hackers and fraudsters. At the same time, their applications and services are essential components of the global economy. These systems must be highly available and performance-optimized while generating investor and shareholder returns.

The primary big data use case for financial services is business analytics that run on Hadoop.  Data-driven analytics are key to the current and future competitiveness of financial services companies.  By capturing and leveraging massive volumes of data, financial services companies are capitalizing on new data-driven business opportunities. But the highly regulated nature of the financial services sector and concerns around uptime and data security make managing these applications difficult.

Proactively monitoring the performance of your critical applications and services with a big data Application Performance Management (APM) solution can help you avoid operational nightmares and enable you to find and fix application and infrastructure issues before they impact your organization. Pepperdata Big Data APM products like Application Spotlight and Platform Spotlight monitor and optimize business intelligence applications that analyze customer data, manage thousands of concurrent queries, automate business processes, optimize risk controls and business outcomes, and ultimately improve customer experience and drive growth.

Optimizing Performance of BI Applications and Workloads – Seven Use Cases

Here are seven examples of financial services BI applications and workloads that Pepperdata big data APM solutions monitor and optimize for performance. Each of these delivers tangible business benefits to the organization.

  1. Predicting the risk of churn for individual customers and recommending proactive retention strategies to improve customer loyalty. Banks and card issuers can identify at-risk customers and respond quickly to retain them.
  2. Providing early warning predictions using liability analysis to identify potential exposures prior to default. This enables proactive engagement with customers to manage their liabilities and limit exposure.
  3. Predicting risk of loan delinquency and recommending proactive maintenance strategies by segmenting delinquent borrowers and identifying “self-cure” customers. With this insight, banks can better tailor collection strategies and improve on-time payment rates.
  4. Detecting financial crime such as fraud, money laundering, or counter-terrorism financing activities by identifying transaction anomalies or suspicious activities using transactional, customer, black-list, and geospatial data.
  5. Predicting operational demand based on historical data and future events. With this insight, banks can anticipate call center traffic volumes or predict demand for cash at ATMs.
  6. Evaluating customer credit risk by analyzing application and customer data for automated real-time credit decisions based on information such as age, income, address, guarantor, loan size, job experience, rating, and transaction history.
  7. Managing customer complaints using data from various interaction channels to understand why customers complain, identify dissatisfied customers, find the root causes of problems, and rapidly respond to affected customers.

The applications and workloads that the Pepperdata big data APM solutions optimize in these analytics and BI use cases provide the “source of truth” that ultimately underlies customer-facing, transactional use cases. For example, banks and card issuers now deploy chatbots that address customer needs and inquiries, walk customers through process steps, provide predictive messages and behavior insights, and automate tasks such as money transfers or balance inquiries. Over time, the behavioral data that chatbots collect is analyzed in the Hadoop cluster to further develop and refine appropriate replies to user requests.

Big Data APM Scalability for Massive Deployments

Pepperdata big data APM solutions provide the scalability that makes them the choice of the world’s largest financial services organizations, with some customers running in excess of 1,000 nodes in their distributed computing environment. Customers with high node counts face unique operational challenges, including extremely high numbers of concurrent queries. They cannot afford any service or data loss. To reduce risks associated with potential downtime and data loss, some organizations have established data centers with triple-redundancy cluster architectures.

Financial services organizations with such huge physical infrastructure investments naturally want to maximize their workloads and utilize their infrastructure as efficiently as possible. For these customers, Pepperdata big data APM solutions automatically optimize infrastructure capacity and application performance to provide:

  • 90% capacity utilization without manual application tuning
  • Up to 50% improvement in throughput that results in significant savings in infrastructure spend
  • 95% reduction in MTTR, with an average 5,200 hours per year saved on triage and troubleshooting time

Bridging the DevOps Communication Gap

Our financial services customers appreciate the ability of Pepperdata big data APM solutions to help bridge the communication gap that can exist between developers and IT operations, a gap that can negatively impact both application development and production workloads. Using Pepperdata Application Spotlight, customers can readily monitor an app as it transitions through the development cycle from pre-production to production. As the application evolves, issues like bottlenecks and CPU and memory mismatches can be quickly detected and resolved using Pepperdata Platform Spotlight and Capacity Optimizer to ensure optimal performance in the production environment. Better communication enables ITOps to help the application team work efficiently through these transitions. Together, these benefits optimize application performance and uptime and help ensure that SLAs are met.

We don’t need to explain the significance of ROI to IT operations leaders in the financial services industry. At a macro level, profitability is a function of stable, high-performing analytics, applications, and services that drive customer loyalty and retention. With an investment in big data APM solutions from Pepperdata, you can bulletproof your foundational analytics applications and workloads, avoiding application performance issues while increasing revenue and customer satisfaction.


July 16, 2019

How Pepperdata Big Data APM Delivers ROI by Controlling Cloud Costs

For most big data enterprises, application performance management (APM) is considered an essential element of application-centric IT operations and a DevOps-enabling bridge between production and development on one side, and IT and digital business on the other.  APM strives to detect and diagnose complex application performance problems to maintain an expected level of service, and in doing so, APM can reduce mean time to repair (MTTR), reduce IT maintenance and infrastructure costs, and improve business outcomes.

It’s been said that almost every business now is a software business in some form or another. That means that the reliability and performance of your software applications are critical to your success. From this perspective, APM solutions can deliver a significant return on investment (ROI) if used to their full potential. Strictly speaking, the ROI is the ratio between the net profit of an investment and what it cost to implement it. It is often expressed as a percentage, to represent how much profit was made compared to the costs.
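As a simple worked example of that definition (all figures below are made up):

    annual_gain = 400_000  # e.g., reclaimed capacity and avoided hardware spend
    annual_cost = 100_000  # e.g., licensing and operations
    roi_percent = (annual_gain - annual_cost) / annual_cost * 100
    print(f"ROI = {roi_percent:.0f}%")  # ROI = 300%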

ROI for IT Investments is Different

Traditionally, when IT professionals and executive management discussed the ROI of an IT investment, they were dealing primarily with hardware/infrastructure and mostly thinking of “financial” benefits. Financial benefits include impacts on the organization’s budget and finances, e.g., cost reductions or revenue increases.

With the rise of software-defined everything and cloud-based service offerings, business leaders and technologists also consider the “non-financial” benefits of IT investments, including impacts on operations or mission performance and results, e.g., improved customer satisfaction, better information, and shorter cycle times. These are the so-called “intangible,” “soft,” or “unquantifiable” benefits of information technology. Unlike financial returns, there may be no widely accepted metrics that can be applied. However, IT’s potential for producing positive impacts on business performance is undeniable. Both financial and non-financial benefits must be taken into account to fully assess the value of any technology solution, and APM is no different.

Enabling Big Data DevOps ROI with APM

Large enterprises typically run multi-tiered applications across a variety of systems and platforms. These can range from in-house systems to external clouds. With the accelerating use of cloud-based apps, the complexity of integrating these applications is a challenge for even the most sophisticated IT teams. Greater agility is the underlying business case for a DevOps approach. Leveraging increased automation, DevOps applies agile and lean practices throughout the software lifecycle. It allows IT to launch higher quality applications and deploy them faster than in the past. 

As more organizations discover the efficiencies of adopting DevOps best-practices for application lifecycle management, they quickly realize that APM enables DevOps ROI. Similarly, IT operations teams are recognizing the value of APM to manage expensive cluster resources more efficiently and to better inform DevOps teams who depend on reliable and consistent infrastructure availability and performance.

Align Your Compute Resources and Costs with Actual Service Demands

Pepperdata APM solutions are not only helpful for measuring the performance of your applications and identifying opportunities for improvement; they can also deliver more tangible “financial” ROI by reducing your infrastructure and hosting costs through analysis and optimization. Applications and IT infrastructure must work together. IT resources represent both capital and operational expenses, putting more pressure on IT organizations to optimize the use of existing resources and acquire new resources only when required.

Pepperdata Capacity Optimizer is a capacity management solution that aligns IT resources with service demands, optimizing resource utilization, and reducing costs. Capacity Optimizer leverages active resource management features in Hadoop to dynamically tune cluster resources and eliminate inefficiencies and bottlenecks. Running continuously, it improves the capacity utilization of your existing production clusters without manual tuning or intervention. Enterprise deployments typically achieve a 30-50% increase in throughput performance on existing hardware with Capacity Optimizer, enabling them to save thousands of dollars in unnecessary infrastructure and services expenditures.
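As a rough sketch of the underlying idea only (not Pepperdata’s implementation), the snippet below estimates how much allocated-but-unused container memory might, in principle, be reclaimable; the container figures are illustrative.

    containers = [
        # (allocated_GB, peak_used_GB) for each running container (made-up data)
        (8, 3), (8, 5), (16, 6), (4, 4), (8, 2),
    ]
    allocated = sum(a for a, _ in containers)
    used = sum(u for _, u in containers)
    headroom = allocated - used
    print(f"Allocated {allocated} GB, used {used} GB, "
          f"reclaimable headroom {headroom} GB ({headroom / allocated:.0%})")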

Only Pay for the Cloud Resources You Need

For organizations migrating to the cloud, Pepperdata Capacity Optimizer provides an even more compelling benefit. It’s easy to forgive IT Ops for over-provisioning on-premises compute resources to avoid an application bottleneck. On-prem resources represent a sunk cost that is already paid for, so the worst that can happen is an uptick in chargeback. Taking the same approach to cloud-based resources, however, yields a nasty surprise in the form of an unexpectedly high monthly bill from your cloud service provider, which charges you for every memory and CPU instance you’ve subscribed to, whether or not you actually use those resources.

Capacity Optimizer ensures that you are only using the compute resources, in the cloud and on-prem, that you actually need to achieve optimal application performance. When you assess your priorities for monitoring and managing your technology stack, remember that the only thing your customers see, and thus the only thing they care about, is the performance of the application they’re using. Whatever may be happening in your big data stack, the application is where the rubber meets the road.

Fine-tune your big data application environment and achieve tangible ROI with Pepperdata Capacity Optimizer: understand exactly what CPU and memory resources each application requested, needed, used, and wasted, and identify the true impact on your big data application performance.

July 10, 2019

AI – Separating Hype from Reality

Artificial intelligence (AI) makes it possible for machines to learn from experience, adjust to new inputs and perform human-like tasks. Most AI examples that you hear about today – from chess-playing computers to self-driving cars – rely heavily on deep learning and natural language processing. Using these technologies, computers can be trained to accomplish specific tasks by processing large amounts of data and recognizing patterns in the data. AI’s recent resurgence can be attributed to increased data volumes, advanced algorithms, and improvements in computing power and storage, but AI is not new.  The term artificial intelligence was coined in 1956 by John McCarthy.

Early AI research in the 1950s explored topics like problem-solving and symbolic methods. In the 1960s, the US Department of Defense took interest in this type of work and began training computers to mimic basic human reasoning. For example, the Defense Advanced Research Projects Agency (DARPA) completed street mapping projects in the 1970s. And DARPA produced intelligent personal assistants in 2003, long before Siri, Alexa or Cortana were household names.  These efforts paved the way for the automation and formal reasoning that we see in computers today, including decision support systems and smart search systems that can be designed to complement and augment human abilities.

Even as artificial intelligence has become the most disruptive class of technologies driving digital business forward, there is confusion about what it is, and what it can and cannot do—even among otherwise tech-savvy professionals. If you search the web, you’ll find as many definitions of AI as there are people who write them.  So let’s take a different approach and identify what AI can do in an applied environment.

The 6 Pillars of AI

  1. AI automates repetitive learning and discovery through data. But AI is different from hardware-driven, robotic automation. Instead of automating manual tasks, AI performs frequent, high-volume, computerized tasks reliably and without fatigue. For this type of automation, human inquiry is still essential to set up the system and ask the right questions.
  2. AI adds intelligence to existing products. In most cases, AI will not be sold as an individual application. Instead, products you already use will be improved with AI capabilities, much as Alexa was added as a feature to Amazon devices. Automation, conversational platforms, bots and smart machines can be combined with large amounts of data to improve many technologies.
  3. AI adapts through progressive learning algorithms to let the data do the programming. AI finds structure and regularities in data so that the algorithm acquires a skill: the algorithm becomes a classifier or a predictor (see the short sketch after this list). Just as an algorithm can teach itself how to play chess, it can also teach itself what product to recommend online, and adapt when given new data.
  4. AI analyzes more and deeper data using neural networks that have multiple hidden layers. Building a fraud detection system with five hidden layers was almost impossible a few years ago, but that has changed with incredible computer power and big data. Deep learning models require lots of data because they learn directly from the data. The more data you can feed them, the more accurate they become.
  5. AI achieves incredible accuracy through deep neural networks. For example, your interactions with Alexa, Google Search and Google Photos are all based on deep learning, and they become more accurate the more we use them. In the medical field, AI techniques such as deep learning, image classification and object recognition can now be used to find cancer on MRIs with the same accuracy as highly trained radiologists.
  6. AI extracts the most value out of data. When algorithms are self-learning, the data itself can become intellectual property. The answers are in the data; you just have to apply AI to get them out. Since the role of the data is now more important than ever before, it can create a competitive advantage. If you have the best data in a competitive industry, even if everyone is applying similar techniques, the best data will win.
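As a toy illustration of letting the data do the programming (see pillar 3 above), the sketch below classifies a transaction by its nearest labelled example rather than by hand-written rules; the data, features, and labels are entirely made up.

    def predict(examples, point):
        """Classify `point` with the label of its nearest example.
        examples: list of ((feature1, feature2), label) pairs."""
        def squared_distance(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(examples, key=lambda ex: squared_distance(ex[0], point))[1]

    # Labelled transactions: (amount_usd, hour_of_day) -> label
    examples = [((12, 14), "legitimate"), ((15, 10), "legitimate"),
                ((950, 3), "fraud"), ((870, 2), "fraud")]
    print(predict(examples, (900, 4)))  # fraud
    print(predict(examples, (20, 13)))  # legitimate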

Peak Hype for AI?

Every new technology goes through a hype cycle in which the news coverage is strongly positive at first, often gushing with the possibility of life-altering transformation. Even though AI is not new and has already been through previous hype cycles, the current cycle, which began in 2012, has been notable for the sheer volume of media coverage.

Gartner’s Hype Cycle tracks emerging information technologies in their journey towards mainstream adoption. It is designed to help companies tell hype from viable business opportunity, and give an idea when that value may be realized.

AI is at peak hype—and still an unknown quantity for many. Even as artificial intelligence is set to become the most disruptive class of technologies driving digital business forward during the next 10 years, there is confusion about what it is, and what it can and cannot do—even among otherwise tech-savvy professionals.

AI is at peak hype: “Democratized Artificial Intelligence” was recognized as one of the five megatrends in Gartner’s most recent (2018) Hype Cycle.  Machine learning and deep learning are at peak hype, and predicted to be 2-5 years away from mainstream adoption. Cognitive computing is also at peak hype, but up to 10 years away, while artificial general intelligence (AI with the ‘intelligence’ of an average human being) is 10+ years away and in early innovation phase.

Confusion and Unsubstantiated Vendor Claims

The Verge recently reported that many companies in Europe are taking advantage of the AI hype to make unsubstantiated claims in an effort to generate excitement and increase sales and revenue. According to a survey from London venture capital firm MMC, 40 percent of European startups classified as AI companies don’t actually use artificial intelligence in a way that is “material” to their businesses. MMC studied some 2,830 AI startups in 13 EU countries to reach its conclusion, reviewing the “activities, focus, and funding” of each firm in a comprehensive report published earlier this year. This situation is certainly not limited to EU-based vendors; it’s a global issue.

Tech Talks also identified similar misuse by companies claiming to use machine learning and advanced artificial intelligence to gather and examine data to enhance the user experience in their products and services. Many in the tech industry and the media also confuse what is truly AI with what is truly machine learning. We’ll take a closer look at that quandary in a future blog.


June 12, 2019