Optimizing data for strategic decision making in business
The rapid development of technology and broader access to digital services have driven a sharp rise in the amount of data we produce. Large institutions have realized that the value of data has changed and that information can shape decision-making, which has spurred many of these firms to transform into data-driven organizations.

Making this shift is not easy, however. According to an Accenture survey, only 32% of organizations have gained tangible value from their data, which means large gaps remain before companies can fully capitalize on the data they gather. One of these challenges is building a technology foundation that can process large volumes of information, as the data an organization collects is multistructured.

Google's answer to this need is Google Cloud. The cloud computing platform transforms data, enabling companies to extract optimized value from it through a single, intelligently enabled platform. Last October, at the 2021 Google Cloud Next event, the tech giant introduced several new developments and improvements to its services, including BigQuery and Vertex AI.
Both services were highlighted in the Innovate and Accelerate with Data Cloud panel, which discussed how organizations can simplify and unify data to optimize it and, in turn, increase enterprise productivity using Google Cloud solutions.

BigQuery is a data warehouse that provides solutions for Google Cloud services and can help companies manage and analyze data. With the service, companies don't have to think about infrastructure and can focus on data analysis to generate reliable insights. Three main improvements were announced for BigQuery: the interoperability feature, the BigQuery Migration Service and administration tooling (currently in preview), and real-time and predictive analytics.
The interoperability feature sets BigQuery apart from other data warehouses. It helps data teams analyze data faster by letting them seamlessly connect and exchange information with various types of services. For example, the platform offers the BigQuery Storage API, which provides a simpler architecture that requires less data movement, removing the need to keep multiple copies of the same information in different places. There is also BigQuery Omni, which extends BigQuery's fully managed analytics to other clouds such as Microsoft Azure and Amazon Web Services, allowing companies to run analyses without moving data from those platforms into Google Cloud.
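To make the workflow concrete, here is a minimal sketch of issuing an analytical query to BigQuery from Python. The project, dataset, table, and column names are hypothetical placeholders; only the query-building helper runs without cloud credentials, so the client call is shown as a comment.

```python
# Hedged sketch: building and running a BigQuery aggregation query.
# "my-project.sales.orders" and its columns are illustrative, not a real table.

def build_daily_revenue_query(table: str, day: str) -> str:
    """Build a simple per-region revenue aggregation for one day."""
    return (
        "SELECT region, SUM(amount) AS revenue\n"
        f"FROM `{table}`\n"
        f"WHERE order_date = DATE('{day}')\n"
        "GROUP BY region\n"
        "ORDER BY revenue DESC"
    )

sql = build_daily_revenue_query("my-project.sales.orders", "2021-10-12")

# With credentials configured, the google-cloud-bigquery client library
# would execute it roughly like this:
#
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   for row in client.query(sql).result():
#       print(row.region, row.revenue)
```

Because BigQuery is serverless, the client submits the SQL and receives results without the user provisioning or managing any infrastructure, which is the point the panel emphasized.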
The BigQuery Migration Service targets data engineers who are new to BigQuery. It helps users migrate data quickly, affordably, and with low risk, so they can modernize their data estate as needed. For those already using BigQuery, the BigQuery Administration features help users monitor resource usage in real time. According to Google, BigQuery's real-time analytics capabilities are continuously improved to help users get faster results. As for predictive analytics, Google Cloud's Explainable AI service can clarify how each feature contributes to a prediction and helps identify biases in the model.

Vertex AI is designed specifically for developers and brings Google's machine learning tools together on one platform. This makes experimentation and deployment faster while simplifying AI model management. Since its launch in May 2021, Vertex AI has seen adoption rise consistently: a threefold increase in total API requests, a 70% rise in total training volume, and a twofold increase in total prediction volume. Three features of the service were highlighted at the 2021 Google Cloud Next event: Vertex AI Metadata, Vertex AI Pipelines, and Vertex AI Neural Architecture Search (NAS).
Vertex AI Metadata makes it easier to audit and manage machine learning workflows under development by automatically tracking the inputs and outputs of every component.
When a machine learning model is in production, the input data can drift away from the data the model was trained on, degrading its performance. Vertex AI's Model Monitoring feature watches the model, alerts developers when it detects such drift, helps diagnose its cause, and can trigger retraining pipelines or the collection of fresh training data.
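The core idea behind drift monitoring can be sketched in a few lines of plain Python: compare a feature's distribution in live traffic against its training baseline and flag the model for retraining when the shift exceeds a threshold. Vertex AI Model Monitoring does this as a managed service with far more sophisticated statistics; the z-score statistic and threshold below are our own illustrative choices, not the product's algorithm.

```python
# Illustrative drift check: how far has the live mean of a feature moved
# from the training baseline, measured in baseline standard deviations?
from statistics import mean, stdev

def drift_score(baseline: list, live: list) -> float:
    """Shift of the live mean from the baseline mean, in baseline std units."""
    return abs(mean(live) - mean(baseline)) / stdev(baseline)

def needs_retraining(baseline: list, live: list, threshold: float = 3.0) -> bool:
    """Flag the model when the feature distribution has shifted too far."""
    return drift_score(baseline, live) > threshold

# Hypothetical feature values seen at training time vs. in production.
baseline = [10.0, 11.0, 9.5, 10.5, 10.2, 9.8]
stable   = [10.1, 10.4, 9.9]   # close to the baseline -> no alert
drifted  = [25.0, 26.0, 24.5]  # far from the baseline -> alert
```

A real monitoring service would compare full distributions (not just means) and stream alerts continuously, but the trigger logic follows the same pattern: measure divergence, compare to a threshold, kick off a retraining path.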
Vertex AI Pipelines, on the other hand, helps automate, monitor, and manage machine learning systems by orchestrating serverless workflows and storing workflow outputs, often called artifacts, using Vertex AI Metadata. This way, users can trace a workflow artifact's lineage, which may include the training data, hyperparameters, and code used to build the model.
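The lineage-tracking idea can be illustrated with a minimal sketch: each pipeline step records its inputs and outputs, so any artifact, such as a trained model, can later be traced back to everything that produced it. The class and names below are our own illustration of the concept, not the Vertex AI Metadata API.

```python
# Minimal artifact-lineage store in the spirit of Vertex AI Metadata.
# Each record is (step_name, input_artifacts, output_artifacts).

class LineageStore:
    def __init__(self):
        self.records = []

    def log_step(self, step: str, inputs: list, outputs: list) -> None:
        """Record one pipeline step with its input and output artifacts."""
        self.records.append((step, list(inputs), list(outputs)))

    def lineage(self, artifact: str) -> set:
        """Return every artifact that transitively fed into `artifact`."""
        ancestors = set()
        frontier = [artifact]
        while frontier:
            current = frontier.pop()
            for _, inputs, outputs in self.records:
                if current in outputs:
                    for inp in inputs:
                        if inp not in ancestors:
                            ancestors.add(inp)
                            frontier.append(inp)
        return ancestors

# A toy two-step pipeline: prepare data, then train a model.
store = LineageStore()
store.log_step("prepare", inputs=["raw_table"], outputs=["training_data"])
store.log_step("train", inputs=["training_data", "hyperparams"], outputs=["model"])
```

Asking the store for the lineage of `"model"` walks the records backwards and surfaces the training data, hyperparameters, and the raw table behind them, which is exactly the kind of audit question Vertex AI Metadata is meant to answer automatically.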
Meanwhile, Vertex AI NAS automatically searches for optimized neural network architectures, enabling machine learning teams to tackle complex tasks with higher accuracy, lower latency, and lower power requirements, without manual architecture design.
The recently launched Vertex AI Workbench, now in public preview, is another addition to the Vertex AI service. It is an AI development environment that helps data scientists move machine learning models into production faster. With it, data practitioners can analyze all their data in BigQuery from a single interface, letting them build and test machine learning models 5x faster than on a regular laptop.