Retail Store on Google Cloud Platform

1. What you will learn about GCP for ecommerce
2. Key stages and MLOps
3. High-level architecture
4. Tour of the retail application
5. Backend as a Service and the security model
6. Introduction to the project
7. Measuring interactions
8. Google Tag Manager setup
9. Tagging with Google Tag Manager
10. Tags relevant to CLV
11. Integration with services

Exposing services with Apigee

12. Services exposed as APIs
13. What are APIs?
14. Apigee
15. Creating your first API Proxy in Apigee
16. API administration with Apigee
17. Creating a developer portal
18. Interacting with the developer portal
19. Insights to Actions

Generating AI/ML models

20. Machine learning with structured data
21. BigQuery for forecasting and LTV models
22. BigQuery ML: hands on
23. AutoML vs. BigQuery ML
24. Considerations for training a model in BigQuery ML
25. Training the model in BigQuery ML
26. How to export models built in BQML
27. Exporting a model built with BQML

Consuming AI/ML services

28. Serverless compute and containers
29. What is Kubernetes?
30. Consuming ML models through the BigQuery API
31. Storing predictions
32. Running predictions and persistence
33. Continuous deployment with Cloud Run
34. Running a deployment with Cloud Run
35. Scaling services on Cloud Run
36. AuthN and AuthZ with Cloud Run

Google Marketing Platform

37. Analyzing the predictions
38. Segmenting our predictions
39. A practical case to define your activation strategy
40. Building our models on the platform
41. Segmenting our audiences in BigQuery
42. Upload your audiences and connect your activation channel


A practical case to define your activation strategy (Class 39/42)

How to structure Google Analytics data with SQL?

Let's dive into managing data with SQL, specifically Google Analytics data, to improve our marketing campaigns. This process is essential to better understand our customers and their behavior, and to build strategies on concrete data. Knowing how much time customers spend on our screens and how often they come back is a strong signal of their interest. Let's see how to structure this data efficiently.

In what ways can Google Analytics data be aggregated?

When we talk about Google Analytics data, it usually comes disaggregated and at a fairly granular level. Therefore, the first thing to do is to aggregate this information at the customer level, focusing on:

  • Time on each screen: measure how much time each user spends viewing a given screen.
  • Session frequency: count how many times each customer has interacted with the screen.

To process this information we use SQL. The basic SQL statement to aggregate this data would be:

SELECT
  user,
  SUM(time_on_screen) AS time_on_screen,
  SUM(unique_sessions) AS unique_screen_views
FROM dataset
GROUP BY user;

In this code, we replace "dataset" with the fully qualified name of our own table and group the results by user.
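For reference, if the raw data lives in a GA4 BigQuery export, the same aggregation can be sketched against the export's events_* tables. This is only an illustration: the dataset name, and the use of the engagement_time_msec and ga_session_id event parameters as proxies for time on screen and sessions, are assumptions rather than part of the lesson.

-- Sketch: per-user engagement from a GA4 BigQuery export (assumed schema and names).
SELECT
  user_pseudo_id AS user,
  SUM((SELECT value.int_value
       FROM UNNEST(event_params)
       WHERE key = 'engagement_time_msec')) / 1000 AS time_on_screen_seconds,
  COUNT(DISTINCT (SELECT value.int_value
                  FROM UNNEST(event_params)
                  WHERE key = 'ga_session_id')) AS unique_sessions
FROM `my_project.analytics_123456.events_*`  -- hypothetical export dataset
GROUP BY user_pseudo_id;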

How to integrate the CRM data?

The next step is to merge this information with that of our CRM to get a complete picture of each customer. This includes:

  • User ID: matched against the customer ID created in our company's systems.
  • Annual revenue: know how much revenue each customer has generated.
  • Loyalty programs: add the points program to the analysis.
  • Demographics: incorporate age and other relevant demographic data.

We implement the following SQL query to perform this matching:

SELECT
  user_id,
  account_number,
  annual_revenue,
  time_on_screen,
  unique_screen_views,
  loyalty_program,
  age
FROM your_crm_dataset;

Here we are joining the CRM attributes with the behavioral values we already aggregated in the previous step.
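The query above only lists the combined columns; the join itself is not shown in the lesson. A minimal sketch, assuming the aggregated Google Analytics table is called ga_per_user and the CRM table crm_customers (both names are hypothetical), could look like this:

-- Sketch: attach CRM attributes to the behavioral aggregates (table names assumed).
SELECT
  c.user_id,
  c.account_number,
  c.annual_revenue,
  g.time_on_screen,
  g.unique_screen_views,
  c.loyalty_program,
  c.age
FROM `my_project.crm.crm_customers` AS c
JOIN `my_project.analytics.ga_per_user` AS g
  ON g.user = c.user_id;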

What are the segmentation and modeling steps?

After structuring and combining the information, it is time to segment the data with a K-means model, which is well suited to clustering audiences. This model helps us identify segments such as:

  • VIP customers
  • New customers
  • Customers with high buying potential

The K-means process groups the data around a chosen number of centroids and can be started with the following SQL script:

CREATE MODEL `project.dataset.model_name`
OPTIONS(
  model_type = 'kmeans',
  num_clusters = 9,
  distance_type = 'euclidean'
) AS
SELECT variable_1, variable_2
FROM data;
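Once the model is trained, each customer can be assigned to a segment with ML.PREDICT. The model and table names below are the same placeholders used in the CREATE MODEL statement; for k-means models, ML.PREDICT returns the input columns plus, among other things, the centroid_id of the assigned cluster.

-- Sketch: label every row with the cluster it belongs to.
SELECT *
FROM ML.PREDICT(
  MODEL `project.dataset.model_name`,
  (SELECT variable_1, variable_2 FROM data)
);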

How to determine the right number of clusters?

It is not wise to run too many campaigns at the same time, so we need a manageable number of segments. The "elbow" method lets us optimize the number of clusters: we analyze the inertia curve against the number of clusters and look for the point where adding more clusters no longer substantially improves the compactness of the model.

When plotting the results of this technique, we look for the point on the graph where the slope begins to flatten; that is the optimal number of clusters. This analysis also makes it easier to agree on a workable number of segments with the marketing team.
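BigQuery ML does not draw the elbow curve for us, but one way to approximate it, assuming we train several models that differ only in num_clusters, is to compare their metrics with ML.EVALUATE; for k-means models it returns the Davies-Bouldin index and the mean squared distance to the centroids. The model names below are hypothetical.

-- Sketch: compare compactness metrics across models trained with different k.
SELECT 'k = 4' AS model, * FROM ML.EVALUATE(MODEL `project.dataset.segments_k4`)
UNION ALL
SELECT 'k = 6', * FROM ML.EVALUATE(MODEL `project.dataset.segments_k6`)
UNION ALL
SELECT 'k = 9', * FROM ML.EVALUATE(MODEL `project.dataset.segments_k9`);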

Conclusion

This process of structuring and segmenting the data, together with a sound analysis of the number of clusters, effectively optimizes marketing campaigns. The importance of a data-driven approach to business decision making cannot be overstated. Try these techniques and get ready to enhance your customer interactions. Keep learning and improving!

Contributions

Great that the elbow method was explained.

Excellent, now I am really understanding this part better...