Google Releases KerasRS To Build And Train Recommendation Systems


Google recently announced the release of Keras Recommenders (KerasRS), a new library that aims to make cutting-edge recommendation methods easily accessible to developers and to help them build accurate and efficient recommender systems.

Today’s digital experiences rely heavily on recommendation algorithms, which fuel everything from in-game advertisements to social media feeds and movie recommendations. Delivering individualised experiences is more crucial than ever as Artificial Intelligence continues to advance, and recommender systems are essential to producing high-caliber digital experiences.

Built on Keras 3, which was designed for human developers with an emphasis on maintainability, code elegance, and debugging speed, Keras Recommenders offers a set of building blocks covering the whole recommender system development process. These blocks provide APIs with components made specifically for retrieval and ranking tasks. Google itself uses KerasRS to power the Google Play feed.

A major benefit of Keras Recommenders is its native support for Keras 3’s multi-backend approach, which enables it to work seamlessly with TensorFlow, JAX, or PyTorch. Models built with KerasRS can be trained and serialised in one framework and then reused in another without requiring expensive migrations.

Keras Recommenders is simple to get started with. Developers can install the library using pip:

pip install keras-rs

Following installation, the KERAS_BACKEND environment variable must be set before importing any Keras libraries in order to specify the preferred backend (JAX, TensorFlow, or PyTorch):

import os
os.environ["KERAS_BACKEND"] = "jax"

For recommender tasks, the library offers dedicated layers, losses, and metrics. For example, keras_rs.layers.BruteForceRetrieval finds candidate recommendations in a retrieval architecture, keras_rs.losses.PairwiseHingeLoss() can be used to rank particular items, and keras_rs.metrics.NDCG() assesses ranking performance.
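To make the retrieval idea above concrete, here is a conceptual sketch in plain NumPy of what a brute-force retrieval layer computes: score every candidate embedding against the query embedding and keep the top-k. This is an illustration of the underlying computation, not the KerasRS API itself; the function name and dot-product scoring are assumptions for demonstration.

```python
import numpy as np

# Conceptual sketch of brute-force retrieval: score all candidates
# against the query and return the k highest-scoring ones.
def brute_force_top_k(query_embedding, candidate_embeddings, k=3):
    # Dot-product similarity between the query and every candidate.
    scores = candidate_embeddings @ query_embedding
    # Indices of the k best candidates, highest score first.
    top_k = np.argsort(scores)[::-1][:k]
    return top_k, scores[top_k]

rng = np.random.default_rng(0)
candidates = rng.normal(size=(100, 8))  # 100 candidate items, 8-dim embeddings
query = rng.normal(size=8)              # one query embedding
ids, scores = brute_force_top_k(query, candidates, k=3)
```

Brute force is exact but scans every candidate; for very large catalogues, approximate nearest-neighbour methods are typically substituted at serving time.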

The seamless integration of Keras Recommenders with the conventional Keras APIs is a significant advantage: developers configure their model with model.compile() and set up the training loop with model.fit().

The documentation includes code examples that show how to utilise the KerasRS components. A retrieval layer and query and candidate models may be included in the framework of a retrieval model:

class SequentialRetrievalModel(keras.Model):
    def __init__(self):
        super().__init__()
        self.query_model = keras.Sequential([ ... ])
        self.candidate_model = keras.layers.Embedding(...)
        self.retrieval = keras_rs.layers.BruteForceRetrieval(k=10)
        self.loss_fn = keras.losses.CategoricalCrossentropy(from_logits=True)

    def call(self, inputs):
        query_embeddings = self.query_model(inputs)
        predictions = self.retrieval(query_embeddings)
        return {"query_embeddings": query_embeddings, "predictions": predictions}

Using KerasRS components, such a model may be compiled as follows:

model.compile(
    loss=keras_rs.losses.PairwiseHingeLoss(),
    metrics=[keras_rs.metrics.NDCG(k=8, name="ndcg")],
    optimizer=keras.optimizers.Adagrad(learning_rate=3e-4),
)
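For intuition about the NDCG metric passed to compile() above, here is a conceptual NumPy sketch of NDCG@k: the discounted cumulative gain of the predicted ordering, normalised by the gain of the ideal ordering. This uses the common exponential-gain formulation; the exact formula used by keras_rs.metrics.NDCG may differ, so treat this as an assumption for illustration.

```python
import numpy as np

# Conceptual sketch of NDCG@k. Input: relevance labels listed in the
# order the model ranked the items.
def ndcg_at_k(relevance_in_predicted_order, k):
    rel = np.asarray(relevance_in_predicted_order, dtype=float)[:k]
    # Position discounts: 1/log2(rank + 1) for ranks 1..k.
    discounts = 1.0 / np.log2(np.arange(2, rel.size + 2))
    dcg = np.sum((2.0 ** rel - 1.0) * discounts)
    # Ideal DCG: the same items sorted by true relevance.
    ideal = np.sort(np.asarray(relevance_in_predicted_order, dtype=float))[::-1][:k]
    idcg = np.sum((2.0 ** ideal - 1.0) * discounts[:ideal.size])
    return dcg / idcg if idcg > 0 else 0.0

perfect = ndcg_at_k([3, 2, 1, 0], k=4)    # ideal ordering scores 1.0
imperfect = ndcg_at_k([0, 1, 2, 3], k=4)  # reversed ordering scores lower
```

Because the discount shrinks with rank, NDCG rewards placing the most relevant items near the top, which is exactly what a ranking loss like PairwiseHingeLoss optimises for.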

Training uses the familiar Keras fit() method:

model.fit(train_ds, validation_data=val_ds, epochs=5)

Another example demonstrates KerasRS’s FeatureCross layer:

import keras_rs

# ... define inputs and embedding layer x0
x1 = keras_rs.layers.FeatureCross()(x0, x0)
x2 = keras_rs.layers.FeatureCross()(x0, x1)
output = keras.layers.Dense(units=10)(x2)
model = keras.Model(inputs, output)

This model is then compiled and fitted using standard Keras methods:

model.compile(
    loss=keras.losses.MeanSquaredError(),
    optimizer=keras.optimizers.Adam(learning_rate=3e-4),
)
model.fit(x, y=y)
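For intuition about what the stacked cross layers above compute, here is a conceptual NumPy sketch of the cross operation from the Deep and Cross Network family: x_next = x0 * (W @ x + b) + x, where each cross multiplies the current features against the original input to capture higher-degree feature interactions. The function name and the shared W and b are illustrative assumptions (a real DCN learns separate weights per layer); this is not the FeatureCross API itself.

```python
import numpy as np

# Conceptual sketch of one DCN-style cross operation:
# element-wise product of the original input with a learned
# linear map of the current features, plus a residual connection.
def feature_cross(x0, x, W, b):
    return x0 * (W @ x + b) + x

rng = np.random.default_rng(42)
dim = 4
x0 = rng.normal(size=dim)        # original input features
W = rng.normal(size=(dim, dim))  # stand-in for a learned weight matrix
b = np.zeros(dim)                # stand-in for a learned bias
x1 = feature_cross(x0, x0, W, b)  # captures degree-2 interactions
x2 = feature_cross(x0, x1, W, b)  # captures up to degree-3 interactions
```

Each additional cross raises the maximum polynomial degree of feature interactions by one, which is why the example in the article stacks two FeatureCross layers before the final Dense layer.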

As part of the library’s future plans, a keras_rs.layers.DistributedEmbedding class will be released that makes use of SparseCore chips on TPUs for large-scale distributed embedding lookups. Popular model implementations will also be introduced on a regular basis to further facilitate the development of cutting-edge systems.

Comprehensive guides, examples, and documentation are available at keras.io/keras_rs. These resources include introductory examples for classic models such as the Deep and Cross Network (DCN) and two-tower embedding models, as well as more advanced tutorials such as SASRec. Developers are also welcome to view the code and contribute via the GitHub repository.

Although backwards compatibility guarantees are the goal, Keras Recommenders is presently under pre-release (0.y.z) development, which means its APIs are not yet regarded as stable. The library works on macOS, Windows, and Unix and requires Python >= 3.9. Since TensorFlow packages Keras 3 by default starting with version 2.16, TensorFlow 2.16 or later is advised.
By emphasising user-friendliness and robust building blocks, Keras Recommenders seeks to enable developers to create cutting-edge recommender systems rapidly, possibly in as little as ten minutes.


Drakshi
Since June 2023, Drakshi has been writing articles on Artificial Intelligence for govindhtech. She holds a postgraduate degree in business administration and is an enthusiast of Artificial Intelligence.