
Commit 862fdae

Author: DvirDukhan
Merge pull request #850 from RedisAI/update_documentation
Update documentation_docs
2 parents d72af70 + ee00003 commit 862fdae

File tree

9 files changed: +311 −262 lines

README.md

Lines changed: 65 additions & 90 deletions
[![Dockerhub](https://img.shields.io/badge/dockerhub-redislabs%2Fredisai-blue)](https://hub.docker.com/r/redislabs/redisai/tags/)
[![codecov](https://codecov.io/gh/RedisAI/RedisAI/branch/master/graph/badge.svg)](https://codecov.io/gh/RedisAI/RedisAI)
[![Total alerts](https://img.shields.io/lgtm/alerts/g/RedisAI/RedisAI.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/RedisAI/RedisAI/alerts/)
[![Forum](https://img.shields.io/badge/Forum-RedisAI-blue)](https://forum.redislabs.com/c/modules/redisai)
[![Discord](https://img.shields.io/discord/697882427875393627?style=flat-square)](https://discord.gg/rTQm7UZ)

# RedisAI
RedisAI is a Redis module for executing Deep Learning/Machine Learning models and managing their data. It is intended to be a "workhorse" for model serving, providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. **RedisAI both maximizes computation throughput and reduces latency by adhering to the principle of data locality**, and it simplifies the deployment and serving of graphs by leveraging Redis' production-proven infrastructure.

To read the RedisAI docs, visit [redisai.io](https://oss.redis.com/redisai/). To see RedisAI in action, visit the [demos page](https://oss.redis.com/redisai/examples/).

# Quickstart
RedisAI is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies.

The following sections describe how to get started with RedisAI.

## Docker
The quickest way to try RedisAI is by launching one of its official Docker container images.

### On a CPU only machine

```sh
docker run -p 6379:6379 redislabs/redisai:latest-cpu-x64-bionic
```

### On a GPU machine
For GPU support you will need a machine that has an Nvidia driver (CUDA 11.2 and cuDNN 8.1), nvidia-container-toolkit and Docker 19.03+ installed. For detailed information, check out the [nvidia-docker documentation](https://github.com/NVIDIA/nvidia-docker).

```sh
docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:latest-gpu-x64-bionic
```

## Building
You can compile and build the module from its source code. The [Developer](https://oss.redis.com/redisai/developer/) page has more information about the design and implementation of the RedisAI module and how to contribute.

### Prerequisites
* Packages: git, python3, make, wget, g++/clang, and unzip
* CMake 3.0 or higher
* CUDA 11.2 and cuDNN 8.1 or higher, if GPU support is required
* Redis v6.0.0 or greater

### Get the Source Code
You can obtain the module's source code by cloning the project's repository using git like so:

```sh
git clone --recursive https://github.com/RedisAI/RedisAI
```

Switch to the project's directory with:

```sh
cd RedisAI
```

### Building the Dependencies
Use the following script to download and build the libraries of the various RedisAI backends (TensorFlow, PyTorch, ONNXRuntime) for CPU only:

```sh
bash get_deps.sh
```

Alternatively, you can run the following to fetch the backends with GPU support:

```sh
bash get_deps.sh gpu
```

### Building the Module
Once the dependencies have been built, you can build the RedisAI module with:

```sh
make -C opt clean ALL=1
make -C opt
```

Alternatively, run the following to build RedisAI with GPU support:

```sh
make -C opt clean ALL=1
make -C opt GPU=1
```

### Backend Dependency

RedisAI currently supports PyTorch (libtorch), TensorFlow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. The table below shows the version map between RedisAI and its supported backends. This is extremely important, since the serialization mechanism of one version might not match another's. To make sure your model will work with a given RedisAI version, check the backend's documentation for incompatible features between the version your model was exported with and the version RedisAI is built with.

| RedisAI | PyTorch | TensorFlow | TFLite | ONNXRuntime |
|:--------|:-------:|:----------:|:------:|:-----------:|
| 1.0.3   | 1.5.0   | 1.15.0     | 2.0.0  | 1.2.0       |
| 1.2.5   | 1.9.0   | 2.6.0      | 2.0.0  | 1.9.0       |
| master  | 1.9.0   | 2.6.0      | 2.0.0  | 1.9.0       |

Note: Keras and TensorFlow 2.x are supported through graph freezing. See [this script](https://github.com/RedisAI/RedisAI/blob/master/tests/flow/test_data/tf2-minimal.py) for how to export a frozen graph from Keras and TensorFlow 2.x.
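For scripted environment checks, the version map above can be captured as a small lookup table. This is only a sketch: the dictionary and helper function are illustrative and not part of RedisAI; the version strings are copied from the table.

```python
# Version map copied from the compatibility table (illustrative helper, not part of RedisAI).
BACKEND_VERSIONS = {
    "1.0.3":  {"PyTorch": "1.5.0", "TensorFlow": "1.15.0", "TFLite": "2.0.0", "ONNXRuntime": "1.2.0"},
    "1.2.5":  {"PyTorch": "1.9.0", "TensorFlow": "2.6.0",  "TFLite": "2.0.0", "ONNXRuntime": "1.9.0"},
    "master": {"PyTorch": "1.9.0", "TensorFlow": "2.6.0",  "TFLite": "2.0.0", "ONNXRuntime": "1.9.0"},
}

def required_backend_version(redisai_version: str, backend: str) -> str:
    """Return the backend version a given RedisAI release is built against."""
    return BACKEND_VERSIONS[redisai_version][backend]

print(required_backend_version("1.2.5", "ONNXRuntime"))  # prints 1.9.0
```

A model exported with a newer backend than the one listed for your RedisAI release may fail to deserialize, which is exactly the mismatch this check guards against.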

## Loading the Module
To load the module upon starting the Redis server, use the `--loadmodule` command line switch, the `loadmodule` configuration directive, or the [Redis `MODULE LOAD` command](https://redis.io/commands/module-load) with the path to the module's library.

For example, to load the module from the project's path with a server command line switch, use the following:

```sh
redis-server --loadmodule ./install-cpu/redisai.so
```
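The same can be done persistently with the `loadmodule` configuration directive; a minimal `redis.conf` fragment might look as follows (the path shown is an assumption, adjust it to wherever the module was built or installed):

```
# redis.conf — load RedisAI at server startup (path is illustrative)
loadmodule /path/to/RedisAI/install-cpu/redisai.so
```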

### Give it a try

Once loaded, you can interact with RedisAI using redis-cli. Basic information and examples for using the module are described [here](https://oss.redis.com/redisai/intro/#getting-started).
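As a taste of the API, RedisAI tensors can be set from `VALUES` or from a raw binary `BLOB`. The sketch below packs two float32 values into the layout a `FLOAT` tensor blob expects (contiguous raw float32 values); the key name and shape in the comments are illustrative, and native byte order is assumed.

```python
import struct

# Two float32 values for a FLOAT tensor of shape [2].
values = [2.0, 3.0]

# RedisAI expects the BLOB argument to be the tensor's raw binary data:
# contiguous float32 values (native byte order is assumed here).
blob = struct.pack("2f", *values)

# The blob can then be sent with any Redis client, conceptually equivalent to:
#   AI.TENSORSET mytensor FLOAT 2 BLOB <blob>
# (the key name "mytensor" is illustrative)
assert len(blob) == 2 * 4  # two 4-byte floats

# Round-trip check: unpacking recovers the original values.
assert list(struct.unpack("2f", blob)) == values
```

Using `BLOB` instead of `VALUES` avoids string conversion on both ends and is the usual choice for large tensors.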

### Client libraries
Some languages already have client libraries that provide support for RedisAI's commands. The following table lists the known ones:

| Project | Language | License | Author | URL |
| ------- | -------- | ------- | ------ | --- |
The full documentation for RedisAI's API can be found at the [Commands page](commands.md).

## Documentation
Read the docs at [redisai.io](https://oss.redis.com/redisai/).
## Contact Us
If you have questions, want to provide feedback, or want to report an issue or [contribute some code](contrib.md), here's where we're listening:

* [Forum](https://forum.redis.com/c/modules/redisai)
* [Repository](https://github.com/RedisAI/RedisAI/issues)

## License
RedisAI is licensed under the [Redis Source Available License Agreement](https://github.com/RedisAI/RedisAI/blob/master/LICENSE).
