October 2025

TensorFlow Lite

https://tensorflow.google.cn/lite/guide/build_cmake?hl=zh-cn

Download the TensorFlow source

cd ~
mkdir tflite-micro
cd tflite-micro

git clone https://github.com/tensorflow/tensorflow

cd tensorflow

Install dependencies

  • Ubuntu
sudo apt-get update
sudo apt-get install git build-essential cmake python3 python3-pip
  • macOS
brew install git cmake python

Download & install Bazel

  • Install Bazelisk (manages Bazel versions automatically)
sudo curl -L https://github.com/bazelbuild/bazelisk/releases/download/v1.17.0/bazelisk-linux-amd64 -o /usr/local/bin/bazel
sudo chmod +x /usr/local/bin/bazel
  • [Recommended] Manage the version manually

First enter the tensorflow directory and check the pinned version with cat .bazelversion:

7.4.1

If, for example, the file pins 7.4.1, open the Bazel releases page at https://github.com/bazelbuild/bazel/releases and find the matching binary, e.g.:

  • macOS

https://github.com/bazelbuild/bazel/releases/download/7.4.1/bazel-7.4.1-darwin-arm64

  • Linux

https://github.com/bazelbuild/bazel/releases/download/7.4.1/bazel-7.4.1-linux-x86_64
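With the toolchain in place, the CMake guide linked above builds TensorFlow Lite out of tree. A minimal sketch, assuming the source was cloned to ~/tflite-micro/tensorflow as above:

cd ~/tflite-micro
mkdir tflite_build
cd tflite_build
cmake ../tensorflow/tensorflow/lite   # point CMake at the TFLite source directory
cmake --build . -j                    # parallel build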


tflite-micro

1. Download tflite-micro

git clone https://github.com/tensorflow/tflite-micro
cd tflite-micro

2. Download the matching version of Bazel

cat .bazelversion

# result like 7.0.0

Go to https://github.com/bazelbuild/bazel/releases and download the matching Bazel binary:

mkdir -p ~/tools/bazel/7.0.0
cd ~/tools/bazel/7.0.0

wget https://github.com/bazelbuild/bazel/releases/download/7.0.0/bazel-7.0.0-linux-x86_64

chmod +x bazel-7.0.0-linux-x86_64
# create link
ln -s bazel-7.0.0-linux-x86_64 bazel
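With this layout, several Bazel versions can live side by side under ~/tools/bazel, and switching is just a matter of pointing PATH at a different directory. For example, to also keep the 7.4.1 build needed by the tensorflow checkout above:

mkdir -p ~/tools/bazel/7.4.1
cd ~/tools/bazel/7.4.1
wget https://github.com/bazelbuild/bazel/releases/download/7.4.1/bazel-7.4.1-linux-x86_64
chmod +x bazel-7.4.1-linux-x86_64
ln -s bazel-7.4.1-linux-x86_64 bazel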

Set the environment variable:

vi ~/.bashrc

or

vi ~/.zshrc # macOS

and append:

# bazel_env_start
export PATH="$HOME/tools/bazel/7.0.0:$PATH"
# bazel_env_end

Reload the rc file:

# bash
source ~/.bashrc

# zsh
source ~/.zshrc

The environment setup is now done.

Check the bazel version:

bazel --version

# the terminal shows
bazel 7.0.0
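As a quick smoke test, you can build the micro tree from inside the tflite-micro checkout with the pinned Bazel; a sketch, assuming the default host toolchain works on your platform (target patterns may need adjusting):

bazel build //tensorflow/lite/micro/...   # build all targets under the micro tree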

Spark

Spark is a distributed data management and processing system. It grew out of Google's MapReduce model and the open-source Hadoop ecosystem, and extends them in many ways, such as managing data in memory, which greatly improves response times and makes it a better fit for modern distributed data scenarios like interactive big-data analytics, artificial intelligence, and spatial data processing.

Below are brief steps to set up a local Spark instance for testing and personal use:

macOS

brew install apache-spark

After installation, Spark lives under /opt/homebrew/Cellar/apache-spark/$version; for example, 4.0.1 is at /opt/homebrew/Cellar/apache-spark/4.0.1.

Export the path into the environment: edit ~/.zshrc (vi ~/.zshrc); if you are not using zsh, edit your shell's rc file instead.

Append at the end:

# spark_env_start
export SPARK_DIR="/opt/homebrew/Cellar/apache-spark/4.0.1"
export PATH="$SPARK_DIR/bin:$SPARK_DIR/libexec/sbin:$PATH"
# spark_env_end

After editing, reopen the terminal or run source ~/.zshrc to complete the installation and environment setup.
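A quick way to verify the install is to print the version and open a local shell:

spark-submit --version
spark-shell   # starts a local Scala REPL; :quit to exit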

Ubuntu

Install the JDK, ...

Scala 3.x is recommended, paired with JDK 11; if you install Scala 2.x instead, you need JDK 8 (jdk@8).

sudo apt install default-jdk

Install Scala

sudo apt install scala -y
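A quick sanity check that both toolchains are on the PATH:

java -version    # should report the JDK just installed, e.g. an 11.x or later build
scala -version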

Install Spark

First open the Spark download page at https://dlcdn.apache.org/spark/

Pick a version, e.g. 3.5.7.

Copy the corresponding download link, e.g. https://dlcdn.apache.org/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz

Create a directory, e.g. ~/spark:

mkdir -p ~/spark
cd ~/spark

Download:

wget https://dlcdn.apache.org/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz

Extract it and enter the directory:

tar -xf spark-3.5.7-bin-hadoop3.tgz -C ./
cd spark-3.5.7-bin-hadoop3
ll # check that the listing contains bin and sbin

total 96
drwxr-xr-x 1 shezw shezw   170 Sep 18 04:52 ./
drwxr-xr-x 1 shezw shezw   100 Oct  5 15:27 ../
-rw-r--r-- 1 shezw shezw 22916 Sep 18 04:52 LICENSE
-rw-r--r-- 1 shezw shezw 57842 Sep 18 04:52 NOTICE
drwxr-xr-x 1 shezw shezw     6 Sep 18 04:52 R/
-rw-r--r-- 1 shezw shezw  4605 Sep 18 04:52 README.md
-rw-r--r-- 1 shezw shezw   166 Sep 18 04:52 RELEASE
drwxr-xr-x 1 shezw shezw   748 Sep 18 04:52 bin/
drwxr-xr-x 1 shezw shezw   288 Sep 18 04:52 conf/
drwxr-xr-x 1 shezw shezw    68 Sep 18 04:52 data/
drwxr-xr-x 1 shezw shezw    14 Sep 18 04:52 examples/
drwxr-xr-x 1 shezw shezw 13296 Sep 18 04:52 jars/
drwxr-xr-x 1 shezw shezw    32 Sep 18 04:52 kubernetes/
drwxr-xr-x 1 shezw shezw  2402 Sep 18 04:52 licenses/
drwxr-xr-x 1 shezw shezw   338 Sep 18 04:52 python/
drwxr-xr-x 1 shezw shezw  1030 Sep 18 04:52 sbin/
drwxr-xr-x 1 shezw shezw    56 Sep 18 04:52 yarn/

The bin and sbin directories hold the executables and the service start/stop scripts; both need to go into the system PATH. Use pwd to get the absolute path of the current directory, e.g. /home/shezw/spark/spark-3.5.7-bin-hadoop3.

Edit ~/.bashrc (vi ~/.bashrc) and append at the end:

# spark_env_start
export SPARK_DIR="/home/shezw/spark/spark-3.5.7-bin-hadoop3"
export PATH="$SPARK_DIR/bin:$SPARK_DIR/sbin:$PATH"
# spark_env_end

After editing, reopen the terminal or run source ~/.bashrc to complete the installation and environment setup.
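Then confirm the variables took effect:

echo $SPARK_DIR
spark-submit --version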


Use the start/stop scripts to start and stop the Spark master:

start-master.sh
stop-master.sh

Once the master is running, the Spark web UI is available at localhost:8080; the page also shows the master service port, usually 7077.
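With the master up, you can attach a worker and point a shell or an example job at the cluster. A minimal sketch; the master URL shown on the web UI may use your hostname rather than localhost, and the exact examples jar name (here assumed to be spark-examples_2.12-3.5.7.jar) should be checked under $SPARK_DIR/examples/jars:

start-worker.sh spark://localhost:7077        # attach a worker to the local master
spark-shell --master spark://localhost:7077   # interactive shell against the cluster
spark-submit --master spark://localhost:7077 \
  --class org.apache.spark.examples.SparkPi \
  "$SPARK_DIR"/examples/jars/spark-examples_2.12-3.5.7.jar 100
stop-worker.sh
stop-master.sh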