Deep Learning with TensorFlow: The Complete Guide

TensorFlow powers 78% of production deep learning models (Google Cloud, 2024). This tutorial covers everything from basic operations to deploying production models with TensorFlow 2.x and the Keras API.

[Figure: TensorFlow Usage Distribution (2024)]
1. TensorFlow Fundamentals
Core Concepts:
- Tensors: N-dimensional arrays (tf.Tensor)
- Graph Execution: Define-and-run paradigm; in TF 2.x, graphs are traced via tf.function (see the sketch after this list)
- Eager Mode: Immediate execution (default in TF 2.x)
- Keras API: High-level model building
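The eager/graph distinction is easiest to see side by side: calling a plain Python function on tensors runs eagerly, while wrapping it in tf.function traces it into a reusable graph. A minimal sketch (the function name double is illustrative):

import tensorflow as tf

def double(x):
    return x * 2

# Eager mode: executes immediately and returns a concrete value
print(double(tf.constant(3)))        # tf.Tensor(6, shape=(), dtype=int32)

# Graph mode: tf.function traces the same Python code into a callable graph
graph_double = tf.function(double)
print(graph_double(tf.constant(3)))  # same result, executed as a graph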
Basic Operations:
import tensorflow as tf

# Create tensors (float dtype so they can be differentiated below)
a = tf.constant([[1., 2.], [3., 4.]])
b = tf.constant([[5., 6.], [7., 8.]])

# Matrix multiplication
c = tf.matmul(a, b)  # [[19, 22], [43, 50]]

# Automatic differentiation
with tf.GradientTape() as tape:
    tape.watch(a)  # constants are not watched automatically
    y = tf.reduce_sum(a * 2)
grad = tape.gradient(y, a)  # [[2, 2], [2, 2]]
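The tape.watch call above is only needed because a is a tf.constant; tf.Variable objects are watched automatically. A minimal sketch:

v = tf.Variable([[1., 2.], [3., 4.]])
with tf.GradientTape() as tape:
    y = tf.reduce_sum(v * 2)
print(tape.gradient(y, v))  # [[2., 2.], [2., 2.]]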
2. Building Models with Keras
Model Building Approaches:
Method | Example | When to Use |
---|---|---|
Sequential API | model = Sequential([layers...]) | Simple stacks |
Functional API | inputs = Input(); x = Dense()(inputs) | Complex architectures (see sketch below) |
Model Subclassing | class MyModel(Model):... | Custom implementations |
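For comparison with the Sequential CNN below, here is a minimal Functional API sketch of a similar classifier (layer sizes are illustrative, not prescriptive):

from tensorflow.keras import layers, Model

# Functional API: wire layers explicitly, starting from an Input tensor
inputs = layers.Input(shape=(28, 28, 1))
x = layers.Conv2D(32, (3, 3), activation='relu')(inputs)
x = layers.MaxPooling2D((2, 2))(x)
x = layers.Flatten()(x)
x = layers.Dense(64, activation='relu')(x)
outputs = layers.Dense(10, activation='softmax')(x)
model = Model(inputs=inputs, outputs=outputs)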
CNN Example:
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation='relu'),
    layers.Dense(10, activation='softmax')
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
3. Training & Deployment
Production Workflow:
- Data Pipeline: tf.data.Dataset for optimized loading
- Distributed Training: MirroredStrategy for multi-GPU scaling
- Model Serving: TF Serving for low-latency inference

End-to-End Example:
# Create dataset pipeline
train_ds = tf.data.Dataset.from_tensor_slices((X_train, y_train))
train_ds = train_ds.shuffle(1000).batch(32).prefetch(tf.data.AUTOTUNE)
# Train with callbacks
model.fit(train_ds, epochs=10,
          callbacks=[
              tf.keras.callbacks.ModelCheckpoint('model.keras'),
              tf.keras.callbacks.EarlyStopping(patience=3)
          ])
# Export for serving
tf.saved_model.save(model, 'saved_model')
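The workflow above lists MirroredStrategy for multi-GPU training. A minimal sketch of how it wraps the same model-building code (build_model is a hypothetical helper that constructs and compiles the Keras model shown earlier):

# Synchronous data-parallel training across all visible GPUs on one machine
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    # Model and optimizer must be created inside the strategy scope
    model = build_model()  # hypothetical helper returning a compiled Keras model
model.fit(train_ds, epochs=10)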
TensorFlow Ecosystem Tools
Tool | Purpose | Key Feature |
---|---|---|
TensorFlow Lite | Mobile/IoT | Quantization |
TensorFlow.js | Browser | WebGL acceleration |
TFX | ML Pipelines | End-to-end orchestration |
TensorBoard | Visualization | Training metrics |
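As an example of how these tools plug into the workflow above, converting the exported SavedModel to TensorFlow Lite with default post-training quantization takes a few lines (paths follow the earlier example):

# Convert the exported SavedModel to a .tflite flatbuffer with default optimizations
converter = tf.lite.TFLiteConverter.from_saved_model('saved_model')
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open('model.tflite', 'wb') as f:
    f.write(tflite_model)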
4. Advanced Techniques
Cutting-Edge Features:
- Custom Training Loops: Fine-grained control
- Mixed Precision: FP16/FP32 training (see the sketch after this list)
- Distributed Strategies: Multi-worker training
- TF Hub: Pretrained models
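Enabling mixed precision is a global policy switch; layers then compute in float16 while keeping their variables in float32 for numerical stability. A minimal sketch:

from tensorflow.keras import mixed_precision

# Layers compute in float16 while variables (weights) stay in float32;
# models built after this point pick up the policy automatically
mixed_precision.set_global_policy('mixed_float16')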
Custom Training Example:
@tf.function  # Graph compilation
def train_step(x, y):
    with tf.GradientTape() as tape:
        preds = model(x, training=True)
        loss = loss_fn(y, preds)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss
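train_step assumes a model, optimizer, and loss_fn are already defined. A minimal driver loop over the train_ds pipeline from Section 3 might look like this (names reuse the earlier examples; the epoch count is illustrative):

optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()

for epoch in range(10):
    for x_batch, y_batch in train_ds:
        loss = train_step(x_batch, y_batch)
    print(f"epoch {epoch}: last batch loss = {float(loss):.4f}")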