How to Use
Select a scalar field:
- f(x, y): choose a preset scalar field
- Point: (x₀, y₀) where the gradient is evaluated
- Direction: unit vector û for the directional derivative
Key Properties
- ∇f ⊥ level curves
- |∇f| = max rate of change
- D_u f = ∇f · û
- ∇(fg) = f∇g + g∇f
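The properties above can be checked numerically. The sketch below is a minimal illustration (not the tool's implementation): it approximates ∇f with central differences and evaluates D_u f = ∇f · û, normalizing u first. The example field f(x, y) = x² + y² is an assumption for demonstration.

```python
import math

def grad(f, x, y, h=1e-6):
    """Central-difference approximation of ∇f at (x, y)."""
    df_dx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    df_dy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return (df_dx, df_dy)

def directional_derivative(f, x, y, u):
    """D_u f = ∇f · û, after normalizing u to a unit vector."""
    gx, gy = grad(f, x, y)
    norm = math.hypot(u[0], u[1])
    ux, uy = u[0] / norm, u[1] / norm
    return gx * ux + gy * uy

# Example field (assumed for illustration): f(x, y) = x² + y²,
# whose exact gradient is (2x, 2y).
f = lambda x, y: x**2 + y**2
print(grad(f, 1.0, 2.0))                             # ≈ (2.0, 4.0)
print(directional_derivative(f, 1.0, 2.0, (1, 0)))   # ≈ 2.0
```

Note that |∇f| bounds D_u f over all unit directions, which is why the gradient magnitude is the maximum rate of change.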
In Machine Learning
Gradient descent updates parameters via θ_{n+1} = θ_n − α∇L(θ_n), where the learning rate α controls the step size. Stochastic gradient descent estimates ∇L from mini-batches of data, and the Adam optimizer additionally adapts the effective learning rate per parameter.
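The update rule above can be sketched in a few lines. This is a toy example, not an ML library call: the loss L(θ) = (θ − 3)² and its gradient 2(θ − 3) are assumptions chosen so the minimizer θ = 3 is known in closed form.

```python
def gradient_descent(grad_L, theta0, alpha=0.1, steps=100):
    """Iterate θ_{n+1} = θ_n − α∇L(θ_n), the rule from the text."""
    theta = theta0
    for _ in range(steps):
        theta = theta - alpha * grad_L(theta)
    return theta

# Toy loss L(θ) = (θ − 3)², so ∇L(θ) = 2(θ − 3); minimum at θ = 3.
theta_star = gradient_descent(lambda t: 2 * (t - 3), theta0=0.0)
print(round(theta_star, 4))  # converges toward 3.0
```

A too-large α makes the iterates oscillate or diverge (here, any α ≥ 1 diverges); a too-small α converges slowly, which is the trade-off the text refers to.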
Step-by-Step Instructions
1. Select f(x, y).
2. Enter the point (x₀, y₀).
3. View the gradient vector.
4. Check its magnitude.
5. Compute the directional derivative.
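The five steps can be worked through by hand for one concrete case. The field f(x, y) = x² + y², the point (1, 2), and the direction (1, 1)/√2 are assumptions for illustration; the gradient of x² + y² is (2x, 2y), so every value below follows from the formulas in Key Properties.

```python
import math

# Step 1: choose f(x, y) = x² + y² (assumed example field).
# Step 2: enter the point (x₀, y₀).
x0, y0 = 1.0, 2.0

# Step 3: the gradient of x² + y² is (2x, 2y).
gx, gy = 2 * x0, 2 * y0           # (2.0, 4.0)

# Step 4: |∇f| is the maximum rate of change at the point.
magnitude = math.hypot(gx, gy)    # √20 ≈ 4.4721

# Step 5: directional derivative along û = (1, 1)/√2.
ux, uy = 1 / math.sqrt(2), 1 / math.sqrt(2)
D_u = gx * ux + gy * uy           # 6/√2 ≈ 4.2426

print(gx, gy, round(magnitude, 4), round(D_u, 4))
```

As the properties section predicts, D_u f ≈ 4.24 is less than |∇f| ≈ 4.47, since û does not point exactly along the gradient.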