Homework Set 1: Exploring metric spaces and function spaces

Metrics — Guided Practice (Hints Only Edition)

Structured prompts with scaffolded hints, common pitfalls, and key takeaways. No solutions revealed.



Q1 • Verifying a Metric

Learning goal. State the metric axioms precisely and verify them for \(d(x,y)=|x-y|\) on \(\mathbb{R}\).

Definition. A metric on \(X\) is a function \(d:X\times X\to\mathbb{R}\) such that for all \(x,y,z\in X\):
\[
d(x,y)\ge 0,\quad d(x,y)=0\iff x=y,\quad d(x,y)=d(y,x),\quad d(x,z)\le d(x,y)+d(y,z).
\]

Hint: Axioms via absolute value
Recall: \(|a|\ge0\); \(|a|=0\iff a=0\); \(|a|=|{-a}|\); and the triangle inequality \(|a+b|\le|a|+|b|\). Map \(a\) and \(b\) to expressions in \(x,y,z\).
Common pitfall. Using \(d(x,y)=x-y\) (without absolute value) fails non-negativity and symmetry.
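The axioms can also be spot-checked numerically before writing the proof. A minimal Python sketch (the function names are illustrative, and a random check is evidence, not a proof — the exercise still asks for an argument from the listed properties of \(|\cdot|\)):

```python
import random

def d(x, y):
    # Candidate metric on the reals: absolute difference.
    return abs(x - y)

def check_axioms(samples=1000, seed=0):
    """Spot-check the four metric axioms on random real triples."""
    rng = random.Random(seed)
    for _ in range(samples):
        x, y, z = (rng.uniform(-10, 10) for _ in range(3))
        assert d(x, y) >= 0                          # non-negativity
        assert d(x, x) == 0                          # d(x,x) = 0
        assert d(x, y) == d(y, x)                    # symmetry
        assert d(x, z) <= d(x, y) + d(y, z) + 1e-12  # triangle inequality
    return True
```

Running `check_axioms()` on the broken candidate \(d(x,y)=x-y\) instead would trip the non-negativity assertion almost immediately.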

Q2 • Subspace (Inherited) Metrics

Learning goal. Recognize inherited metrics and compute them in simple embeddings.

If \(Y\subseteq X\) and \(d\) is a metric on \(X\), then \(d|_{Y\times Y}\) is a metric on \(Y\).

Task. Let \(X=\mathbb{R}^2\) with \(d_2(x,y)=\sqrt{(x_1-y_1)^2+(x_2-y_2)^2}\). For \(Y=\{(x,0):x\in\mathbb{R}\}\), identify the inherited metric.

Hint: Plug \(Y\)-points into the ambient formula
Take \(x=(a,0)\), \(y=(b,0)\). Compute \(d_2(x,y)=\sqrt{(a-b)^2+0^2}\) and simplify carefully.
Key idea. An inherited metric uses the same distance formula, evaluated on the subset.
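To see the hint in action, evaluate the ambient Euclidean formula on a few points of \(Y\) and observe what the expression collapses to (a short sketch; the pairs chosen are arbitrary):

```python
import math

def d2(p, q):
    # Euclidean metric on R^2.
    return math.sqrt((p[0] - q[0])**2 + (p[1] - q[1])**2)

# Points of Y = {(x, 0)}: the second coordinate contributes nothing,
# so the ambient formula collapses to a familiar one-dimensional distance.
pairs = [((a, 0.0), (b, 0.0)) for a, b in [(1, 4), (-2, 5), (0, 0)]]
values = [d2(p, q) for p, q in pairs]
```

Compare each entry of `values` with the corresponding \(|a-b|\) to confirm the identification.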

Q3 • The Discrete Metric

Learning goal. Work with an extreme example: distance detects only equality.

For any set \(X\),
\[
d(x,y)=\begin{cases}0,&x=y,\\ 1,&x\ne y.\end{cases}
\]

Task. Tabulate \(d\) on \(X=\{1,2,3,4\}\).

Hint: Matrix pattern
The diagonal entries are \(0\); every off-diagonal entry is \(1\).
Common pitfall. “Discrete” does not mean “finite.” The construction works on any set.
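Since the hint already states the matrix pattern, the tabulation can be generated directly (a minimal sketch; `discrete` is an illustrative name):

```python
def discrete(x, y):
    # The discrete metric: distance detects only equality.
    return 0 if x == y else 1

X = [1, 2, 3, 4]
# Tabulate d on X x X: rows indexed by x, columns by y.
table = [[discrete(x, y) for y in X] for x in X]
for row in table:
    print(row)
```

Note that nothing in `discrete` uses the fact that `X` is finite, matching the pitfall above.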

Q4 • Metrics on \(\mathbb{R}^n\): \(d_p\) and \(d_\infty\)

Learning goal. Compute \(d_1,d_2,d_\infty\) and connect them to geometry.

For \(x=(x_1,\dots,x_n)\), \(y=(y_1,\dots,y_n)\) and \(1\le p\le \infty\),
\[
d_p(x,y)=\Big(\sum_{k=1}^n |x_k-y_k|^p\Big)^{1/p},\qquad
d_\infty(x,y)=\max_{1\le k\le n}|x_k-y_k|.
\]

Task. For \(x=(1,2)\), \(y=(4,6)\), compute \(d_1,d_2,d_\infty\).

Hint: Use the difference vector
Work with \(y-x\). Then apply the definitions of \(d_1\), \(d_2\), and \(d_\infty\) directly.
Geometric cue. \(d_2\) is Euclidean distance; \(d_1\) is “taxicab”; \(d_\infty\) captures the largest coordinate gap.
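The definitions translate directly into code. A short sketch of generic helpers (plug in the task's points yourself so the computation, not the answer, is what you take away):

```python
def d_p(x, y, p):
    # l^p metric on R^n for finite p >= 1.
    return sum(abs(a - b)**p for a, b in zip(x, y)) ** (1.0 / p)

def d_inf(x, y):
    # Chebyshev metric: the largest coordinate gap.
    return max(abs(a - b) for a, b in zip(x, y))

# Example call shape: d_p((1, 2), (4, 6), 1), d_inf((1, 2), (4, 6)).
```

The difference vector \(y-x\) from the hint is exactly what `zip(x, y)` walks over coordinate by coordinate.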

Q5 • Infinite-Dimensional Space \(\ell^1\)

Learning goal. Decide membership in \(\ell^1\) using comparison and integral tests.

\(\ell^1=\{(x_k)_{k\ge1}:\sum_{k=1}^\infty |x_k|<\infty\}\).

Examples for context (not tasks). \((1/k^2)\) is summable; \((1/k)\) is not.

  1. Does \(x_k=\dfrac{(-1)^k}{k}\) belong to \(\ell^1\)?
  2. Show \(x_k=\dfrac{1}{k(\log(k+1))^{2}}\in\ell^1\).
  3. Give a sufficient condition on \((x_k)\) ensuring \((x_k)\in\ell^1\).
Hints
  • (1) Absolute convergence ignores the sign: compare with the harmonic series.
  • (2) Use the Integral Test: compare with \(\displaystyle \int \frac{dx}{x(\log x)^2}\).
  • (3) For large \(k\), a bound like \(|x_k|\le \dfrac{C}{k^{1+\varepsilon}}\) with \(\varepsilon>0\) is enough by comparison.
Common pitfall. Conditional convergence \(\not\Rightarrow\) absolute convergence; \(\ell^1\) requires absolute summability.
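Before writing the comparison or integral-test argument, partial sums give useful numerical intuition (evidence only, not a proof; convergence here is slow, so the plateau is gentle):

```python
import math

def partial_sums(term, N):
    # Running partial sums of sum_{k=1}^N |term(k)|.
    s, out = 0.0, []
    for k in range(1, N + 1):
        s += abs(term(k))
        out.append(s)
    return out

# Hint (2)'s candidate: the partial sums should level off.
cand = partial_sums(lambda k: 1.0 / (k * math.log(k + 1)**2), 10_000)
# Hint (1)'s sequence in absolute value is the harmonic series,
# whose partial sums keep growing like log N.
harm = partial_sums(lambda k: (-1)**k / k, 10_000)
```

Taking absolute values inside `partial_sums` is exactly the point of the pitfall above: \(\ell^1\) membership ignores signs.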

Q6 • The \(\ell^1\) Metric

Learning goal. Compute distances and test the triangle inequality concretely.

For \(x,y\in\ell^1\), define \(d_1(x,y)=\sum_{k=1}^\infty |x_k-y_k|\).

  1. Let \(x_k=\dfrac{1}{k^2}\) and \(y_k=\dfrac{2}{k^2}\). Compute \(d_1(x,y)\).
  2. Let \(z_k=\dfrac{3}{k^2}\). Verify \(d_1(x,z)\le d_1(x,y)+d_1(y,z)\).
Hints
  • Use linearity for series of nonnegative terms: \(\sum(|a_k|+|b_k|)=\sum|a_k|+\sum|b_k|\).
  • Observe the relationship between consecutive differences in this specific example.
Idea. Triangle inequality can be sharp when differences align in the same “direction.”
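A truncated version of the \(\ell^1\) metric makes the sharpness claim checkable numerically (a sketch under the assumption that a large cutoff `N` approximates the infinite sum well, since the tails here decay like \(1/k^2\)):

```python
def l1_dist(x_term, y_term, N=100_000):
    # Truncated l^1 distance: sum_{k=1}^N |x_k - y_k|.
    return sum(abs(x_term(k) - y_term(k)) for k in range(1, N + 1))

x = lambda k: 1.0 / k**2
y = lambda k: 2.0 / k**2
z = lambda k: 3.0 / k**2

lhs = l1_dist(x, z)
rhs = l1_dist(x, y) + l1_dist(y, z)
# Compare lhs and rhs: the differences all point the same "direction",
# so the triangle inequality holds with equality here.
```

Perturbing one sequence (say, flipping signs of some terms of `y`) makes the inequality strict, which is a good way to feel where the slack comes from.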

Q7 • Function Metrics: Uniform vs \(L^1\)

Learning goal. Contrast the uniform and \(L^1\) metrics through targeted computations.

For bounded \(f,g:[0,1]\to\mathbb{R}\),
\[
d_u(f,g)=\sup_{t\in[0,1]}|f(t)-g(t)|,\qquad
d_1(f,g)=\int_0^1 |f(t)-g(t)|\,dt.
\]

Task A. Let \(f(t)=t\), \(g(t)=t^2\). Compute \(d_u(f,g)\) and \(d_1(f,g)\).
Hints for Task A
  • Maximize \(t(1-t)\) on \([0,1]\) (use calculus or symmetry).
  • For the integral, note \(t-t^2\ge0\) on \([0,1]\) and compute \(\int_0^1 (t-t^2)\,dt\).
Task B. Let \(f_n(t)=t^n\). Compare \(d_u(f_n,0)\) and \(d_1(f_n,0)\) as \(n\to\infty\).
Hints for Task B
  • For the sup, study behavior near \(t=1\).
  • For the integral, compute \(\int_0^1 t^n\,dt\) and take the limit.
Common pitfall. Confusing \(L^1\)-convergence with uniform convergence: on \([0,1]\), uniform convergence implies \(L^1\)-convergence, but the converse fails, as Task B illustrates.
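Both metrics can be approximated on a fine grid, which gives numerical intuition for Task B (an approximation sketch, not exact values: the sup is taken over sample points and the integral is a left Riemann sum, so small discretization error is expected):

```python
def d_u(f, g, samples=10_001):
    # Approximate the sup metric on [0,1] by sampling a fine grid.
    return max(abs(f(i / (samples - 1)) - g(i / (samples - 1)))
               for i in range(samples))

def d_1(f, g, samples=10_001):
    # Approximate the integral metric by a left Riemann sum on the same grid.
    h = 1.0 / (samples - 1)
    return sum(abs(f(i * h) - g(i * h)) for i in range(samples - 1)) * h

zero = lambda t: 0.0
for n in (1, 5, 50):
    fn = lambda t, n=n: t**n
    # Watch one distance stay put while the other shrinks as n grows.
    print(n, d_u(fn, zero), d_1(fn, zero))
```

The loop output is the whole story of Task B in miniature: the two notions of closeness disagree about whether \(t^n\) is getting close to the zero function.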

Mastery Check (No Solutions)

  1. Concept. Which axiom can fail for \(d(x,y)=\|x-y\|_2^2\) on \(\mathbb{R}^n\)?
    Hint
    Try a right triangle and compare the squared lengths against the triangle inequality.
  2. Technique. Give a simple sufficient condition on terms \(x_k\) to guarantee \((x_k)\in\ell^1\).
    Hint
    Compare to \(p\)-series with \(p>1\) on the tail of the sequence.
  3. Application. On \(C([0,1])\), which metric directly encodes uniform convergence?
    Hint
    Think: “supremum of the pointwise difference.”
Takeaway. Choose metrics to match the kind of closeness you need: pointwise-max vs. aggregate-average.