The first question answered in this paper is the following: if A: λ → μ is a linear operator between sequence spaces, with a matrix representation (aij), does it follow that the associated diagonal matrix (aijδij) maps λ into μ? An affirmative answer is given when λ is a normal (or monotone) sequence space and μ is a perfect sequence space.
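For concreteness, here is a minimal sketch of how the first question reads, under the usual interpretation that δij denotes the Kronecker delta (an assumption about notation, not spelled out above): the diagonal matrix acts coordinatewise,
\[
  \bigl((a_{ij}\delta_{ij})\,x\bigr)_i \;=\; \sum_{j} a_{ij}\,\delta_{ij}\,x_j \;=\; a_{ii}\,x_i ,
  \qquad x = (x_j) \in \lambda ,
\]
so the question is whether (a_{ii}x_i)_i lies in μ whenever (a_{ij}) itself maps λ into μ.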
Moreover, if λ, μ are normed sequence spaces, under what conditions does the following inequality hold for all matrix maps (aij) from λ to μ: ∥(aij)∥ ≧ ∥(aijδij)∥ (where ∥⋅∥ denotes the operator sup norm)?
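Under the standard reading of the operator sup norm (again an assumption about notation rather than a statement made above), the second question can be displayed as
\[
  \|(a_{ij})\| \;=\; \sup_{\|x\|_{\lambda} \le 1} \bigl\| (a_{ij})\,x \bigr\|_{\mu}
  \qquad\text{and one asks when}\qquad
  \|(a_{ij})\| \;\geqq\; \|(a_{ij}\delta_{ij})\| .
\]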
We apply our answer to the first problem to give another proof of a theorem of S. Mazur.