Executive Summary Link to heading

What “Analysis” Means and Why It Changed: In the late 17th century, “analysis” in mathematics referred broadly to a method of problem solving—the process of breaking a problem into known elements (as opposed to synthetic, axiomatic proof). Isaac Newton and Gottfried Wilhelm Leibniz employed “analysis” in this classical sense of analysis versus synthesis (tracing back to Pappus’s ancient descriptions) to characterize their new infinitesimal calculus methods. Over time, however, “analysis” evolved from a general mode of reasoning into a distinct discipline of mathematics concerned with limits, continuity, infinite processes, and the continuum. By the 18th century, “analysis” had become nearly synonymous with the calculus and its extensions (power series, differential equations, etc.), often contrasted with “synthetic” Euclidean geometry. Each subsequent generation refined the meaning of “analysis” in response to internal mathematical developments and external influences from physics and other sciences.

18th–19th Century Shifts: In the Eulerian era (18th century), Leonhard Euler and colleagues greatly expanded the domain of analysis—introducing the formal notion of a mathematical function and applying infinite series and calculus techniques to solve a vast range of problems in mechanics and astronomy. “Analysis” in this period denoted the highly successful analytical method (algebraic and calculus-based approach) as opposed to geometrical methods, establishing analysis as the lingua franca of theoretical physics (e.g. Euler’s Introductio in analysin infinitorum, 1748). In the early 19th century, Augustin-Louis Cauchy’s Cours d’Analyse (1821) launched a new standard of rigor: defining limits, continuity, and convergence with precise epsilon-condition logic[1]. This marked the transformation of “analysis” into the foundation of real and complex analysis, with an emphasis on rigorous proof. Bernhard Bolzano (1810s) and Cauchy severed analysis from its intuitive infinitesimal roots by demanding limit-based definitions, while Karl Weierstrass in the later 19th century completed the “arithmetization” of analysis by formalizing the real number system and (ε,δ)-definitions. The scope of analysis simultaneously broadened: complex function theory (Cauchy, Riemann) and Fourier analysis (Fourier’s Analytical Theory of Heat, 1822) became core parts of “analysis.” The term “Mathematical Analysis” thus came to encompass real-variable calculus, complex analysis, and the emerging theory of functions, distinguished from algebra or synthetic geometry.

Boundary with Other Fields: Historically, the boundaries of analysis have been fluid. In the 18th century, analysis was often juxtaposed to algebra and geometry: for example, “analytic” (or infinitesimal) methods versus synthetic Euclidean constructions. By the late 19th century, distinct mathematical disciplines crystallized—algebra, geometry, arithmetic/number theory, logic, and analysis—but with significant overlap. Key concepts like analytic geometry (Descartes’ merger of algebra and geometry in 1637) blurred these lines early on. Throughout the 19th and 20th centuries, topics oscillated between domains: potential theory and differential geometry used analytical tools but were sometimes seen as geometric; analytic number theory applied complex analysis to integer problems, bridging analysis and number theory. The hardening of boundaries often corresponded to institutional structures (university chairs, journals, curricula) that defined “analysis” as a teaching and research domain. For instance, by 1900, a typical mathematics department might separate professors of analysis versus algebra, reflecting a consensus that topics like calculus, differential equations, and function theory belonged to “analysis,” while group theory or projective geometry did not. Still, certain subfields have repeatedly been contested: is calculus of variations part of analysis or a separate applied subject? Are partial differential equations (PDEs) a subfield of analysis or of applied mathematics? These debates persisted, but generally “analysis” came to mean the theory of limits, infinite series, integration, and the structures built atop these concepts, in contrast to the discrete structures of algebra or the finite constructions of synthetic geometry.

Key Developments and Turning Points: Several watershed moments redefined analysis. Newton (1660s–1700) and Leibniz (1670s–1700): the invention of calculus (fluxions/infinite series for Newton, differential calculus for Leibniz) expanded “analysis” to include infinite processes, prompting contemporaries to speak of “infinitesimal analysis.” Newton wrote in 1676 how “the limits of analysis are enlarged by such infinite equations… analysis reaches, I might almost say, to all problems”, highlighting the explosion of problem-solving power calculus provided. Euler (1740s–1760s): the systematic use of power series and introduction of the function concept, treating analysis as the general study of functions and their expansions (analysis infinitorum). Cauchy (1821): rigorous definitions of limit and continuity begin the foundation of real analysis as an independent, rigorous field, distinguishing analysis by its demand for proof “with the rigor of geometry”. Mid-19th century: Riemann (1850s) introduced the Riemann integral and complex analysis concepts (Riemann surfaces), extending analysis into the geometric realm of multi-valued functions. Weierstrass (1860s–1870s) formalized epsilons and deltas, eliminated heuristics of infinitesimals, and even produced pathological functions (like the 1872 Weierstrass function, continuous everywhere but differentiable nowhere) that shocked his contemporaries by defying intuitive geometric smoothness. Such examples “upended mathematics, overturning proofs that relied on geometric intuition” and solidified the need for purely analytic (arithmetical) foundations. Turn of the 20th century: Henri Lebesgue (1902) revolutionized integration by introducing measure theory, vastly extending the scope of integrable functions and resolving decades-old paradoxes where classical integrals failed. This ushered in modern analysis with abstract spaces (metric, measure, function spaces). Early 20th century also saw functional analysis emerge (Hilbert, Banach) generalizing analysis to infinite-dimensional vector spaces and operators. Mid-20th century: the formal axiornatization of probability by Kolmogorov (1933) using measure theory treated probability theory as an extension of analysis, and the development of distribution theory by Sobolev (1936) and Schwartz (1945) provided “generalized functions” to rigorously handle objects like the Dirac delta, effectively “the calculus of today” for solving PDEs in a broad sense. Meanwhile, applied analysis took shape: during World War II and the Cold War, analysts like John von Neumann, Norbert Wiener, and others applied analysis to quantum mechanics, signal processing, and computation (e.g. the 1965 Fast Fourier Transform algorithm was co-developed by J. W. Cooley and John Tukey to detect nuclear test signals, showing analysis’s reach into engineering). Late 20th century: Harmonic analysis blossomed (Calderón-Zygmund theory of singular integrals in the 1950s–60s gave powerful tools to handle PDEs and oscillatory integrals), ergodic theory and dynamical systems merged probabilistic and analytical ideas to study long-term behavior of systems, and complex analysis techniques drove breakthroughs in number theory (e.g. the prime number theorem and advances in L-functions). Numerical analysis became a respected branch, formalizing the analysis of algorithms for approximating analytical problems (errors, stability, convergence of numerical schemes), especially as digital computers rose. By the 21st century, “analysis” now encompasses a vast portfolio: classical real and complex analysis, functional analysis (Banach/Hilbert space theory), harmonic and Fourier analysis, PDE and variational analysis, geometric measure theory, wavelet and time-frequency analysis, nonlinear analysis in high dimensions, and more – and it increasingly overlaps with geometry, combinatorics, and data science.

Institutions, Education, and Rhetoric: The meaning of “analysis” was also shaped by who studied and taught it, and how. In the 19th century, the École Polytechnique in Paris under Cauchy and others set a template with courses titled “Analyse” focusing on calculus and its rigorous foundations, thereby institutionalizing a curriculum for “analysis”. In the German realm, Berlin University (with Dirichlet, then Weierstrass) and Göttingen (with Gauss, Riemann, later Hilbert) became centers of analytical research and teaching. By the early 20th century, American universities (e.g. Princeton, Chicago, Harvard) imported European analytical rigor; G. H. Hardy’s Course of Pure Mathematics (1908) trained generations in the English-speaking world in rigorous analysis, cementing that term in curricula. Rhetorically, “analysis” was often paired oppositionally: “analysis vs. synthesis” (methodological debate of reasoning methods), “analytic vs. geometric” (styles of solution, e.g. algebraic calculus methods versus synthetic geometry, as Newton contrasted in 1670s–90s), “analytic vs. algebraic” (in number theory or geometry, distinguishing use of calculus/limits vs. discrete algebraic manipulations), “rigor vs. intuition” (analysis being the pursuit of rigor, especially post-Cauchy, versus intuitive or physical reasoning), and “pure vs. applied” (with analysis playing roles on both sides: Fourier’s and Laplace’s analytical equations for heat and celestial mechanics were applied, while Weierstrass’s epsilon-delta and Cantor’s set theory were pure foundational work). Throughout the 20th century, foundational debates (e.g. the intuitionist critique of classical analysis, or the advent of nonstandard analysis in the 1960s reintroducing rigorous infinitesimals) kept the philosophical meaning of analysis in flux. Yet, analysis proved remarkably unifying for mathematics: it supplied common language and tools not only for traditional calculus problems but for quantum physics, economics (e.g. differential equations in modeling), and even pure fields like topology (through analytical invariants).

Why It Matters Today: Understanding the historical semantics of “analysis” illuminates how mathematics has grown and organized itself. The polysemy of “analysis” – from a method of reasoning to a catch-all for calculus, then to a rigorous field defined by limit processes, and now to a constellation of sub-disciplines – reflects mathematics’ response to challenges of infinity, continuity, and change. Many modern mathematical tools, from the algorithms underlying computer simulations to the Fourier transforms enabling signal processing, are direct products of this centuries-long evolution of analysis. Moreover, the cultural identity of mathematicians has been shaped by whether they align as “analysts,” “algebraists,” or “geometers,” each with different values and techniques. The narrative of analysis reveals a constant tension and cross-pollination between pursuit of rigor (the Weierstrassian ideal) and pursuit of utility (the Eulerian and applied tradition). Today, analysis remains a cornerstone of both pure mathematics (e.g. Navier–Stokes equation theory, Langlands program’s analytic side) and practical applied science (e.g. data analysis, machine learning algorithms drawing on optimization and harmonic analysis). By tracing how “analysis” acquired its many meanings, we gain insight into how mathematics continually redefines its frontiers while building on a rich legacy of ideas.

Annotated Timeline (Newton → Present) Link to heading

  • 1660s–1670s: Newton develops calculus (“method of fluxions”) – Isaac Newton (England) invents the calculus of fluxions and infinite series. In an unpublished tract De Analysi (1669), Newton speaks of “analysis by infinite series”, expanding functions into infinite series to solve problems. He later contrasts the power of this new analytical method to “solve almost all problems” with classical synthetic geometry. (Leibniz independently develops his calculus on the Continent in the 1670s.)

  • 1676: Newton’s letter on “the limits of analysis” – In a letter to Oldenburg (for Leibniz), Newton boasts how “the limits of analysis are enlarged by such infinite equations… by their help analysis reaches… to all problems.” This reflects early optimism about calculus (infinitesimal analysis) vastly extending mathematical problem-solving.

  • 1684–1687: Leibniz publishes calculus; Newton’s Principia – Leibniz publishes the first account of differential calculus (1684) and integral calculus (1686), calling it a new analysis. Newton’s Principia Mathematica (1687), by contrast, uses classical synthetic geometry, but in an appendix (1704 Opticks Latin edition) Newton discusses analysis-synthesis methods in mathematics and physics. The Newton–Leibniz calculus controversy ensues (accusations of plagiarism), entangling national mathematical communities and their preferred “analysis” styles (British fluxions vs. Continental differentials).

  • 1696: l’Hôpital’s Analyse des infiniment petits* – The first calculus textbook, by Marquis de l’Hôpital (based on Johann Bernoulli’s lectures), explicitly uses “analysis of the infinitely small” in its title, cementing the term analysis to mean infinitesimal calculus in early 18th-century France.

  • 1715: Brook Taylor’s definition of integration (Britain) – Brook Taylor gives early rigorous definition of integration as limit of sums (foreshadowing later analysis), but British mathematics soon falls behind the Continent in analysis.

  • 1720s: “Analysis” vs. “Synthesis” revived in geometry – Newton’s later years and disciples (e.g. James Gregory, Colin Maclaurin) attempt to rediscover the Greek analysis method for synthetic geometry. Newton in the 1680s-90s works on classical geometry problems via analysis vetorum (ancient analysis), lamenting Descartes’ algebraic “modern analysis” as less elegant. This contributes to a semantic shift: by the 18th century’s end, “analysis” often meant algebraic/calculus approach and “synthesis” meant geometrical construction.

  • 1730s–1750s: Euler’s era – expansion of analytic methods – Leonhard Euler (1707–1783) publishes Introductio in analysin infinitorum (1748), defining mathematics broadly as “the science of quantity” and elevating the notion of function. Euler introduces the concept of a function $y = f(x)$ and systematically develops infinite series, the exponential and logarithmic functions, trigonometric series, etc., laying “the foundations of modern mathematical analysis”. His work and that of the Bernoullis applies analysis to solve problems in mechanics (Euler’s equations of motion, etc.), establishing analysis as the dominant mathematical language of science. (Euler also pioneers analytic number theory by using series to tackle problems in number theory, though this label comes later.)

  • 1755: Euler’s Institutiones calculi differentialis*** – Euler’s treatise on differential calculus (and later integral calculus, 1768) provide a comprehensive toolbox of analytic techniques (differentiation rules, differential equation solutions, etc.), widely influencing education across Europe.

  • 1788: Lagrange’s Mécanique Analytique – Joseph-Louis *Lagrange publishes Analytical Mechanics, reformulating classical mechanics purely in terms of algebraic equations and calculus (no diagrams). He famously notes that “No figures will be found in this work.” Here “analytical” implies the use of calculus and algebraic methods to derive mechanics’ laws, underscoring the prestige of analysis as an advanced, powerful method in contrast to geometric approaches.

  • 1797: Lagrange’s Théorie des fonctions analytiques – Lagrange attempts to ground calculus on algebraic expansion (power series) without infinitesimals, reflecting an effort to rigorize analysis by purely algebraic means. He coins term “fonctions analytiques”* (analytic functions) meaning those expressible as power series – an early notion of analysis leaning toward formal power series manipulation.

  • 1816: Bolzano’s modern definition of continuity – Bernard Bolzano (Bohemia) gives a ε–δ style definition of continuity (published 1817), an isolated but prescient step toward arithmetization. His work, largely unnoticed until later, marks an early attempt to put analysis on a rigorous footing (free of infinitesimals), anticipating the later Cauchy-Weierstrass approach.

  • 1821: Cauchy’s Cours d’Analyse *Augustin-Louis Cauchy publishes Cours d’analyse for École Polytechnique, a landmark in rigor. Cauchy defines limit (“when the values of a variable approach a fixed value indefinitely…”), continuity, and the definite integral in terms of limits of sums. He insists on geometric rigor in analysis “so that one need never rely on arguments drawn from the generality of algebra.”. This book can be seen as the first textbook of “analysis” as we know it (real analysis): it helped establish analysis as a subject teaching the foundational calculus concepts with rigor, separating it from naive calculus. Cauchy’s approach triggers debates (his use of infinitesimals alongside limits confused some, and sparked criticism by Abel and others) but ultimately sets a new standard.

  • 1822: Fourier’s Théorie Analytique de la Chaleur *Jean-Baptiste Fourier publishes the Analytical Theory of Heat, using trigonometric series (Fourier series) to solve the heat equation. Fourier’s work, though lacking rigorous underpinnings, shows the power of analysis applied to PDEs and physics, and introduces the idea of expanding arbitrary functions into infinite series of sines/cosines – a bold use of analysis that later raises questions (What is a function? Do Fourier series always converge?). This work is foundational for harmonic analysis and functional analysis to come, and it expands “analysis” to encompass Fourier methods in solving physical problems.

  • 1820s–30s: Emergence of analytic and abstract function theoryNiels Henrik Abel and Carl Gustav Jacobi develop the theory of elliptic functions (inverse of elliptic integrals), an extension of analysis into complex periodic functions. Meanwhile, Évariste Galois (1830) uses the word “analysis” in a different sense in algebra (analyzing solvability of equations), reminding that “analysis” still could mean general mathematical investigation. Nonetheless, by mid-19th century, theory of functions (especially of a complex variable) becomes a recognized branch – often considered part of analysis (in German, Funktionentheorie was intertwined with analysis).

  • 1837: Dirichlet’s formal definition of functionPeter Gustav Lejeune Dirichlet defines a function simply as a correspondence $y = f(x)$ such that each $x$ in an interval maps to a single $y$ value, “whatever the rule of correspondence” (including possibly different formulas on different intervals) – a modern abstract notion of function. This widens analysis to include any function, even wildly discontinuous, breaking from the earlier view that functions must be given by one formula. It paves the way for later pathology in analysis (functions without nice formulas).

  • 1841: Liouville’s theorem (complex analysis) and transcendenceJoseph Liouville proves the existence of transcendental numbers using analytical methods (continued fractions). He also rigorously develops complex analysis results (e.g., Liouville’s theorem that bounded entire functions are constant). This period sees complex analysis emerge as a robust field within analysis (Cauchy’s integral theorem was 1820s; by 1850s Riemann and Weierstrass further develop it).

  • 1851: Cauchy’s Exercises d’analyse et de phys. math. – Cauchy’s later writings, including solving differential equations and developing *complex function theory (residues, etc.), spread analytic methods. By mid-century, analysis courses and texts across Europe teach both real-variable and complex-variable theory under the analysis umbrella.

  • 1854: Riemann’s breakthroughs (habilitationsschrift)Bernhard Riemann in his 1854 habilitation lecture (published 1868) on the foundations of geometry does not directly deal with analysis, but a year earlier in 1853–1854 he develops Riemann integration (published 1868) and revolutionizes complex analysis (Riemann surfaces, mapping theorem in 1851 dissertation). Riemann’s approach is more geometric and intuitive compared to Weierstrass. His 1859 paper on the zeta function applies complex analysis to number theory, inaugurating the field of analytic number theory (use of analytic methods to study primes, etc.). The mid-19th century thus sees analysis’ domain expanding to include set-theoretic and geometric viewpoints (Riemann’s work implicitly uses topology/analysis blend).

  • 1872: Meridian year for rigor – Weierstrass, Cantor, et al. – Three major publications on real numbers: Karl Weierstrass’ lecture (not officially published, but disseminated via students) on ε–δ definitions, Eduard Heine and Georg Cantor on the concept of number and trigonometric series convergence, and Richard Dedekind’s pamphlet Continuity and Irrational Numbers (with Dedekind cuts). Cantor (1872) also proves the uniqueness of Fourier series coefficients and begins to found set theory. Meanwhile, Weierstrass publishes his famous nowhere-differentiable continuous function example (1872), shocking the mathematical world by defying the assumption that continuity implies some differentiability. The late 1860s–70s thus complete the arithmetization of analysis: analysis is built on precise real number axioms (no geometric infinitesimals), and pathologies are embraced to strengthen rigor (counterexamples to false intuitions). By now Weierstrass is known as “the father of modern analysis”, and his Berlin lectures shape a generation of “analysts” (including Königsberger, Mittag-Leffler, Sonia Kovalevsky). Also in 1874, Cantor proves $\mathbb{R}$ is uncountable, introducing set-theoretic ideas that become fundamental to analysis’ understanding of the continuum.

  • 1884: Sofia Kovalevskaya appointed (analysis professorship) – Kovalevskaya, a student of Weierstrass, becomes professor in Stockholm, publishing on PDEs (the rotating solid body problem). Her career illustrates analysis’ spread and the increasing acceptance of women in the field. The late 19th century also sees the term “analysis situs” (analysis of position) used by Henri Poincaré (1890s) for what becomes topology, though named analysis, this “situs” is eventually spun off from mathematical analysis proper.

  • 1890: Peano’s counterexamples; rigor in integralsGiuseppe Peano constructs a continuous space-filling curve (1890), another analytical pathology surprising mathematicians (continuous image of an interval filling a square). In 1891, he also gives a counterexample where a Riemann-integrable function’s derivative is not Riemann integrable – highlighting the need for a better integration theory. These examples push the development of measure theory.

  • 1899: Courses and textbooks titled “Analysis” – By the turn of century, universities commonly offer courses like “Advanced Analysis” or “Functions of a Real Variable.” For instance, Cambridge’s Tripos (under influence of Hardy later) had “the theory of functions” as advanced analysis. In 1898, Édouard Goursat publishes Cours d’analyse mathématique in France, a comprehensive text on calculus and differential equations, showing how analysis teaching had matured and stabilized.

  • 1900: Hilbert’s Problems and analysis – At the 1900 International Congress, David Hilbert poses 23 problems; several are analysis-centric: e.g. the Riemann Hypothesis (analytic number theory), continuum hypothesis (foundation of analysis and set theory), the existence of solutions to PDEs (#6, #23), criteria for integrals (variation calculus). Hilbert’s own work at this time includes the theory of integral equations (Fredholm theory), leading to Hilbert space concept (1902) – an infinite-dimensional analytical framework.

  • 1902–1904: Lebesgue’s measure and integralHenri Lebesgue defines Lebesgue measure (1901) and Lebesgue integral (1902). Published in his 1902 doctoral thesis and a 1904 book, this addresses the “pathological” cases in integration. Lebesgue integration vastly “broadened the scope of integration far beyond… Riemann”, handling functions with “too many” discontinuities and ensuring the fundamental theorem of calculus holds under more general conditions. As a result, Fourier series and other analytic expansions can be placed on solid footing. Lebesgue’s work is often seen as the birth of modern real analysis. Simultaneously, the French school (Borel, Baire, Hadamard) and the Russian school (Egorov, Luzin) develop measure theory and descriptive set theory, integrating logic into analysis.

  • 1905: Poincaré’s criticism and intuitionHenri Poincaré publishes papers on the philosophy of mathematics, distinguishing between logicists (Peano, Russell) and analysts (Cantor, Weierstrass). Poincaré defends intuitive reasoning in analysis, warning against overreliance on formalism, which ignites discussion on the balance of rigor vs intuition in analysis foundations.

  • 1910: Fréchet’s metric spaces; birth of abstract analysisMaurice Fréchet defines abstract metric spaces (1906) and in 1910 general topology, greatly generalizing the notion of space for analysis. His work and that of Felix Hausdorff (1914) create point-set topology, which becomes the language underpinning analysis (e.g. concepts of completeness, compactness). Functional analysis also takes shape: Fréchet (1907), David Hilbert (1904) on Hilbert spaces for integral equations, Erhard Schmidt (1908) on infinite-dimensional orthonormal expansions.

  • 1915: Emmy Noether’s work in analysis? – While Noether is known for algebra, in 1915–1918 she also contributes to physics (Noether’s Theorem) using variational calculus (an analytic method). This reflects ongoing interplay: algebraic methods influencing analysis and vice versa (Noether’s ideas later influence functional analysis via symmetry considerations).

  • 1920: Banach and Polish schoolStefan Banach and mathematicians in Lwów (Poland) found Banach space theory. Banach’s 1922 thesis (published 1929 Théorie des opérations linéaires) is the first systematic treatise on functional analysis (the term functional analysis itself coined by Fréchet in 1920). They study complete normed vector spaces (Banach spaces), linear operators, and fixed-point theorems (Banach fixed-point theorem 1922) – providing a unifying abstract framework for differential and integral equations. This establishes functional analysis as a major branch of analysis by the 1930s[2].

  • 1924: Hardy’s A Course of Pure Mathematics (3rd ed.)G. H. Hardy’s textbook (1st ed. 1908, 3rd ed. 1921) educates English-speaking mathematicians in rigorous real analysis and series. By its centenary it’s noted that “Hardy’s presentation of mathematical analysis is as valid today as when first published”, demonstrating its foundational role. Hardy’s influence helps Britain catch up in modern analysis (Britain had lagged in the 1800s). Hardy also works (with Littlewood) on Fourier series and Tauberian theorems (1910s), advancing harmonic analysis.

  • 1926: Courant & Hilbert’s Methods of Mathematical Physics – Published in German (Vol. I 1924, II 1926), by *Richard Courant and David Hilbert. These volumes compile methods of applied analysis for solving PDEs in physics (variational methods, eigenfunction expansions, Green’s functions), exemplifying the strong post-WWI drive to apply analysis to scientific problems. It spreads techniques like the Ritz method (basis of finite element later) and shows the inseparability of “pure” and “applied” analysis. Courant later founds an institute in New York (NYU) emphasizing applied analysis and PDE.

  • 1931: Gödel and constructivist critiques – Kurt Gödel’s incompleteness theorem (1931) shakes foundational certainty. In analysis, L. E. J. Brouwer and the intuitionists criticize nonconstructive proofs (like existence of maxima via completeness) used in analysis. Although mainstream analysis remains classical, alternative schools (constructive analysis, Russian recursive analysis) develop, highlighting that “analysis” isn’t monolithic in philosophy.

  • 1933: Kolmogorov’s Foundations of Probability *Andrey Kolmogorov axiomatizes probability theory on measure-theoretic grounds. This work “transformed the calculus of probability into a mathematical discipline”, comparable to Euclid’s role for geometry (as one historian noted). Probability theory thus becomes a sub-branch of analysis (stochastic processes are studied with analytical tools—e.g. integrals of random functions, characteristic functions via Fourier analysis). Also in 1933, Kolmogorov with Alexandrov formalize measure-theoretic topology (introducing notions like completeness of measure, etc.), blending analysis and topology.

  • 1936: Sobolev spaces (functional analysis meets PDE)Sergei Sobolev in USSR defines Sobolev spaces (1936–1938), function spaces incorporating weak derivatives, enabling rigorous treatment of PDE solutions. This is a milestone in modern PDE analysis: it extends the domain of “analysis” to generalized solutions of PDEs and variational problems. (Sobolev’s work, though initially classified due to military applications, becomes widely known post-WWII.)

  • 1936–1939: Bourbaki group forms – Young French mathematicians (Dieudonné, Weil, etc.) begin publishing under pseudonym Nicolas Bourbaki. They aim to rewrite mathematics on an axiomatic structural basis. Bourbaki’s Éléments de mathématique series (from 1939) includes volumes on Topologie générale, Algèbre, and crucially Espaces vectoriels topologiques (Locally Convex Topological Vector Spaces, 1953) which abstract functional analysis. Bourbaki’s influence (particularly post-WWII) promotes an austere, general axiomatic style in analysis: for example, Bourbaki’s terminology (injective, surjective, etc.) and emphasis on structure over computation becomes standard. This shapes mid-20th-century pedagogy: many analysis texts (e.g. Dieudonné’s Foundations of Modern Analysis, 1960) adopt Bourbaki’s structural approach.

  • 1939–1945: World War II and applied analysis – War needs (balistics, aerodynamics, signal processing, nuclear physics) drive advances in applied analysis. E.g., John von Neumann works on shock wave PDEs and numerical methods (invention of Monte Carlo method with Ulam). Norbert Wiener develops cybernetics and filtering theory (Wiener’s 1942 work on Brownian motion and Fourier analysis leads to Wiener filter). Mathematical analysis becomes crucial in engineering contexts (differential equations for flight, control theory beginning with Wiener). After the war, many European analysts (Wiener, von Neumann, etc.) are in the US, boosting American mathematics.

  • 1945: Schwartz’s DistributionsLaurent Schwartz (France) develops distribution theory (generalized functions) around 1944–45, published 1948–1951. Distributions (like the Dirac delta “function”) extend analysis by allowing differentiation and integration of objects previously ill-defined. Together with Sobolev spaces, this provides a rigorous framework for generalized solutions of PDEs. It’s said “distribution theory has become the calculus of today” in 20th-century analysis, underscoring its foundational impact. Schwartz receives Fields Medal (1950) for this work[3].

  • 1948: Norbert Wiener’s Cybernetics and signal analysis – Wiener’s book (1948) synthesizes control theory, signal processing, and computing, relying on Fourier analysis and probability. Meanwhile, Claude Shannon (1948) applies Fourier analysis in information theory. These works popularize terms like harmonic analysis (for analyzing frequencies in signals) beyond pure math.

  • 1950: Atiyah & Singer (early career) and Hodge theory – While the famous Atiyah–Singer index theorem comes in 1963, the seeds in the 1950s show a merge of geometry and analysis: Hodge theory (Hodge 1941, but widely used 1950s) uses elliptic differential operators to connect topology and analysis. Analytic methods (harmonic forms, PDE) solve problems in algebraic geometry – a harbinger of global analysis on manifolds.

  • 1952: Calderón-Zygmund TheoryAlberto Calderón and Antoni Zygmund publish a seminal paper on singular integral operators (Acta Math 1952), founding modern harmonic analysis. Their results (Calderón-Zygmund decomposition, etc.) provide $L^p$ estimates for integral operators and revolutionize analysis on $\mathbb{R}^n$, with applications to PDE and ergodic theory. This establishes a trans-Atlantic school (Zygmund in Chicago mentoring many) and solidifies real-variable harmonic analysis as a central subfield.

  • 1954: Lax-Milgram lemma and functional analysis appliedPeter Lax and Arthur Milgram formulate the Lax-Milgram theorem (existence/uniqueness of solutions to certain variational problems) – a key result using functional analysis (Hilbert space theory) to solve boundary-value PDEs. This exemplifies the power of abstract analysis in solving concrete problems and becomes a staple in modern analysis education.

  • 1957: Sputnik shock – math education (New Math) – The launch of Sputnik spurs a push in the US for advanced math education. Under influence of Bourbaki and modern math, “New Math” reforms (1960s) introduce set theory and rigorous analysis early in schools. Although controversial and eventually pulled back, it shows the perceived foundational importance of analysis and structures in basic education.

  • 1960: Foundations of modern analysis (Dieudonné) – Jean Dieudonné (a Bourbaki member) publishes Foundations of Modern Analysis, a textbook epitomizing the Bourbaki style (starting from set theory and topology, building up through normed spaces to differentiation and integration in $\mathbb{R}^n$). This book helps shape how analysis is taught at advanced levels worldwide – emphasizing structure and generality over classical calculation.

  • 1962: Kolmogorov–Arnold–Moser (KAM) theorem – Solves a long-standing problem in dynamical systems (stability of integrable Hamiltonian systems) using a blend of analysis (Fourier series, iterative schemes) and geometry. Marks the growth of nonlinear analysis and dynamical systems as a field marrying analysis, topology, and mechanics.

  • 1964: Atiyah & Singer’s Index Theorem – Published in 1963–1968, the Atiyah-Singer theorem connects analysis (elliptic differential operators) and topology (index of operators). This result is a pinnacle of “global analysis” on manifolds – using tools like pseudodifferential operators (analyzed by Hörmander and others) and $K$-theory. It symbolizes analysis reaching deeply into pure geometry/topology.

  • 1967: Founding of the Journal of Functional Analysis (JFA) – Reflecting the maturity of functional analysis, JFA is founded by R. Schatten and others as a dedicated venue. Other new journals around this time: SIAM Journal on Applied Math (1953) for applied analysis, Journal of Differential Equations (1965), etc., indicate the specialization of “analysis” into subfields with their own communities.

  • 1970: Hörmander’s Linear Partial Differential Operators – Lars *Hörmander publishes the first volume of his treatise (1963–1985), synthesizing the theory of linear PDEs using distribution theory and introducing microlocal analysis (with colleagues like Mikio Sato developing hyperfunctions, Fourier integral operators). Microlocal analysis (ca. 1970s) refines Fourier analysis to study singularities of solutions. Hörmander’s work earns him the Fields Medal (1962) and becomes the reference for analysts in PDE.

  • 1975: Fefferman’s Fields Medal (and Szemerédi)Charles Fefferman (Fields Medal 1978) is recognized for work in multi-dimensional complex analysis and Fourier analysis (like convergence of Fourier series). The mid-1970s also see Fields medals to analysts tackling combinatorial or discrete problems by analytic methods (e.g. Szemerédi 1976 for using ergodic theory in combinatorics). This illustrates analysis techniques spreading into other areas.

  • Late 1970s: Dispersive and nonlinear PDE advances – E.g., Terence Tao’s mentor group (though Tao is later) builds on work by Jean Bourgain, etc., in dispersive equations and harmonic analysis. The nonlinear Schrödinger and KdV equations are studied with Fourier transform techniques, marking the rise of nonlinear Fourier analysis and soliton theory (integrable systems were earlier, but now analysis tackles non-integrable, chaotic systems too).

  • 1982: Grenoble Battle (Pure vs Applied) – In France, a famous debate arises about separating pure and applied math at universities (the Bourbaki-influenced pure mathematicians vs. applied analysts and others). This reflects institutional tension: analysis was a common ground for pure and applied, but increasing specialization and funding pressures caused rifts.

  • 1984: Wavelet theory beginnings – Stéphane Mallat, Ingrid Daubechies (Fields 1994), et al., develop wavelet analysis in the late 1980s, providing a new tool for harmonic analysis with wide applications (image compression, signal processing). Wavelets show analysis adapting to digital needs, connecting functional analysis, Fourier, and practical algorithms.

  • 1986: Nonlinear Analysis in Geometry – Yau’s Fields MedalShing-Tung Yau (Fields 1982) recognized for solving Calabi conjecture using nonlinear PDE methods (complex Monge-Ampère eq.). Simon Donaldson (Fields 1986) also uses analytical PDE (Yang-Mills equations) in topology. These underscore that by the 1980s, geometric analysis (using analysis to solve geometric problems) is a thriving approach.

  • 1991: Banach Center Term “Analysis” Expands – As new fields emerge (fractals, chaos, theoretical computer science), the term “analysis” further broadens. For example, fractal geometry uses measure and dimension theory (analytic concepts), blur the line with analysis. The Polish Banach Center in a 1991 conference on “Analysis” covers topics from classical functional analysis to dynamical systems, showing the term’s broad scope.

  • 1994: Proof of Fermat’s Last Theorem (and role of analysis?) – Andrew Wiles proves FLT using algebraic geometry. While primarily algebraic, techniques from analysis (modular forms theory involves complex analysis and harmonic analysis on groups) were crucial. This highlights interplay: analytic number theory and algebraic geometry combined in modern breakthroughs.

  • 2000: Clay Millennium Problems – analysis prominent – 7 famous unsolved problems are announced with $1M prizes. Several are analytic: e.g. Navier–Stokes regularity (PDE analysis), Riemann Hypothesis (complex analysis + number theory), P vs NP (not analysis), Hodge conjecture (analysis in geometry), Yang–Mills gap (analysis of PDE in physics). This shows how central analysis questions remain in math & physics.

  • 2002: Fields Medals to Analysts (V. Jones, etc.) – Over the years, a large share of Fields medalists are from analysis or partial-analysis fields: 2002 Laurent Lafforgue (algebraic geometry but draws on analytic number theory), 2006 Terence Tao (for harmonic analysis, PDE, additive combinatorics), 2010 Cédric Villani (optimal transport, kinetic theory – an analytical field), 2014 Martin Hairer (stochastic analysis/PDE). Analysis continues to drive cutting-edge research.

  • 2010s: High-dimensional and computational analysis – Rapid growth in high-dimensional analysis (e.g. compressed sensing – using $\ell^1$ optimization for sparse solutions, as pioneered by Candès, Tao, etc.), optimization theory (Nesterov, Nemirovski – blending convex analysis and computing), and machine learning theory (optimization and measure concentration). Analysis branches like harmonic analysis find new life in data science (Fourier = FFT in signal processing, wavelets in image compression). Optimal transport (Villani) connects analysis, geometry, and probability to solve economics and physics problems. The meaning of “analysis” now comfortably includes probabilistic and combinatorial tools (e.g. analytic combinatorics, use of complex analysis in computing combinatorial counts).

  • 2020: Analysis and COVID-19 modeling – As a real-world note, during the COVID pandemic, differential equation models (SIR models, etc.) and analysis-driven simulations guide policy. This public visibility of applied analysis echoes how “analysis” historically was tied to solving pressing problems (from Newton’s planetary motions to Fourier’s heat flow).

  • Present (2025): Analysis as 25% of mathematics (by MSC categories) – Modern classification (AMS MSC2020) dedicates roughly one-quarter of all math research categories to analysis[4]. Categories 26–49 encompass real analysis, complex, functional, differential equations, Fourier analysis, operator theory, calculus of variations, dynamical systems, etc. “Analysis” today denotes an ever-evolving toolkit of mathematical methods for continuous phenomena, and its semantic boundary is still expanding – now touching computer science (analysis of algorithms, continuous optimization in machine learning), data (harmonic analysis on graphs for network science), and beyond. The story of “analysis” is ongoing, with new chapters being written as mathematics responds to new challenges.

(This timeline highlights select people, works, and events illustrating shifts in the meaning and scope of “analysis.” It is not exhaustive, covering major trends per era.)

Main Report Link to heading

Introduction: What Is “Analysis”? Link to heading

Polysemy of “Analysis”: The word analysis in mathematics has never had a single fixed meaning. Its semantic evolution mirrors the development of mathematics itself. Originally, analysis (from the Greek analusis, “a loosening, undoing”) referred to a method of solving problems by breaking them down – an idea well established in ancient Greek geometry. Pappus of Alexandria (4th century AD) famously described analysis as reasoning “from the thing sought, as if it were given, to that which is known”, and synthesis as the reverse. Early modern mathematicians revived these terms: by the 17th century, “analysis” often meant the heuristic or discovery phase of mathematics, as opposed to the formal presentation of results (synthesis). For example, René Descartes in La Géométrie (1637) extolled analysis (using algebraic equations) to solve geometric problems, whereas a synthetic solution would then formalize it.

However, as Newton, Leibniz, and their successors developed the infinitesimal calculus, analysis gained new meaning. Newton used “analysis” both in the classical sense and to denote the new calculus methods. In a 1676 letter, Newton speaks of “the moderns’ ‘analysis’ which through the use of infinite series (‘infinite equations,’ as he called them) could solve almost all problems”. Here “analysis” stands for the entire arsenal of algebraic and infinite series techniques—in stark contrast to ancient synthetic geometry. Newton also pursued the “Analysis of the Ancients” (geometric problem-solving) and lamented that Descartes’ algebraic analysis lacked the elegance of synthetic geometry. Thus even within Newton’s work, we see analysis as a method of calculus (new analysis) and as a reference to classical problem-solving.

By the 18th century, analysis had largely become identified with calculus and its extensions. The term “infinitesimal analysis” was commonly used for calculus. Euler in 1748 defined analysis in his preface essentially as the study of functions by means of infinite processes. An English commentator might have said analysis is “the art of calculating infinitely small and infinitely large quantities,” reflecting the calculus-centric view. The blurring of analysis with calculus was such that in 19th-century Britain, the calculus was often taught under the title “the higher analysis.” For instance, when the Analytic Society of Cambridge (formed by Babbage, Herschel, Peacock in 1812) advocated using Leibniz’s calculus notation, they described themselves as promoting “analysis” in Britain.

Yet “analysis” as a distinct discipline truly crystallized in the 19th century when mathematicians like Cauchy and Weierstrass introduced rigorous definitions and theorems that separated analysis from the older intuitive calculus. This period introduced phrases still in use: “mathematical analysis” (or Analyse mathématique in French, Mathematische Analyse in German) meaning essentially the rigorous theory of calculus and its growth into function theory and differential equations. Cauchy’s Cours d’Analyse (1821) is a landmark explicitly framing analysis as a course/topic of study, covering series, continuity, derivation, integrals, etc., with careful proofs. After Cauchy, the term analysis in academic contexts firmly meant theoretical calculus (often both real and complex). For example, in the 1850s, the Berlin school under Weierstrass used “analysis” to include power series, complex functions, elliptic functions, and so on, while courses on, say, number theory or projective geometry were separate.

Thus, analysis transitioned from a method to a body of knowledge. By 1900, an mathematician would say: “My field is analysis” to imply expertise in things like real variables, complex analysis, differential equations, etc., as opposed to algebra or geometry. This coincided with the institutional adoption of analysis: universities had chairs or professorships in “analysis” (often called something like “Professor of Analysis and Mechanics” or “Professor of Pure Mathematics (Analysis)”).

What core concept unites the various meanings of “analysis”? The through-line is the focus on breaking problems into simpler pieces – whether it’s a problem-solving heuristic (ancient analysis) or decomposing a function into infinitesimal pieces (calculus integration) or expanding a function into a series (Fourier analysis) or expressing an operator in simpler components (spectral analysis). Analysis is about understanding the whole via limits of the small: sums of tiny pieces, limits of approximations, expansions into simpler functions. Even when analysis became an abstract enterprise (ε–δ definitions, function spaces), it was still centered on the idea of limits, approximation, and continuity – in short, on the continuum. Indeed, as one modern historian put it, the story of analysis is the story of humankind’s attempt to understand the continuum. Early debates on the nature of the continuum vs. the discrete (Zeno’s paradox, atomism vs continuity) eventually led to calculus, and then to rigorous real number theory. Thomas Sonar (2020) explicitly defines analysis broadly as “the human endeavor to understand infinity” – capturing the fact that dealing with infinite processes (infinitely many terms, infinitely small increments) is the hallmark of analysis.

Key Facets of Analysis (historical senses):

  • Analysis as Method vs. Analysis as Subject: Up to the 17th–18th century, one could speak of “the analytical method” (solving a problem by assuming a result and deducing backwards). By the 19th century, one speaks of “studying analysis” as a subject. This shift from method to content marks analysis becoming a branch of math in its own right.

  • Infinitesimal Analysis: A common phrase in the 18th and 19th century (used by e.g. Euler, Laplace) meaning calculus. The qualifier infinitesimal eventually dropped out as the field matured and alternatives (like epsilon-delta) replaced actual infinitesimals. But interestingly, in the 20th century, Abraham Robinson’s nonstandard analysis (1960s) reintroduced infinitesimals rigorously, calling itself a new form of analysis, though it remained a niche approach.

  • Analysis vs. Analytic: The adjective analytic can mean “obtained by analytic (power series) expansion” (e.g., analytic solution vs numerical), or in complex analysis, an “analytic function” means complex-differentiable (a term introduced by Cauchy/Weierstrass). Historically, “analytic” also referred to the use of algebraic symbols—as in “analytic geometry.” There is rhetorical power in calling something analytic: it implies a certain sophistication and generality.

  • Mathematical Analysis vs. Other Analysis: The term “analysis” has broader meanings in general language (e.g. chemical analysis, analytic philosophy). In the 19th century it was not always obvious if someone said “analysis” whether they meant mathematical analysis or something like logical analysis. To distinguish, writers would say “analytical mathematics” or “mathématique pure (analyse)” in French. Over time “analysis” unqualified has solidified to mean mathematical analysis within scientific contexts.

  • Pure vs Applied Analysis: The pure side focuses on theory and proofs (e.g. existence theorems, qualitative behavior), while applied analysis might solve specific equations or model physical phenomena. However, the line is blurry: historically, some of the deepest pure advances (Fourier series, distribution theory) were driven by applied needs (heat equation, quantum field theory). Rhetorically, purity or applied flavor influenced the definition of analysis at various times (e.g. Weierstrass’s pure epsilon-delta vs. engineers’ applied calculus).

Plan of the Report: In the chapters that follow, we trace the evolution of analysis through distinct eras, roughly century by century from Newton to the present, examining key shifts in definition, scope, and practice. We will see how “analysis” was contested and refined by major figures (from Newton and Euler to Hilbert and Hörmander), how it was institutionalized in curricula and publications, how its boundary with other fields changed (e.g. the absorption of set theory and topology into analysis, or the peeling off of probability as its own field before being reabsorbed via measure theory), and how external pressures (physics, war, computing) pushed the meaning of analysis in new directions. By the end, we aim to understand not just historical facts, but why “analysis” mattered: it served as a battleground for debates on rigor, a bridge between pure thought and practical application, and a unifying concept that showed the underlying continuity of mathematics. As Dieudonné (of Bourbaki) wrote, “Mathematics is a strongly unified branch of knowledge... at the center of our universe are found the great types of structures,” among which he included the core structures underlying analysis. The story of analysis is very much the story of forging that unity through the study of the continuum and the infinite.

From “Analysis vs. Synthesis” to Infinitesimals (Newton–Leibniz Era) Link to heading

In the late 17th century, the term analysis carried an inheritance from classical geometry but was also being reshaped by the burgeoning calculus. This chapter explores how Newton, Leibniz, and their contemporaries understood “analysis” in mathematics, and how the invention of calculus transformed that understanding between roughly 1670 and 1720.

Classical Origins: Mathematicians of the 1600s were well-versed in the idea of analysis versus synthesis from geometric classics. For example, Descartes in his Discourse on Method (1637) had extolled algebra (his new analytic geometry) as an analytic approach to geometry: a way to reduce geometry problems to equations. In doing so, he consciously echoed Pappus’s distinction. Descartes claimed the analytic method was superior for discovery, whereas synthesis was better for presenting a polished solution. This view influenced both Leibniz and Newton.

Leibniz’s Perspective: For Gottfried Wilhelm Leibniz, analysis was a broad concept encompassing symbolic calculation and discovery. He saw algebraic manipulation as a kind of blind (mechanical) reasoning – he used the phrase “analysis, or the art of solving problems, is the literal calculus”. Leibniz distinguished between analysis as a general problem-solving tool and synthesis as the geometrical construction or proof after the fact. In the realm of calculus, Leibniz referred to “infinitesimal analysis” or sometimes “transcendental analysis,” meaning the calculus of infinitesimals that extended algebra to the infinitely small. He also had another usage: analysis situs (analysis of position), which was an early idea of a geometry of position (what we might see as topology). So in Leibniz’s writings of the 1680s–90s, analysis could variously mean the calculus, algebraic methods generally, or even a new kind of geometrical analysis (analysis situs). This polysemy did not trouble Leibniz, who believed in the unity of these methods. Leibniz emphasized that analysis was complementary to synthesis – one discovers with analysis, then justifies with synthesis.

One illuminating anecdote: Leibniz wrote of his dream of a “characteristica universalis”, a universal symbolic language in which all reasoning (not just math) could be done by a sort of algebra – this reflects an analytic ideal of reducing reasoning to calculation. His calculus was a piece of that vision: a symbolic analysis for continuous change.

Newton’s Two Faces of Analysis: Isaac Newton had a more ambivalent, evolving view. Early in his career (1660s–70s), Newton was a champion of what he called “the new analysis” – essentially the algebraic and calculus-based approach introduced by Descartes and enhanced by infinite series. He worked on problems of tangents, curvature, and quadratures by expanding functions in series or using fluxional equations, proudly seeing himself “advancing the moderns’ analysis.”. The quote from Newton’s 1676 letter is telling: “by their help [infinite series] analysis reaches, I might almost say, to all problems.” Newton clearly uses analysis here to mean the new calculus methods that leverage infinite processes. This is Newton the analyst, in line with Leibniz, claiming the power of analysis.

However, in the 1680s, something shifts. Newton studies ancient Greek works (Pappus, Apollonius) more deeply. He becomes increasingly enamored with the synthetic geometry of the ancients. As a result, Newton begins contrasting the old analysis (geometry) favorably against the new analysis (Cartesian algebra). By the time he writes his Enumeratio (1704) and other later writings, Newton often criticizes “the algebraic artifice” of Descartes as being a forced, mechanical solution that lacks elegance. He praises the “Elegance of Ancient Geometry” and points out that the Greeks kept their analysis secret, presenting only synthetic proofs, perhaps because analysis was seen as not rigorous or as a private tool. Newton tried to reconstruct the lost “Analysis of the Ancients”, which he believed relied on more geometrical methods (like those involving projective properties of curves)[5].

So Newton’s use of the term analysis is complex: in youth, it’s the modern method of calculus; in maturity, it’s the hidden gem of Greek geometry. Newton even wrote a paper “On Analysis by Equations with an infinite number of terms” (his De Analysi, 1669, published 1711) which clearly uses analysis to mean series expansions—the cutting edge calculus approach. But he also wrote “On the Analysis of the Ancients”, extolling synthetic methods. Newton saw value in both: ideally, one would discover via analysis (either algebraic or inventive geometry) and then demonstrate via synthesis. In Newton’s famous phrase in Opticks (Query 31), he compares the analysis-synthesis method in math to that in natural philosophy: one analyzes phenomena to discover forces (analysis), then synthesizes to establish principles.

Analysis vs. Synthesis in the Calculus Priority Dispute: The Newton-Leibniz priority dispute (1700s) also had a nationalistic tinge that affected the meaning of analysis. The British mathematicians (following Newton) stuck longer to “geometrical fluxions” and were suspicious of the “foreign analysis” (i.e., Leibnizian calculus). Continentals (followers of Leibniz, like the Bernoullis and Euler) touted their “analytic” methods as superior. In 1712, the Royal Society report accusing Leibniz uses the phrase “art of analysis” referring to calculus and hints that Newton had it first. Meanwhile, Johann Bernoulli famously said of the British, “He who can integrate (solve) equations, he has the [true] analysis.” The implication: the Continent had the powerful analytic methods, while England was lagging with fluxions. Indeed, by mid-18th century, English mathematicians acknowledged their isolation and strove to import continental “analysis.” This led to the 1810s Analytical Society in Cambridge, which translated Lacroix’s calculus text and replaced fluxion notation with $d/dx$ notation. They explicitly framed it as bringing “the light of analysis” to England.

Early 18th Century Textbooks: The word analysis appears in many early titles, underlining how authors conceived it:

  • Newton (unpublished until later) Analysis per æquationes numero terminorum infinitas (1669): analysis via equations with infinitely many terms (power series).

  • Leibniz’s school: Wolff’s Beginner’s Book of Mathematical Analysis (German, 1710s) which was actually about algebra; analysis still used broadly.

  • The Bernoullis: Jacob Bernoulli’s Ars Conjectandi (1713) wasn’t analysis but in its preface he talks about the analytic art in probability.

  • Euler: Analysin Infinitorum (Analysis of the infinite, 1748) as mentioned.

  • Maria Gaetana Agnesi’s Instituzioni di Analisi (1748, Italian) – an early calculus textbook for students, again using analysis in title.

All these instances show analysis = calculus and allied algebraic methods.

Analysis vs. Geometry Dichotomy: By 18th century’s latter half, it became common to label works either “analytic” or “geometric.” For example, Euler’s works on mechanics (1760s) are analytic (using calculus), whereas some geometers like Etienne Bézout would produce “geometric” solutions to problems for didactic reasons. Lagrange outright said in 1788 that one could do mechanics “without the ugly scaffolding of geometry,” using analysis only – illustrating analysis was seen as a purer, more powerful approach. Analysis was often equated with using equations. A telling line: “Algebra is analysis, Geometry is synthesis,” as a stereotype.

Boundary of Analysis in 1700: If we asked a mathematician circa 1700, “what counts as analysis?”, they’d likely say: - The calculus of differentials and fluxions – yes, that’s the new analysis. - Infinite series expansions – yes, analysis. - Algebraic manipulation of equations (like solving polynomials) – borderline: they might call it algebra, but algebra was often subsumed under analysis as the general art of solving. - Classical synthetic geometry (Euclid, Apollonius) – definitely not, that’s synthesis. - Projective geometry (e.g. Pascal, Desargues) – they might consider that more geometry unless equations are used.

So analysis in this era was method-focused: if you used symbols and series, you were doing analysis.

Influence of Physics and Astronomy: A huge driver in this era was that analysis solved problems in astronomy and physics that geometry couldn’t. Newton’s law of gravitation and differential equations of motion demanded analysis (the calculus) to predict orbits – which he did synthetically in Principia to some extent, but the “analytical” approach eventually dominated celestial mechanics (Laplace, Lagrange in late 1700s). This success gave “analysis” enormous prestige. Analytic methods were synonymous with progress in mathematical physics. This was encapsulated by the French Académie des Sciences in the 1740s–1780s, where debates like “What is the true analytical solution of the problem of planetary perturbations?” took place. Euler, D’Alembert, Clairaut: all engaged in analytical treatments of mechanics.

Rigor (or lack thereof): Notably, the early analytic methods were not rigorous by modern standards. Infinitesimals were used freely without a clear foundation, series were used outside their radius of convergence, etc. But the attitude of the time valued power and success in solving problems over rigor. Bishop Berkeley’s famous pamphlet “The Analyst” (1734) criticized the logical foundations of calculus (calling infinitesimals “ghosts of departed quantities”), but even he titled it “Analyst” – acknowledging calculus practitioners as analysts. This critique did sow seeds of doubt, but it took another century for rigor to become a priority. Meanwhile, the term “analysis” wasn’t tarnished by these foundations issues; if anything, the successes overshadowed the logical gaps.

Analysis vs. Synthesis in Education: In the early 1700s, one still learned geometry via Euclid (synthetic) and algebra separately. By late 1700s, many curricula (especially in France) taught calculus as “Analyse”. For instance, the École Polytechnique (founded 1794) had a course called “Analyse” which included calculus and related analytical geometry. There was a conscious break from pure classical geometry. The revolution in France brought a push for utilitarian math education – meaning analysis took center stage as it was more useful for engineering and artillery, etc. So the educational system in France pivoted strongly toward analysis around 1800, largely thanks to figures like Monge and Lacroix.

In summary, the Newton-Leibniz era established two key legacies for the semantics of analysis: 1. Analysis as Calculus: The calculus became the prototype of what analysis is – dealing with continuous change via infinitesimals or limits, solving problems algebraically. 2. Analysis vs. Synthesis Rhetoric: The idea that there are two complementary ways in math – an analytic (often algebraic/calculus) way and a synthetic (geometric/logical deduction) way. This dichotomy persisted through the 19th century, often reframed (rigorous vs. intuitive, etc.).

Thus, by 1720, the European mathematical community had essentially redefined analysis from its ancient meaning to a new one: analysis = algebraic/calculus approach to solving problems. As we move to the next era (Euler and beyond), we’ll see this meaning further solidified and broadened, and we’ll also witness the early signs of analysis becoming a formal subject with expanding content.

Eulerian Expansion and Continental Normalization (1730s–1820s) Link to heading

In the 18th century, under the towering influence of Leonhard Euler and his contemporaries, mathematical analysis underwent explosive growth. This era “normalized” the use of analysis across Continental Europe – it became the standard approach in mechanics, astronomy, and most of higher mathematics. We examine how Euler and others expanded the content of analysis and how the term was used through the late 18th to early 19th centuries.

Euler’s Comprehensive Analysis: Euler (1707–1783) was the most prolific mathematician of the 18th century, and much of his work fell under what we call analysis. Euler had an exceptionally broad view: for him, analysis encompassed the study of functions, infinite series, differential equations, and more. His two-volume Introductio in analysin infinitorum (1748) aimed to lay foundations of analysis independent of geometry. In it, Euler famously declares that the study of analysis is, fundamentally, the study of functions. He introduces the concept of a function $y = f(x)$ in general terms, shifting away from viewing a function as an algebraic expression or geometric curve, to a more abstract notion that any formula or series defining a relation is a function. Euler also systematically develops the exponential and logarithmic functions, trigonometric functions (via infinite series and products), and lays out the idea of expansions in power series. This work is often cited as “laying the foundations of modern analysis.” Indeed, the Eulerian definition of a function and his habit of freely manipulating power series significantly shaped analysis going forward.

Euler’s other major contributions to analysis: - Infinite Series and Products: Euler made dazzling uses of series: e.g. the series expansion for $e^x$, the formula $e^{ix}=\cos x + i\sin x$ (establishing the fundamental bridge between exponential and trigonometric functions), and evaluating $\sum 1/n^2 = \pi^2/6$ via what we now call Fourier series methods. He introduced the Beta and Gamma functions (analytical continuation of factorial), the zeta function (basically inventing analytic number theory by linking it to primes), etc. Each of these extended the reach of analysis into new territory (special functions, complex series). - Differential Equations: Euler wrote Institutionum calculi integralis (3 volumes, 1760s) on solving differential equations. He treated series solutions, introduced integrating factors, and the notion of linear differential operators – all analytic techniques that became core to 19th-century analysis. - Calculus of Variations: Alongside Lagrange, Euler developed variational calculus (Euler-Lagrange equation, 1755). The very name Calculus of Variations suggests an analytic process (taking limiting variations). At the time, it was seen as a part of analysis (the term analytical mechanics in Lagrange’s 1788 work includes variational principles). - Analytic Mechanics: Euler’s Mechanica (1736) wrote Newtonian mechanics in differential equation form – a triumph of analysis applied to physics. Later, Lagrange’s Mécanique Analytique (1788) completed this trend. The “analytical” in Lagrange’s title explicitly means he’s doing mechanics purely via algebra/calculus, not geometrical diagrams. Thus analysis became synonymous with the cutting-edge methods in physics.

By Euler’s time, the meaning of analysis had thus broadened: it was no longer just differentiation and integration of simple functions, but included: - Generalized functions like trigonometric, exponential, logarithmic. - Infinite processes (series, products, integrals, continued fractions). - Differential equations and their series solutions. - Techniques like partial fractions, perturbation expansions, etc. It basically described the whole toolkit for dealing with continuous quantitative problems. Euler himself considered algebra as just a part of analysis (the finite part), while the calculus was the infinite part – both unified under analysis.

Normalization Across Europe: Euler worked in the Academy of Sciences in St. Petersburg and later in the Berlin Academy. Through these institutions and vast correspondence, Euler’s analytic methods spread widely. By mid-century, every major mathematical center in Continental Europe had embraced analysis: - In France, after some early resistance, by 1750s the likes of D’Alembert, Clairaut, and later Laplace and Lagrange, all adopted and advanced Eulerian analysis. The term “Analyse” was proudly used (e.g. by Laplace in describing his probability theory or celestial mechanics). - In the German states and Russia, Euler and the Bernoullis set the tone. Crelle’s Journal (Berlin, founded 1826) originally titled Journal für die reine und angewandte Mathematik – heavy emphasis on analysis in early volumes (with contributions from Abel, Jacobi in function theory). - In Italy, by late 1700s, analysts like Riccati and later Ruffini or Brunacci taught calculus known as analisi. - In Britain, however, analysis had stagnated post-Newton (due to the fluxional vs. Leibniz notation split). It wasn’t until around 1800 that the British fully joined the analysis mainstream. The Analytical Society (1810s) and figures like Herschel and Peacock helped update British education to continental analysis. By the 1820s, Cambridge’s curriculum was reformed to include continental analysis methods.

Analyse vs. Algebra vs. Geometry: Euler’s success did highlight a split: geometry was being left behind in cutting-edge research. Analytical geometry (Descartes) had been integrated into analysis by Euler, who freely used coordinate geometry but always via equations. Algebra was the skeleton of analysis – Euler’s analysis textbooks often began with algebraic identities and finite differences before going “to the limit.” But algebra as theory (e.g. polynomial equations theory, which later became group theory) was not yet separate; that happens mostly in the 19th century (with Galois, etc.).

So in Euler’s time, a pragmatic division might be: - Analysis (or higher analysis): everything involving calculus, infinite series, and advanced algebraic computation. - Elementary algebra and arithmetic: solving finite polynomial equations, number crunching – considered lower level. - Geometry: synthetic Euclidean geometry (still taught but seen as less potent for new discoveries). This is why many late 18th-century works used titles like “Elements of Algebra” vs “Elements of Analysis.” Euler himself wrote an Elements of Algebra (1765) as a separate basic text, whereas his analysis texts covered what we consider advanced topics.

Laplace and Fourier – Analysis as lingua franca of physics: By the turn of the 19th century, Pierre-Simon Laplace in France exemplified analysis’ power in his Traité de Mécanique Céleste (1799–1825) and Théorie Analytique des Probabilités (1812). Laplace’s work on planetary motion introduced spherical harmonics and potential theory – deeply analytical approaches (solving Laplace’s equation, etc.). He rarely drew a diagram, using generating functions and expansions. Likewise, Joseph Fourier’s treatise on heat (1822) applied trigonometric series to heat flow, presuming that any function can be expanded in a series – a bold analytical assertion. The success of Fourier’s method in solving the heat equation made Fourier analysis a permanent part of analysis (even though it took later work by Dirichlet, Riemann, Lebesgue to fully justify).

Fourier’s work also spurred a famous epistemological debate: he claimed any “arbitrary” function could be represented by a trigonometric series. Initially, many (e.g. Poisson) accepted this as generally true, but skeptics like Dirichlet later provided examples where Fourier series behave oddly. This started a line of questioning: what is a function? what functions are admissible in analysis? – leading to the rigorous definitions of the 19th century. But in Euler and Fourier’s era, these questions were secondary to formal calculation. They pragmatically assumed everything nice enough.

Did “analysis” include complex numbers yet? Yes and no. Euler used complex exponentials freely (e.g. $e^{i\pi} + 1 = 0$). Complex numbers were accepted as an analytical tool by the late 18th century (especially by Euler and the Bernoullis in solving polynomial equations or trigonometric integrals). The term “analysis” could include complex operations, but a formal theory of complex analysis (as a separate subject) really developed in early 19th (with Cauchy, Gauss’s 1831 paper on integral theorem, etc.). Euler did consider expansion of $\ln(1 + z)$, etc., for complex $z$. In 1746, d’Alembert articulated the fundamental theorem of algebra (every polynomial has a root, implying complex numbers are necessary) – this is an analytic statement linking algebra and analysis.

Education and Institutions: In the late 18th century: - The École Polytechnique (Paris), founded 1794, quickly became an elite training ground emphasizing analysis for engineering and science. Professors like Lacroix, Fourier (briefly), Ampère taught analysis. Their course notes/texts spread throughout Europe. - The École Normale (Paris) (briefly in 1795, reestablished 1808) trained teachers with an emphasis on analysis (to propagate the Revolutionary curriculum of rational mechanics and calculus). - Universities in Germany/Austria: e.g., in 1780s Vienna, 1790s Göttingen, analysis became part of standard higher education. The term “Calculus” might still be used, but increasingly they’d say “Analysis.” - Learned academies (Paris Academy, Petersburg Academy, Berlin Academy) often posed prize questions that basically required analytic solutions (e.g., the Borda Prize problems on Earth shape or planetary motion).

Printed Journals and Papers: By the early 19th century, we see journals with analysis in the title: e.g., Journal de l’École Polytechnique published many “analysis” articles (Lagrange on calculus of functions, etc.). The first issue of Crelle’s Journal (1826) had Abel’s work on functional equations (analysis), and though it was “pure and applied math,” in practice it was heavy on analysis and number theory. Another is the Italian Annali di Matematica Pura ed Applicata (started 1850s, but roots in earlier Italian periodicals) – again analysis was a key theme.

Summing up this era’s semantics: By 1820, analysis firmly meant: - The calculus in its various forms (differential, integral, variations). - The theory of series and expansions (including complex power series, etc.). - The solution of differential equations and other infinite-process problems. - The mathematical support of physical theories (elasticity, astronomy, optics etc. were all done “analytically”).

This was the golden age of analytic triumphs. The major concern wasn’t to rigorously justify everything (that comes next), but to solve problems. Therefore, analysis had a somewhat pragmatic meaning: it was what you used to solve continuous problems.

However, glimmers of the next phase appeared. For example, Fourier’s series paradoxes raised questions. Abel, near his death in 1829, criticized divergent series: “Divergent series are the invention of the devil,” presaging the drive for rigor. Cauchy is already present at this time (1820s) starting to enforce stricter reasoning. So as we cross into the 19th century proper, we see analysis ready to undergo a foundational reformation.

Finally, one shouldn’t overlook that in everyday language among educated people, analysis might also simply mean calculation or reasoning. For instance, one might refer to an astronomer doing an orbit computation as “performing an analysis.” The term had broad connotations of serious mathematical work.

In the next section, we’ll focus on how Cauchy and his peers around 1820–1860 changed analysis by injecting rigor and defining the field more sharply, essentially creating what we now call real and complex analysis as distinct sub-disciplines with formal definitions.

Cauchy’s Rigor and the Foundations of Real/Complex Analysis (1820s–1860s) Link to heading

The period from the 1820s to mid-19th century was transformative: Augustin-Louis Cauchy spearheaded a drive for rigor that fundamentally changed “analysis,” and the field bifurcated into what we now distinguish as real analysis and complex analysis, along with a growing concern for foundational issues (limits, continuity, convergence). This chapter examines how the meaning of analysis shifted with Cauchy, and how others like Dirichlet, Riemann, Abel, Bolzano, and Weierstrass further developed its foundations up to around 1860.

Cauchy’s Revolution (1821): Cauchy’s Cours d’Analyse (1821) is often regarded as the first modern analysis textbook. In it, Cauchy explicitly aimed to put calculus on rigorous footing. He introduced formal definitions for fundamental concepts: - Limit: “When the values successively attributed to a particular variable indefinitely approach a fixed value ... this fixed value is called the limit of all the others.” This $\epsilon$-$\delta$ style definition (without the symbols) was new to many readers; earlier authors used more intuitive language. - Infinitesimal: He defined an infinitesimal in terms of limits: a variable that becomes infinitely small is one whose limit is 0. By doing so, he tried to keep the useful intuitive idea of infinitesimal but anchor it in limit concept. - Continuity: Cauchy defined a function $f(x)$ as continuous if a small change in $x$ produces an arbitrarily small change in $f(x)$ – essentially a $\delta$-$\epsilon$ definition (though he phrased it with infinitesimals). - Derivative: via the limit of a difference quotient (as earlier), but with attention to existence. - Integral: Cauchy defined the definite integral as a limit of sums (the Riemann sum concept), which was a departure from viewing an integral primarily as an antiderivative. He actually gave the definition of what we now call the Riemann integral (though Riemann would refine it in 1854).

Cauchy’s emphasis was on rigor: “As for the methods, I have sought to give them all the rigor that one demands in geometry”. This was a direct response to critiques like Berkeley’s and to the general feeling that calculus had become logically shaky. Cauchy’s work didn’t eliminate all issues (he still allowed some arguments from generality of algebra, which Abel and others criticized), but it was a giant step.

Crucially, Cauchy’s approach also delineated the field: his textbook is explicitly an analysis course for future engineers and mathematicians at the École Polytechnique, covering series, real functions, complex numbers, and so on. The title “Analyse algébrique” (Algebraic analysis) on the cover indicates this was considered one part (the foundational, series and limits part) of analysis, distinct from say “transcendental analysis” (which would include integration, etc. – a Part II that Cauchy never published as such, though his Residue calculus and Exercises fill that gap).

Cauchy’s rigor helped cement analysis as a discipline with standards of proof, not just heuristic problem solving. Students now had to prove convergence of series, check error bounds, etc., not only solve equations.

Real vs. Complex Analysis Origins: In Cauchy’s era, the theory of functions of a complex variable also took shape. Cauchy himself made seminal contributions: his Cauchy Integral Theorem and Integral Formula (1825–27) gave birth to complex function theory – showing that $\oint f(z)\,dz=0$ under certain conditions, etc. Initially, this was just considered another part of analysis (not separate courses yet), often called “the theory of functions”. Cauchy’s 1820s research papers and later Bernhard Riemann’s work (1851 thesis on complex functions) turned complex analysis into a profound theory distinct from real analysis in flavor (emphasizing holomorphicity, contour integration, etc.). By the 1850s, you have books like Puiseux (1851) on complex functions, and later Riemann (1850s) linking complex analysis with topology (via Riemann surfaces).

So mid-century, one sees the term “analysis” sometimes split: e.g., a university might offer “Cours d’analyse (fonctions d’une variable réelle)” and a separate “Cours de la théorie des fonctions (d’une variable complexe)”. The latter might not always be called analysis, but in essence they are subdomains of the broad field of analysis.

Abel, Dirichlet, and Rigor in Series: Around the same time, Niels Henrik Abel (Norway) and Peter G. Lejeune Dirichlet (Germany) were reinforcing rigor: - Abel (1826) proved the convergence of the binomial series carefully and studied the convergence of power series—famously stating results about the radius of convergence. Abel also introduced the notion of uniform convergence in 1828 (though not with that name) in the context of interchanging limit operations, warning that term-by-term limit of series might not be continuous if not uniformly convergent. These were analytic truths going beyond Cauchy. - Dirichlet (1829) gave the first rigorous proof of the convergence of a Fourier series under certain conditions (piecewise monotonic functions). In doing so, he gave the modern definition of function mentioned earlier (mapping from reals to reals without a formula restriction). Dirichlet’s clarity on what a function could be (something like: any correspondence such that each $x$ has a unique $y$) was a shift – it allowed “general” (even discontinuous) functions into analysis, which earlier some (like Fourier or even Cauchy) assumed functions were nice and piecewise analytic. Dirichlet’s theorem on Fourier series (published 1829) essentially created the rigorous study of trigonometric series – a new part of analysis requiring careful limiting arguments and integration theory.

Emergence of Analysis as a Foundation for All Math: Interestingly, as analysis became rigorous, other fields started to borrow analytical ideas. For instance, number theory saw Dirichlet use analytical tools (like Dirichlet $L$-series in 1837 to prove existence of primes in arithmetic progressions), launching analytic number theory. Though number theory was considered separate (belonging to “arithmetic” traditionally), by using analysis it blurred lines. Similarly, probability – which Laplace had treated analytically (with generating functions and integrals) – got more analytical attention by Poisson and Chebyshev, who used limits and integrals to prove early limit theorems.

Institutional and Pedagogical Aspects (1820–1860): - The École Polytechnique continued to champion analysis: textbooks by Cauchy, then Gabriel Lamé and Joseph Liouville (both wrote on differential equations and harmonic analysis in mid-1830s). Liouville, who founded the important Journal de Mathématiques Pures et Appliquées in 1836, ensured analysis articles (including Sturm-Liouville theory on eigenfunctions, Liouville’s own complex analysis results) were prominent. - Universities in Berlin, Göttingen, etc., taught analysis with increased emphasis on rigor by the 1850s. For example, Weierstrass’s lectures in Berlin (from 1850s onward) were famous for their thorough $\epsilon$-$\delta$ approach; students would transcribe them into treatises – that’s how many learned proper real and complex analysis. - Cambridge and Oxford: by mid-century, thanks to the influx of ideas from the Continent (and people like William Thomson (Lord Kelvin) going to Paris then returning), the British also started adopting rigorous analysis. Though famously, even by 1860, British texts like De Morgan’s or Todhunter’s still lacked the rigor of Cauchy/Weierstrass, focusing more on computation.

Weierstrass and Arithmetization (1850s–60s): Though Weierstrass’ major publications came in the 1870s, his influence began earlier through lectures. Weierstrass insisted on building analysis purely from arithmetic properties of $\mathbb{R}$ (real numbers), avoiding any geometric or intuitive arguments (he distrusted reasoning based on pictures or physical intuition, calling them “geometers’ intuition”). This movement is known as arithmetization of analysis. It culminated in: - Precise $\epsilon$-$\delta$ definitions of limit, continuity, derivative, integral (refining Cauchy’s which sometimes still relied on infinitesimals or claimed “evidence”). - Construction of real numbers from rationals (done by others like Dedekind and Cantor in 1870s, but with Weierstrass’s encouragement – he lectured on irrational numbers by decimal expansions around 1860). - Emphasis on sequences (Weierstrass defined limit of a sequence rigorously; his notion of Cauchy sequence formalized convergence in metric terms). - Counterexamples and pathologies to show necessity of hypotheses (Weierstrass function of 1872 being the classic example that continuous $\not\Rightarrow$ differentiable).

The mid-century also saw Bernhard Riemann (1826–1866) contributing foundationally: - Riemann’s 1854 habilitation on the foundations of geometry indirectly influenced analysis by introducing topological ideas and integration on manifolds, etc. - In 1851, Riemann’s PhD thesis on complex variables introduced Riemann surfaces, giving a very geometric (but rigorous in its way) picture to multi-valued analytic functions. This was a different approach than Weierstrass’s: Riemann was more intuitive, relying on conceptual arguments (Dirichlet principle for example) that were not fully rigorous by Weierstrass’s standards (indeed, Weierstrass found a gap in Riemann’s reasoning on Dirichlet principle). - Riemann defined what we now call the Riemann integral (in an 1854 paper “On the representation of a function by trigonometric series”), creating a more generalized integration theory than Cauchy’s. He formulated the criterion for Riemann integrability (that the set of discontinuities has measure zero, though measure concept wasn’t formal yet). He also gave examples of weird functions (like the Dirichlet function which is 1 on rationals, 0 on irrationals, which is discontinuous everywhere and not integrable). These sharpen the concept of what is integrable or not – tightening the definition of analysis’s scope. - With Riemann and Dirichlet, Fourier analysis became rigorous (they set conditions when Fourier series converge). Yet, new questions arose: can one construct a continuous function that is nowhere differentiable? Weierstrass answered yes (1872), surprising many including Riemann’s teacher, Carl Gustav Jacobi, who thought any continuous function must be differentiable except maybe a few points. This result was part of what one might call the “vibrant culture of counterexamples” that solidified analysis – showing that intuition can be wrong and rigorous definitions matter.

Changing Boundaries with Other Fields: By 1860, analysis was absorbing set theory and topology beginnings. Although not recognized as separate fields yet, the work of Bolzano (on continuity, 1817), Riemann, and Cantor (started 1870s) would soon bring set theory language (points, sets, countability, etc.) into analysis. For instance, Cantor originally was studying Fourier series (an analytic topic) when he developed set theory to handle convergence issues. So set theory was born as a tool for analysis. Similarly, topology (analysis situs) as Poincaré and others developed in late 1800s was often to serve analysis (e.g. classifying Riemann surfaces, understanding continuity in abstract spaces).

Probability too, while largely separate as a field, had an analytic side in Laplace, and later Chebyshev (who introduced Chebyshev’s inequality in 1867) – which is an analytic type estimate. The formal merger of probability and analysis awaited measure theory (1900s), but the seeds were in these 19th-century works bridging them.

Summary of Semantic Shift by 1860: - Analysis by now unequivocally meant a rigorous mathematical field dealing with limits and functions. The looseness of earlier times was gone among leading mathematicians (though perhaps not in all textbooks yet). - There was now a concept of “pure analysis” – doing analysis for its own sake, as opposed to using it solely for solving physical problems. Weierstrass exemplified pure analysis; Riemann and Dirichlet, though motivated by some physics or geometry, ended up doing pure math. - The scope of analysis included: real analysis (real functions, series, integrals, differential equations on $\mathbb{R}$), complex analysis (holomorphic functions, contour integrals, etc.), Fourier and other integral transforms (harmonic analysis seeds), calculus of variations, and beginnings of differential geometry (which Riemann’s work touched, although later this becomes more geometric). - Not included in analysis were things like abstract algebra, number theory (except analytic number theory portion), synthetic geometry, and emerging things like logic/set theory (though set theory soon integrated). - Analysis had also subdivided: one starts to see specialists identifying as, say, “analysts (analystes)” versus “geometers” or “algebraists.” For instance, in France, Chasles (a geometer) and Liouville (an analyst) were contemporaries around 1840 with very different styles; a divide was apparent.

As we move forward, the next big change (1860s–1900) will be the full arithmetization and the birth of what we call modern analysis, including set-theoretic foundations, measure theory’s precursors, and further blur with other fields like number theory and topology.

Arithmetization, Set Theory, and the Rise of Function Theory (1860s–1900) Link to heading

By the late 19th century, the quest for rigor in analysis reached its culmination in the arithmetization of analysis – the reduction of analysis to purely arithmetic (and set-theoretic) principles, eliminating any remaining intuitive notions of motion or infinitesimals. Simultaneously, new subfields such as set theory and point-set topology emerged largely from analytical considerations, and “analysis” further expanded to include a rich theory of functions and integrals. This chapter explores how figures like Weierstrass, Cantor, Dedekind, Heine, Cantor, and Baire hardened the foundations of analysis and how the boundaries of the field shifted as a result.

Weierstrass’s Legacy: Karl Weierstrass (1815–1897) is often hailed as “the father of modern analysis”. He insisted that analysis be built upon an arithmetic understanding of the continuum. Key accomplishments and influences: - He gave rigorous $\epsilon$-$\delta$ definitions for limit and continuity in his lectures (some of this was already in Cauchy, but Weierstrass standardized it and removed reliance on infinitesimals completely). - Defined irrational numbers by series: Weierstrass taught constructing real numbers via decimal expansions or Cauchy sequences of rationals (though he didn’t publish a paper; his students like Cantor, and contemporaries like Dedekind, carried that to press). In effect, he helped answer the question, “What are real numbers, precisely?” – which was crucial to firming up analysis. - Proved the Intermediate Value Property rigorously (which Bolzano had earlier done in 1817, but unknown to most). - Developed the theory of uniform convergence (a term coined by his student Heine in 1872). Uniform convergence became a fundamental concept separating the interchange of limits from continuity, integration, differentiation conditions – an essential in real analysis courses. - Gave the example of a continuous nowhere-differentiable function (Weierstrass function) in 1872, which had enormous impact: it forced mathematicians to accept that intuition must be tempered by rigorous definitions, as surprising pathological examples do exist in analysis if one is not careful. - Work on power series: Weierstrass proved deep results like any function analytic in a neighborhood can be represented by a power series (if certain conditions are met), formalizing much of what had been heuristic in complex analysis. He introduced the concept of analytic continuation and natural boundaries, etc., making the theory of analytic functions rigorous and systematic. - Dirichlet’s Principle controversy: Weierstrass famously found a flaw in Riemann’s use of Dirichlet’s Principle (minimization argument lacking a proper minimum) and offered alternative proofs. This underscores a shift: what Riemann did with physical intuition (thinking of an energy functional finding a minimum), Weierstrass recast analytically by requiring a minimizing sequence’s existence (leading to his own Weierstrass Approximation Theorem on polynomials approximating continuous functions, 1885).

By around 1870, because of Weierstrass and his followers, the standard of rigor in analysis was set very high. No appeals to intuition – everything needed $\epsilon$-$\delta$ clarity. A quote often attributed to Weierstrass is, “You must never forget that, as long as we don’t prove it, a mathematical statement is an hypothesis, not a theorem.” This became ethos.

Arithmetization Achievements (1870s): The year 1872 was remarkable: - Dedekind’s cut (Continuity and Irrational Numbers, 1872): Richard Dedekind published the construction of real numbers by Dedekind cuts, ensuring $\mathbb{R}$ is complete (no gaps). This provided a clean arithmetic model of continuity, fulfilling the dream of arithmetization. Dedekind explicitly framed this as giving analysis a secure foundation (no longer needing geometric intuition about “continuous line” – you can build it set-theoretically from rationals). - Cantor’s theory of reals (1872): Georg Cantor published a paper on real numbers defined via Cauchy sequences of rationals (essentially equivalent to Weierstrass’s lecture content). In another 1872 paper, he introduced the concept of pointwise vs uniform convergence and sequences of functions – exploring Fourier series convergence deeply. Cantor’s work grew from analysis problems and soon became set theory. - Heine (1872): Eduard Heine formalized uniform convergence and also what we now call the Heine–Cantor theorem: a continuous function on a closed bounded interval is uniformly continuous (he was building on earlier notions by Dirichlet and Thomae). - Thomae’s function (1875): Carl Johannes Thomae gave an example of a function (the “popcorn” function: 0 on irrationals, $1/q$ on rationals $p/q$ in lowest terms) that is discontinuous at rationals and continuous at irrationals. This was another example of a strange beast in analysis (even though Riemann had an earlier example of a function discontinuous on rationals). These served pedagogical roles to refine understanding of continuity.

Set Theory and Topology as Children of Analysis: - Georg Cantor (1870s–1880s): Starting with trying to solve questions about uniqueness of Fourier series, Cantor invented set theory. He defined what it means for sets to have the same cardinality, proved the real numbers are uncountable (1874), and developed the theory of transfinite numbers by 1880s. All early examples in Cantor’s papers are subsets of reals – i.e., topics directly relevant to analysis (like Cantor set: a set of measure zero but uncountable, 1883). This gave analysts new language: “point set”, “derived set”, “perfect set”, “cantor set”, etc., to discuss convergence and continuity. By 1890, the phrase “set of points” in an analysis context was common. - Boundedness and Completeness: Discussions on completeness of reals (completeness is the property that every Cauchy sequence converges, or every bounded set has a least upper bound) became part of analysis foundations. This crystallized with Bolzano–Weierstrass Theorem – every bounded sequence has a convergent subsequence (Bolzano proved it 1817 unknown, Weierstrass popularized via lectures). That theorem is now core to real analysis.

  • Metric and Topological notions: Although Maurice Fréchet would coin the term metric space only in 1906, 19th-century analysts already thought in those terms for $\mathbb{R}^n$. For instance, the idea of an open set (neighborhood) was used implicitly by Bolzano and explicitly by Cantor. Osgood in 1890s and others start formalizing these topological notions for function convergence and continuity.

Function Theory (Complex Analysis) Flourishing: After Cauchy and Riemann, complex analysis matured further with Weierstrass: - Weierstrass’ factorization theorem (expressing entire functions as products, 1870s). - Understanding of singularities, classification into essential, pole, removable (Weierstrass and Casorati). - The concept of analytic continuation and natural boundaries (had roots in Riemann’s multi-valued approach and Weierstrass’s power series approach). - By 1890, Felix Klein and Sophie Kowalevski etc. were applying complex analysis to solve real differential equations (like Kowalevski’s work on spinning top used complex function theory). - Journals like Acta Mathematica (founded 1882 by Mittag-Leffler) published many complex analysis papers; Journal für Math (Crelle) and Mathematische Annalen likewise.

Analytic vs. Synthetic Geometry Revisited: At the end of 19th century, the old feud had new forms. Projective and algebraic geometry had advanced (in synthetic style by Pappus’s followers and in analytic style by Plücker, etc.). However, a new “synthetic” competitor to analysis emerged in the foundation crisis: debates between analysts (Cantor, Dedekind, Hilbert) and intuitionists (Kronecker, later Brouwer). Kronecker famously said, “God made the integers, all else is the work of man,” objecting to Cantor’s set theory and continuous uncountable sets. This was, in a sense, a challenge to analysis: are we sure those analytical notions like uncountable sets are meaningful? While that fight would peak in early 20th century, it started with Kronecker (1880s) decrying Cantor’s “analysis of the infinite” as metaphysics.

Broadening of “Analysis”: By 1900, analysis encompassed: - Classical analysis: Real and complex function theory, differential/integral calculus basics. - Theory of integrals: Riemann integration, improper integrals, criteria for integrability. - Differential equations: Both ordinary and partial differential equations (via Fourier, etc.) – although PDEs were often seen also as “mathematical physics,” they were taught in analysis courses (e.g. in France, courses d’analyse would include solving heat or wave equations). - Fourier series and harmonic analysis: Now quite developed with Dirichlet’s conditions, Fourier integrals (integral transforms were used by Hankel, etc., late 1800s). - Calculus of variations: By 1870s revived (Weierstrass’s work on sufficiency conditions, etc.). Variational principles were considered part of higher analysis. - Analytic number theory: thanks to Riemann (zeta function), Dedekind/Dirichlet (L-series), Hadamard and de la Vallée-Poussin (who would in 1896 use complex analysis to prove Prime Number Theorem). This remained a niche connecting field, but by 1900 results like PNT strongly validated analysis as a tool in pure number theory. - Potential theory: The study of harmonic functions (solutions to Laplace’s equation) was a meeting ground of analysis and geometry. It was pursued by Gauss (1840), Riemann (1850s), then steadily by others as part of both analysis and physics. - Set theory and beginnings of topology: While these would become separate subjects, around 1900 they were often taught within analysis or as appendices to analysis texts, because their motivation was to clarify analysis concepts (continuity, measure zero sets, etc.).

Education & Texts c. 1900: - We see analysis textbooks fully reflecting rigor: e.g., Goursat’s Cours d’analyse (French, 1902) is an example of a rigorous text influenced by Cauchy/Weierstrass, widely used. - Felix Klein in his famous Erlangen program address (1872) had a broad definition of geometry, which interestingly demoted the analysis-vs-geometry distinction (he categorized geometries by symmetry groups). But analysis in curricula remained separate: one learned analysis (real/complex) then maybe an applied analysis (like potential or Fourier), and separately geometry or algebra. - University chairs: by now, major universities often had distinct chairs of analysis, geometry, etc. For example, in Göttingen, Hilbert took over Klein’s position but Hilbert himself straddled analysis and other fields (Hilbert’s early work was integral equations – analysis; later also algebraic number theory and geometry axioms). - Doctoral training: Weierstrass supervised many (incl. Sonya Kovalevsky) in analysis; so did Hilbert and others slightly later.

Rigor Mortis? By 1900, some feared analysis had become too abstract and removed from intuition. This partly fueled alternative approaches: e.g., Poincaré often criticized excessive rigor that gave no new insights, though he himself used and contributed to analysis (celestial mechanics, topology). Another partial reaction was the rise of asymptotic and numerical analysis for practical needs (engineers needed expansions for approximations more than $\epsilon$-$\delta$ proofs). While this was more a shift in focus, not a philosophical rebellion like intuitionism, it did mean the meaning of analysis had to accommodate both pure rigor and pragmatic approximation techniques. Terms like “applied analysis” or “numerical analysis” were entering usage (for instance, Lord Rayleigh’s work on sound (1877) or stability of traveling waves involved heavy asymptotic expansions – considered analysis, but not in the Weierstrassian style).

In summary, the period 1860–1900 solidified the identity of analysis as the science of functions and limits, built on arithmetic and set theory. The boundaries of analysis now firmly encompassed things like: - Continuous functions (with or without formulas), - Infinite processes (series, integrals) grounded in rigorous convergence tests, - Abstract constructs like function spaces (though that term wasn’t yet used – except for occasional notion like Cantor’s space of sequences, etc. – but soon will be in 20th century), - General principles like completeness, compactness (the latter concept was formulated by Borel in 1895 for countable covers in $\mathbb{R}$ context).

As we turn to the 20th century, analysis is poised to further branch out: measure theory, functional analysis, abstract harmonic analysis, etc., and also to be challenged by new foundational philosophies (intuitionism, formalism) and computational demands. But it enters the 1900s as a mature, rigorously defined domain of pure mathematics with broad applications, arguably the central pillar of mathematical research at that time (Hilbert’s problems attest to that, with many analysis-related problems listed).

Measure, Integration, and Early Functional Analysis (1900–1930) Link to heading

The turn of the 20th century saw another leap in the evolution of analysis: the introduction of measure theory and the Lebesgue integral, which generalized the concept of integration and resolved many of the shortcomings of the Riemann integral. In parallel, the nascent field of functional analysis emerged, treating functions as points in infinite-dimensional vector spaces, and analysis extended into new realms like abstract topological spaces and rigorously developed probability theory. This chapter focuses on 1900–1930, highlighting Lebesgue, Borel, Hilbert, Banach, Fréchet, and Kolmogorov and examining how “analysis” expanded its meaning through their contributions.

Lebesgue’s Measure and Integral (1902): Henri Lebesgue (1875–1941) fundamentally changed integration. By 1900, as noted, Riemann’s integral was standard but had well-known limitations: it couldn’t handle certain highly discontinuous functions (like the Dirichlet function, or oscillatory functions with dense discontinuities), and it had awkward convergence theorems (one often needed uniform convergence to swap sums/integrals). Lebesgue’s solution was to invent measure theory: - A measure generalizes length/area/volume to very irregular sets. Lebesgue in his 1902 thesis defined what is now called Lebesgue measure on $\mathbb{R}^n$, which assigns a non-negative number to any “measurable” set, consistent with our usual length for intervals and countably additive over unions. - Using this, he defined the Lebesgue integral as the integral of simple functions (step functions) approximating the given function. In effect, instead of partitioning the domain into intervals as Riemann did, Lebesgue partitions the range of the function into slices and measures the size of preimage of those slices. - The Lebesgue integral could integrate functions that Riemann could not, and major theorems became cleaner and more powerful: monotone convergence, dominated convergence theorems, Fubini’s theorem for double integrals, etc., all appeared in Lebesgue’s work or soon after.

The impact was immediate and huge: Lebesgue answered “most of the questions that had been asked of integration” and resolved the long-standing debate about the “correct” notion of integral. It extended Fourier analysis: now one could legitimately talk about Fourier series or integrals converging in $L^2$ or $L^1$ senses. As an encyclopedia notes, “In 1902 Lebesgue broadened the scope of integration far beyond Riemann’s… his new integral could handle the pathological cases… the Lebesgue integral had far fewer cases where integration was not the inverse of differentiation”. This strengthened the Fundamental Theorem of Calculus for a much larger class of functions.

Lebesgue’s ideas spread rapidly. In 1908, he published “Leçons sur l’intégration et la recherche des fonctions primitives” (Lessons on Integration and the Search for Antiderivatives), and by the 1910s, courses on “advanced analysis” would include measure and Lebesgue integration. Notably: - Émile Borel and René Baire were precursors: Borel (1890s) had begun defining measure for special sets (like sets of Cantor type) and Baire (1899 thesis) had studied functions by category (Baire categories). Their work plus Lebesgue’s formed what is now descriptive set theory and measure theory, blending into analysis. - Soon after Lebesgue, measure theory got axiomatized by Carathéodory (1914) who gave the modern definition of outer measure and measurable sets – often taught in measure courses now.

Function Spaces and Functional Analysis Beginnings: - The concept of treating sets of functions as spaces with structure took off. David Hilbert (1862–1943) between 1902–1912 studied integral equations (inspired by physics and potential theory problems). In doing so, he considered spaces of square-integrable functions (now called $L^2$ spaces) and discovered these spaces have an inner product and behave like infinite-dimensional Euclidean spaces. By 1904, Hilbert could talk of an “infinite matrix” of a linear operator and diagonalize it (leading to the spectral theory for compact operators, analogous to diagonalizing symmetric matrices). These ideas were published e.g. in Hilbert’s 1904/1906 papers, and later termed Hilbert spaces (a term coined by Fréchet or Riesz around 1920). Hilbert’s work basically launched functional analysis, though the name came later. - Meanwhile Fréchet (1906) defined metric spaces abstractly[6], and in 1907 his thesis defined abstract functionals and studied spaces of functions (even before Banach). Fréchet’s metric space notion was broad: analysis was no longer confined to $\mathbb{R}^n$ or $\mathbb{C}$, it could be done in any space where a notion of distance (and hence limits) is defined. This is analysis merging with topology: indeed by 1914, Hausdorff defined topological spaces. - Hahn and Banach: By the 1920s, modern functional analysis was formalized. Stefan Banach and others (in Poland’s Lwów school) took spaces of functions (with norms like $|f|_{\infty}$ or $\int |f|$) as objects of study. Banach’s seminal 1922 thesis gave the general concept of a Banach space (complete normed vector space)[2] and covered the Hahn–Banach theorem (1927) extending linear functionals. Also, Banach & Stefan Mazur’s work, and Riesz’s theorem (representation of linear functional on $L^2$ by integrals) in 1907 all happened in this era. - By 1932, Banach published Théorie des opérations linéaires, which systematically treated functional analysis (almost the first book solely on that). The very term “functional analysis” (coined earlier by Fréchet in 1910s) indicated analysis had expanded to study functions-as-points in their own right, not just pointwise properties but collective properties (like completeness, convexity, etc. in function spaces).

All these extended what “analysis” meant. While earlier analysis was largely about functions on $\mathbb{R}$ or $\mathbb{C}$, now abstract analysis in metric or normed spaces was part of the field. It’s telling that Nicolas Bourbaki (the collective group starting 1930s) considered “Functions of a Real Variable” and “Topological Vector Spaces” as core parts of the Éléments de Mathématique series. They treated these as part of the edifice of analysis (though Bourbaki avoided the word analysis in titles, they covered its content).

Convergence of Probability and Analysis: - Andrey Kolmogorov in 1933 axiomatized probability theory on measure-theoretic foundations. Even slightly before (1910s-20s), one sees analytic measure theory in works of Émile Borel and Henri Lebesgue applied to probability. E.g., the law of large numbers, central limit theorem – these got rigorous proofs through measure-based probability. Kolmogorov’s Grundbegriffe (1933) explicitly casts probability as an application of Lebesgue integration: a probability space is a measure space with total measure 1, expected value is an integral, etc. This integrated probability fully into analysis. As some authors noted, Kolmogorov’s achievement can be compared to Euclid’s Elements for geometry: it made probability a proper sub-branch of (applied) analysis. After that, probabilists speak of $L^p$ spaces, almost sure convergence, etc. – clearly measure-theoretic analytic concepts.

  • Ergodic theory (Poincaré 1890s, Birkhoff 1931 ergodic theorem) also merges probability, analysis, and dynamics. Birkhoff’s ergodic theorem is essentially a statement in measure theory about time averages vs space averages. This further solidified that analysis now also covered the nascent theory of stochastic processes and dynamical systems in a measure sense.

Applied Analysis (Differential Equations, Approximation, Numerical methods): 1900–1930 also had analysis in service of physics with new rigour: - Partial Differential Equations: The Hilbert school attacked integral equations which in turn solve PDEs (Fredholm theory). Courant and Hilbert’s classic Methods of Mathematical Physics (Vol I 1924, Vol II 1926) show a blend of analysis (functional analysis, Fourier series) with boundary value problems. This helped establish PDE analysis as a systematic field, not just a bag of tricks. - Calculus of Variations: Hilbert’s Problem #23 was about extending the calculus of variations. During this era, Leonida Tonelli, David Hilbert and Emmy Noether, and later Elie Cartan contributed to variational calculus with functional-analytic tools (direct method in calculus of variations was developed by Tonelli using compactness in function spaces). - Asymptotic analysis: As physics moved to new domains (like quantum mechanics in the 1920s), asymptotic expansions and special functions remained crucial. Books like Watson’s Theory of Bessel functions (1922) testify to how analysis of special functions matured (the term “analysis” in that sense meaning the study of particular analytic functions solving differential eqns). Analysts like E.T. Whittaker and G.N. Watson would still consider themselves doing analysis while producing tables of integrals or asymptotic expansions. - Numerical Analysis Institutionalized: With the advent of mechanical and electrical calculating machines (Bush’s differential analyzer ~1931, etc.), error analysis became important. In 1928, John von Neumann used functional analysis to study matrix methods for systems of linear equations (rudiment of numerical linear algebra). By late 1930s, one sees first inklings of iterative method analysis. But as an academic field, numerical analysis blossomed more after WWII with digital computers, though its roots in interpolation theory (Runge, 1901 on polynomial interpolation error) and finite differences (Courant’s 1928 finite difference scheme for minimal surfaces) were laid earlier.

Changing Attitudes and Education (1900–1930): - By 1900, it was widely accepted in curricula worldwide that one starts analysis with $\epsilon$-$\delta$ rigor (though some places took time to adapt; in the US, for example, the influence of Harvard’s Osgood and Chicago’s E.H. Moore slowly propagated rigorous analysis teaching in early 20th century). - The division into subfields: Many universities had separate courses in “Theory of Functions of a Complex Variable”, “Real Variable Theory”, “Integral Equations”, etc. The term “analysis” might encompass all, but often was shorthand for real analysis unless otherwise qualified. - Societies and journals: The London Mathematical Society and American Mathematical Society by 1910 had many analysis articles (Amer. Math. Society’s Transactions launched 1900 published e.g. plenty on differential equations and integrals). Journal “Annals of Mathematics” in the US became prominent publishing analysis among other things (Luxemburg’s integration, etc.). The German Mathematical Society (DMV) conferences often featured analysis problems (like Hilbert’s problems in 1900). - We also see the first “problem sets” for analysts: e.g. the Scottish Café problems (Lwów, 1930s, Banach’s group) which read like a list of conjectures in functional analysis – demonstrating analysis had become creative and challenging at a high level, fueling new research directions (many solved decades later).

Summary of semantics by 1930: “Analysis” by 1930 included: - Classical real and complex analysis (differentiation, integration, series). - Measure and Integration Theory – an essential part of the definition of analysis now. - Functional Analysis – though a new term, it was seen as part of pure analysis. - Differential Equations and Fourier/Harmonic Analysis – these remained central, now treated with the new tools (Fourier transform integrals formally defined by Lebesgue integration, etc.). - Analytic Number Theory – firmly a branch of analysis after Hadamard & de la Vallée-Poussin’s 1896 prime number theorem (using complex analysis). - Probability – arguably became a branch of applied analysis (often taught by analysts or in analysis textbooks as an application of measure). - Discrete vs continuous boundary: Interestingly, analysis even started informing combinatorics (analytic combinatorics uses generating functions and complex analysis to count discrete structures; though not yet a named field in 1930, precursors like Hardy-Ramanujan’s asymptotic for partition numbers in 1918 using complex contour integrals show it). - Differential Geometry – still often separated as “geometry,” but the work of people like Elie Cartan used analysis heavily (moving frames, solving PDEs for structure equations). By mid-20th century, the term “global analysis” would appear for analysis on manifolds (but in 1930, that was not yet delineated – it was just considered part of geometry or mechanics). - Rigor vs intuition vs formalism: The foundational issues led to schools: classical analysts largely ignored Brouwer’s intuitionist approach (which peaked ~1910-1920) because they found it restrictive (no nonconstructive proofs, etc.). Hilbert’s formalism defended classical analysis from these criticisms. In education, classical approach (Hilbert/Bourbaki style axioms) prevailed over intuitionism by and large.

Thus, by 1930 analysis had become immense in scope, and arguably the unifying language of all continuous mathematics. It had also acquired a more abstract flavor: an analyst might just as well prove a theorem about all separable metric spaces as solve an integral. The identity of analysts as a community broadened – one could be an “analyst” specializing in, say, Banach spaces or in complex function theory or in PDE, and they’d all consider themselves under the big tent of analysis.

Probability, Distributions, and Modern PDE (1930–1950) Link to heading

Mid-20th century analysis was marked by both synthesis and expansion. Probability theory was fully subsumed under analysis through measure theory, distribution theory (generalized functions) was developed to address limitations in classical analysis for differential equations, and the field of partial differential equations (PDEs) entered a modern phase using these new tools. Additionally, wartime needs and the dawn of computers started to shape “applied analysis” as a systematic discipline (e.g., numerical methods, optimization). In this chapter we examine roughly 1930–1950, focusing on contributions by Kolmogorov (in probability), Schwartz and Sobolev (in distributions and functional spaces), and John von Neumann, Norbert Wiener, Stanislaw Ulam (analysis in war and computation contexts), as well as how these influenced the notion of “analysis.”

Probability as Analysis (Kolmogorov’s Axiomatization 1933): We’ve touched on this above: with Kolmogorov’s Grundbegriffe, probability was no longer a philosophically separate endeavor about randomness; it became an application of integration. Key consequences: - Almost every result in probability theory could be rephrased and proved as a theorem in analysis (often measure or functional analysis). For example, the law of large numbers became an instance of the ergodic theorem or a statement about $L^2$ convergence. - New probabilistic processes (Markov chains, Brownian motion) were studied by analytical means. Norbert Wiener in 1923 had already constructed the Wiener measure (the distribution of Brownian motion paths) as a measure on function space – an infinite-dimensional analysis problem. This kind of work continued with Kolmogorov and others, leading to Ito’s stochastic calculus in 1940s (which again is analysis, developing integrals with respect to stochastic measures). - Probability’s inclusion expanded analysis’s scope into stochastic analysis. For example, in 1944, Doob’s martingale convergence theorems used measure theory cleverly. Many analysts who might have worked on classical real analysis moved into probability, or vice versa, with no conceptual barrier now.

Schwartz’s Distributions (1945): Laurent Schwartz (1915–2002), building on earlier ideas by Sergey Sobolev (Russia, 1936) and others, developed the theory of distributions around 1945–1950. A distribution (generalized function) is a continuous linear functional on a space of test functions (like smooth functions of compact support). This provided rigorous meaning to objects like the Dirac delta $\delta(x)$ (which is not an actual function, but a distribution). Why was this needed? - Classical analysis struggled with derivatives of non-differentiable functions or solutions of PDE that were not classically differentiable. For instance, the fundamental solution of the wave equation might be non-smooth or a measure; distribution theory can handle that. - In physics, Dirac (1930) introduced the delta “function” to simplify equations. Before Schwartz, analysts either brushed it aside or tried ad-hoc justification. Schwartz gave a firm footing: $\delta$ is a distribution defined by $\delta(\varphi) = \varphi(0)$ for any test function $\varphi$. - Distribution theory made every linear PDE theoretically solvable (at least in the sense that it has a solution in the distribution sense, under mild conditions like existence of fundamental solutions). This revolutionized PDE theory from the classical approach (which required strong solutions). - Schwartz’s work (for which he got the Fields Medal in 1950[3]) is said to have provided “a revolutionary new approach to partial differential equations”, making distribution theory “the calculus of today.”. Indeed, distributions allow differentiation and Fourier transform to be extended to all distributions, massively generalizing classical analysis.

For example, the derivative of the Heaviside step function (0 for negative, 1 for positive) is the Dirac delta in distribution sense – such statements are rigorous now. The Fourier transform can be applied to polynomially growing functions (as tempered distributions), which enlarged the applicability of harmonic analysis (leading to breakthroughs like Hörmander’s work on Fourier integral operators in 1960s).

Sobolev Spaces (1930s): Sergei Sobolev introduced what are now called Sobolev spaces $W^{k,p}$ in 1930s while studying variational problems in the Soviet aerospace context. These spaces consist of functions with weak derivatives up to order $k$ in $L^p$. Sobolev spaces bridged functional analysis and PDE. A key outcome was the Sobolev embedding theorem, giving conditions under which a function in a certain space has a continuous representative, etc. The concept of weak solution to a PDE – one that satisfies the integral form rather than classical differentiation – became central. This concept relies on distributions or at least integration by parts in Sobolev spaces, and became a standard approach to PDEs after 1950. So by mid-century, “analysis” in the context of PDE largely meant functional analysis and Sobolev-space methods, rather than classical series or transform tricks (though those remained too).

Applied and Computational Analysis (Wartime and Postwar): - John von Neumann contributed analysis in quantum mechanics (he axiomatized quantum theory with Hilbert spaces in 1932) and in computing. During WWII, he worked on approximating nonlinear PDEs (like shock fronts in fluids) and co-developed the Monte Carlo method (random sampling for integrals) with Ulam – blending probability and analysis for computing. He also was pivotal in the development of the first electronic computers and recognized the importance of numerical stability (his analysis of the discrete schemes for weather prediction, c. 1946, gave the famous CFL condition in numerical PDE). - Stanley Wiener (1894–1964) – Norbert Wiener’s work on cybernetics and harmonical analysis deserves mention. In 1942, Wiener published Extrapolation, Interpolation, and Smoothing of Stationary Time Series (theory behind the Wiener filter), heavily using Fourier analysis and probability (power spectral density etc.). He effectively merged harmonic analysis with stochastic processes, an area now called signal processing – which is applied analysis. His earlier work, e.g. Generalized Harmonic Analysis (1930), extended Fourier analysis to arbitrary Abelian groups, preluding abstract harmonic analysis (which was codified by Pontryagin, 1930s, and later by others). - Optimization and Control: During the war and after, calculus of variations was revived in new forms: optimal control theory (initiated by Pontryagin in 1950s) and linear programming (which uses convex analysis, a part of analysis albeit finite-dimensional). Also, in 1940s, Kantorovich in USSR developed functional analytic methods for optimization and for numerical analysis (getting a Nobel in Economics eventually, but clearly working in applied analysis).

Splintering and Unification: - The mid-20th century also saw analysis splinter into many sub-disciplines, each with specialized techniques and journals: e.g., harmonic analysis, several complex variables (distinct from one complex variable), functional analysis, operator theory, ergodic theory, numerical analysis, etc. Yet, these all fell under the broad tent of “analysis.” - Institutions reflected this: Some math departments created distinct “analysis” seminars for real/complex analysis, “applied mathematics” for numerical/PDE, etc. But often there was cross-fertilization (someone like Zygmund, a pure harmonic analyst, interacting with applied people on Fourier method issues). - In 1948, the first Fields Medal given to a woman was to Kovalevsky (Kovalevskaia) but she had died; the first living analyst medaled was Laurent Schwartz (1950). Others soon after: Atle Selberg (Fields 1950) for analytic number theory (the Selberg trace formula uses harmonic analysis on groups), Kiyoshi Ito (not a Fields but huge contributions to stochastic analysis, 1940s). This period proved analysis was the core of many recognized advances.

Academic Shifts: By 1950, a typical graduate program in math required thorough real and complex analysis training (often using textbooks like Titchmarsh for real analysis 1948, Ahlfors for complex 1953, or soon Royden 1968 for measure theory). The influence of Bourbaki also peaked mid-century – Bourbaki’s Integration volume (1949) made abstract measure theory canonical, and their Topological Vector Spaces volume (1953) became the bible of functional analysts. Some criticized Bourbaki’s style as too austere and unmotivated, but it undeniably shaped the education of analysts for a generation.

In summary, by 1950 analysis had assimilated many new ideas: it was not just about classical functions or solving integrals, but about general spaces, operators, probability measures, distributions, etc. The rhetoric of analysis vs synthesis (from centuries earlier) was largely obsolete, replaced by analysis vs algebra vs topology as main categories. Yet even these lines blurred: for instance, Grothendieck in 1950s applied functional analysis flair to algebraic geometry, and conversely algebraic techniques (like group representations) to analysis.

The forthcoming period (1950–1975) will see harmonic analysis flourish (Calderón-Zygmund etc.), further development of global and microlocal analysis (Hörmander, Atiyah-Singer), and interplay with other domains like number theory and combinatorics (via analytic methods). We now turn to that.

Harmonic Analysis, Operator Theory, and Microlocal Analysis (1950–1975) Link to heading

During the postwar decades, analysis experienced further specialization and deepening. Harmonic analysis – the art and science of representing functions as superpositions of basic waves or characters – underwent a renaissance, particularly in the real-variable direction pioneered by Calderón and Zygmund. Operator theory matured, connecting functional analysis with algebra and physics (notably via von Neumann algebras and $C^$-algebras). At the same time, microlocal analysis* emerged as a refinement of Fourier analysis to study localized frequency behavior, crucial for understanding singularities of PDE solutions. In this chapter, covering roughly 1950–1975, we highlight key developments and their impact on the concept of “analysis.”

Calderón–Zygmund and Real Harmonic Analysis: - In 1950s, Alberto Calderón (Argentina) and Antoni Zygmund (Poland/US) collaborated at the University of Chicago, producing seminal work on singular integrals. Their 1952 paper “On the existence of certain singular integrals” gave conditions under which singular integral operators (like principal value integrals) are bounded on $L^p$ spaces. This work generalized classical Fourier series convergence issues and provided tools to attack PDE in a new way – by understanding the boundedness of convolution operators with singular kernels. - They introduced what is now called the Calderón–Zygmund theory, including the Calderón–Zygmund decomposition lemma (a technique to split an integrable function into “good” and “bad” parts) which one source calls “one of the most influential results in mathematics.” This method is fundamental in modern analysis for proving inequalities. - The impact was that many classical conjectures in Fourier analysis were solved or progressed: e.g., the convergence almost everywhere of Fourier series for $L^2$ functions (proved by Carleson in 1966, building on these tools), or the development of Littlewood–Paley theory (a way to characterize function spaces via frequency projections). - Real-variable harmonic analysis became a robust, independent thread of analysis. Zygmund’s school produced many students (e.g. Elias Stein, who further advanced the field in the 60s–70s with Calderón–Zygmund operators, and wrote influential textbooks; and the so-called “Chicago school” of analysis). - The Calderón–Zygmund techniques also entered partial differential equations: for example, the solution of elliptic PDE via singular integrals (Calderón’s work on Cauchy integrals in complex analysis, and the Calderón–Zygmund theory gave fundamental estimates for fundamental solutions). - By 1975, harmonic analysis had expanded to include Fourier transform on locally compact groups (Weil’s work earlier, then references like Loomis 1953), maximal function techniques (Hardy-Littlewood maximal function is a key object in differentiation theory), and Harmonic analysis on homogeneous spaces (starting with Helgason, etc.). All these remained under “analysis.”

Complex Analysis in Several Variables: Another significant mid-century development: complex analysis extended to $\mathbb{C}^n$. Henri Cartan, Kiyoshi Oka, Donald Spencer, etc., studied spaces of holomorphic functions of several complex variables, encountering phenomena absent in one variable (like nontrivial domains of holomorphy, the $\bar\partial$-problem). Techniques from partial differential equations and topology entered – e.g., the use of sheaf theory (which Cartan and Oka used) or PDE methods (Hörmander’s $L^2$ estimates for $\bar\partial$, 1965). This cross-fertilization integrated analytic and algebraic methods and is a prime example of how “analysis” collaborated with other areas. Several complex variables, though arguably a field in itself, is definitely considered a part of analysis (often taught in advanced analysis courses).

Operator Theory and $C^*$-Algebras: - John von Neumann had earlier (1930s) founded the study of operator algebras (von Neumann algebras) motivated by quantum physics. In 1940s–50s, Israel Gelfand in the USSR introduced $C^*$-algebras (norm-closed -algebras of operators) and showed they can be understood via maximal ideal spaces (a very analysis-meets-algebra viewpoint: using Banach algebra techniques and topology). - Functional calculus for normal operators, spectral theorem improvements, etc., were established. This work connected analysis with topology (for instance, Gelfand-Naimark theorem states every commutative $C^$-algebra is isomorphic to continuous functions on some space – linking to topology). - By 1970s, James Glimm and Elliott Lieb (and others) used operator algebra techniques in analysis and physics (like quantum statistical mechanics). Also Alain Connes (Fields 1982) began work on noncommutative geometry, which is essentially analysis (operator algebras) in service of geometric intuition. - The notion of “analysis” here clearly stretches: a functional analyst working on Banach spaces or operator algebras is considered an analyst, even though their work might look abstract and algebraic. For example, Banach’s work was called “functional analysis” but a lot of his results (like Banach fixed-point theorem, Hahn-Banach theorem) are staple results in analysis courses today.

Microlocal Analysis (1960s–70s): - This is a subfield that combines Fourier analysis with geometry to study PDEs. Lars Hörmander is a central figure: his four-volume Analysis of Linear Partial Differential Operators (published 1983 but work done earlier) and many papers introduced tools like Fourier integral operators and wave front sets (concept to track singularities of distributions through transformations). Microlocal analysis, initiated by Hörmander and Maslov, Sato, Kohn, Nirenberg, etc., formalized what physicists did in WKB approximation (a method of approximating PDE solutions with highly oscillatory integrals). - Hörmander’s work (Fields Medal 1962 for earlier contributions to pseudo-differential operators) epitomizes modern analysis: heavy use of distribution theory, functional analysis, and classical symplectic geometry – all integrated. For example, solving a PDE by analyzing the propagation of singularities of its solution via the characteristic variety is a concept from microlocal analysis. It is indispensable in the study of hyperbolic PDEs and in scattering theory. - Boutet de Monvel, Duistermaat, Gilkey, and others extended these ideas. They allowed analysts to tackle problems like the Atiyah-Singer Index Theorem (1963) with analytical tools (Atiyah-Bott used heat kernel methods – essentially analysis – in 1960s to prove index theorem analytically, whereas original proof used topology). - Microlocal analysis is a good example of how analysis kept expanding to meet new challenges – merging with geometry. Terms like “global analysis” and “microlocal analysis” became common by 1970s, indicating analysis done on manifolds globally or in phase space locally.

Harmonic Analysis meets Number Theory: - Mid-century also saw classical analytic number theory advanced by analytic tools: Atle Selberg and others developing trace formula, Hecke operators connections – these use harmonic analysis on groups (Selberg used analysis on hyperbolic surfaces, which are non-Euclidean but you can apply Fourier analysis via eigenfunctions on the surface). - Eventually, the Langlands program (from 1967 onward) explicitly brought harmonic analysis of reductive groups into number theory – a highly sophisticated synergy. So analysis was penetrating deep number theory at a structural level (beyond just primes and Riemann zeta, to automorphic forms and $L$-functions).

Pedagogy and Texts (1950–75): - Textbooks reflected these developments: Walter Rudin’s Principles of Mathematical Analysis (1953) became a standard for rigorous undergrad analysis (covering basics plus metric spaces). His Real and Complex Analysis (1966) included measure theory, integration, and analytic function theory – something unthinkable as one volume a century prior. - Elias Stein authored with Guido Weiss Fourier Analysis on Euclidean Spaces (1971) – giving a comprehensive modern view of Calderón-Zygmund theory and more, effectively canonizing “modern harmonic analysis” as a core graduate subject. - Courses diversified: separate courses in “Functional Analysis”, “Harmonic Analysis”, “Several Complex Variables”, etc., popped up in curricula at research universities. - But the basic sequence of Real Analysis (measure/integration), Complex Analysis, and Functional Analysis by grad school became standard, reflecting the pillars that analysis now stood on.

Institutional changes: - At some universities, analysis groups split into “pure analysis” vs “applied analysis” or “real/functional” vs “complex/harmonic”, etc., but generally they interacted. - Societies: the London Math Society had for example an “Analytical Section”, and the AMS had lots of sessions on analysis in meetings. New conferences focusing on subareas (like the Baton Rouge Conference 1961 on Harmonic Analysis) indicated the vibrant specialization.

By 1975, analysis was truly many-faceted. One could scarcely find a single person expert in all of it. Yet, all these specialists – whether working on abstract Banach lattices or computing Fourier transforms for automated tomography – shared a common lineage of concepts (limit, continuity, expansion, approximation). In many ways, analysis had become “the calculus of the modern age,” where distribution theory and functional analysis played the roles that integrals and differential calculus did in the 18th century.

The next era (1975–2000 and beyond) would see analysis tackling nonlinear problems (chaos, dynamical systems, Navier-Stokes turbulence open problems), interacting with computer science and discrete math (wavelets, computational harmonic analysis, randomness in algorithms), and intersecting more with geometry (minimal surfaces, geometric measure theory, etc.), as well as feeding into the rise of new application domains like signal processing, image analysis, and data science. The semantic stretch of “analysis” thus continues into high-dimensional and applied contexts, which we will now explore.

Nonlinear Dynamics, Computational Turn, and High-Dimensional Analysis (1975–2000) Link to heading

In the final quarter of the 20th century, analysis confronted the challenges of nonlinear phenomena, the explosion of computing power, and the demands of high-dimensional data and systems. These years saw the rise of chaos theory and dynamical systems as mainstream (with analysis at their core), the development of wavelet theory and other new harmonic analysis tools particularly suited for computation and digital signal processing, and a blending of analysis with other fields like combinatorics, number theory, and computer science. We also see analysis techniques pervade emerging fields in probability (like stochastic differential equations for finance) and statistics/data (Fourier methods in machine learning). This chapter surveys how “analysis” adapted and expanded from roughly 1975 to 2000.

Nonlinear Dynamics and Chaos: - Although foundational work in dynamical systems was older (Poincaré, 1890s; Kolmogorov-Arnold-Moser (KAM) theorem ~1960), it was in the 1970s–80s that chaos theory captured broad attention. The Feigenbaum constants (1975) in period-doubling routes to chaos, the Smale horseshoe (1960s) – these results used analysis (e.g., functional iteration, fractal dimensions – an analytic concept albeit in a discrete system). - Entropy in dynamical systems (Kolmogorov-Sinai entropy, 1959) and ergodic theory blossomed. Notably, Ya. Sinai, D. Ruelle, R. Bowen applied analysis to chaotic systems, establishing results like measure-predictability (SRB measures). - Many of these studies relied on heavy analysis: solving functional equations (Feigenbaum’s constant comes from a functional renormalization equation), using spectral theory of transfer operators (Perron-Frobenius operators) – these are integral operators requiring functional analysis. So even chaos theory, often presented graphically with logistic maps, is underpinned by serious analysis. - The term “analysis” might not appear in public descriptions of chaos, but in math departments these developments often fell under ergodic theory or analysis seminars.

Wavelets and Time-Frequency Analysis: - In the 1980s, wavelet transforms emerged as an alternative to Fourier transforms for analyzing signals, especially nonstationary ones. Pioneers included Jean Morlet (a geophysicist), Alex Grossmann, Ingrid Daubechies (who gave the first orthonormal wavelet basis with nice properties, 1988), Stéphane Mallat, and Yves Meyer. - Wavelet theory is fundamentally analysis: constructing function bases in $L^2(\mathbb{R})$ that are localized in both time and frequency. It drew on Calderón’s earlier work (the Calderón reproducing formula is an integral part of continuous wavelet theory), and on ideas from multiresolution analysis (Mallat’s work connected wavelets to filter banks in signal processing). - By late 90s, wavelets were standard in both pure and applied contexts: they solved problems in harmonic analysis (Meyer used them to tackle certain singular integrals), and found use in JPEG image compression (the JPEG2000 standard is wavelet-based). - The rise of wavelets demonstrated analysis responding to computational needs: it’s an example where engineers and physicists developed something and mathematicians (mostly analysts) quickly provided rigorous foundations and expansions.

Discrete and Computational Harmonic Analysis: - Following wavelets, the 90s saw generalizations: frames and Gabor analysis (time-frequency localized functions), etc. Compressed sensing concept (late 2000s, a bit beyond our timeframe) emerged from harmonic analysis and optimization blending: e.g., Emmanuel Candès and Terence Tao’s work uses heavy analysis (uncertainty principles, $\ell^1$ minimization). - Fourier analysis on graphs and groups also began merging with computer science (by 2000, spectral graph theory was a key part of theoretical CS – solving problems via eigenvalues of matrices, which is linear analysis). - At the same time, classical numerical analysis matured: For solving PDEs, finite element methods (Courant’s idea from 1943, but mathematically developed in 60s-70s by analysts like Ciarlini, Lions) became rigorous with error estimates. The finite element method’s analysis required Sobolev spaces and interpolation inequalities – again analysis at core. - Iterative methods for linear systems (CG method by Hestenes 1952, but analysis of convergence, preconditioning etc. blossomed by 80s). - Pseudo-spectral methods (using FFT on PDEs) also were popularized, connecting to Fourier analysis intimately. - All these solidified a subfield often called “numerical analysis” distinct from “analysis”, but the two had large overlap in personnel and techniques. The analysis community produced specialists in numerical analysis who used functional analysis, approximation theory, etc.

Analysis in High-Dimensional Phenomena: - Geometric Measure Theory: The work of Herbert Federer and Wendell Fleming (1960s) and later F. Almgren and others on minimal surfaces, varifolds, etc., extended analysis (calculus of variations) into highly geometric contexts. The famous solution of the Plateau problem (existence of minimal surface spanning a contour) by Douglas and Rado in 1930s was earlier, but regularity results were much later. This field requires intricate measure-theoretic analysis. By 1975, Ennio De Giorgi and John Nash had independently solved regularity of minimal surfaces and elliptic PDEs (De Giorgi, 1957, solved a conjecture on regularity of solutions to certain elliptic equations – an analysis triumph). - Optimal Transport: In 1781 Monge posed the transport problem, but it got modern analytic formulation by Leonid Kantorovich (Nobel prize-winning economist mathematician) in 1942 via linear programming. In the late 90s, Cédric Villani and others revived it analytically, linking to PDE (the Monge-Ampère equation) and Riemannian geometry. By 2000, optimal transport was an analysis field bridging probability and geometry – culminating in Villani’s Fields Medal 2010. - Combinatorial and Extremal Problems: Analysts like Szemerédi used ergodic theory (Furstenberg’s proof of Szemerédi’s theorem, 1977) to solve problems in arithmetic combinatorics. Also Elekes and others used incidence geometry with analysis for combinatorial geometry results. There's also Additive combinatorics where Fourier analysis on groups is key (e.g., Roth’s theorem on 3-term arithmetic progressions used Fourier analysis on $\mathbb{Z}/N\mathbb{Z}$). So analysis was instrumental in discrete settings as well.

Broadening Application Domains: - Control Theory (starting 1950s with Pontryagin’s maximum principle) – essentially an analysis optimization problem for ODEs. - Signal Processing and Communications – Shannon’s information theory (1948) wasn’t directly analysis but used Fourier transforms; error-correcting codes were more algebraic, but analog communications and filter design were Fourier (analysis) heavy. - Quantitative Finance – the Black-Scholes PDE (1973) and stochastic calculus (Itô’s 1944 work) meant that by 1980s, an “analyst” might well be working in a bank applying stochastic differential equations. - Computer Graphics and Vision – using Fourier/wavelet analysis for image compression, solving Laplace’s equations for lighting (radiosity method in CG), etc., analysis found new roles in computer science realms.

Cultural and Institutional Aspects (1975–2000): - Bourbaki’s decline: By 1980s Bourbaki’s influence waned; a more applied and example-driven style in textbooks emerged (e.g., Stein & Shakarchi’s four-volume analysis series in early 2000s aims to balance pure/applied). - Interdisciplinary institutes: Places like Institute for Advanced Study and MSRI (Math Sciences Research Institute) often ran programs in analysis topics (like harmonic analysis, complex geometry, etc.), showing it remained a hotbed of research. - The classification of mathematics research in MSC (Mathematics Subject Classification) lists dozens of categories under analysis. For example, 42 (Fourier analysis), 43 (abstract harmonic analysis), 47 (operator theory), 49 (calc of variations), 53 (global analysis on manifolds), 60 (probability) which was historically under analysis category, etc. This underscores that “analysis” was not one thing but a constellation of subfields.

By 2000, analysis was everywhere – in pure math, it tackled problems in geometry, number theory, topology, logic (e.g., continuous model theory); in applied math, it underpinned differential equations solving the millennium’s big unsolved PDEs (Navier-Stokes existence, for instance, is an analysis question), and in the world of applications, it was the engine behind signal and image technology, quantitative finance, and more.

The semantics of “analysis” thus became extremely context-dependent. To a pure mathematician, “analysis” might evoke abstract functional analysis or deep PDE theory, while to an engineer it might mean practical Fourier or wavelet analysis of data. Educationally, a “Real Analysis” course remained focused on measure, integration, etc., but an “Analysis” seminar at a university could be on anything from analytic number theory to nonlinear wave equations.

One unifying thread still remained, echoing Sonar’s description: analysis is about understanding continuum and infinity – whether it’s the continuum of real numbers, the infinite dimensional function spaces, or continuous symmetries. Even as analysis methods extended to discrete contexts, they often did so by embedding discrete problems into continuous ones (like using integration or spectral methods on graphs). Thus the spirit of analysis – breaking problems into pieces, taking limits, and using the power of continuity – persisted as its semantic core.

2000–Present: Analysis in the Information Age (High-Dimensional Data, Geometry, and Randomness) Link to heading

Entering the 21st century, analysis continues to evolve, meeting the demands of a world awash in data and computational complexity. In the last two decades, there has been a surge of high-dimensional analysis techniques (for data science, machine learning), a fruitful merging of geometric and analytic methods (optimal transport, compressed sensing), and analysis has penetrated into theoretical computer science (randomized algorithms, learning theory) and quantum physics (analysis on large networks, random matrices). Meanwhile, classical analysis remains vibrant, solving long-standing open problems and branching into new questions (like Navier-Stokes regularity, the Riemann Hypothesis approaches via analysis, etc.). In this final chapter, we sketch how “analysis” from 2000 to the mid-2020s has been defined and used, highlighting continuity with its long tradition and new twists that further stretch its meaning.

Geometric Analysis and Optimal Transport: - The term “geometric analysis” often refers to the use of analytical tools to solve geometric problems. A prime example is Perelman’s proof of the Poincaré Conjecture (2003) using Ricci flow – a PDE approach where one shows a 3-manifold can be deformed (analytically) to a round sphere. Perelman’s work built on Richard Hamilton’s parabolic PDE theory from the 1980s. This was a triumph of analysis in pure geometry. - Minimal surfaces and geometric measure theory solved longstanding problems: e.g., the Willmore conjecture (2014) about minimal tori, solved via variational calculus and Fourier analysis on loop spaces. - Optimal transport theory blossomed (Villani, Fields 2010): it brought a powerful viewpoint to geometry (leading to new proofs of geometric inequalities using transport), and found applications in economics and machine learning (computing optimal couplings between datasets, etc.). It is highly analytical, involving PDE (Monge-Ampère and continuity equations), convex analysis, and probability measures. - Harmonic maps and mean curvature flow are other examples where analysis tackles geometric evolution; big progress was made in understanding singularities in these flows, relying on PDE and functional analysis techniques.

Analysis in High-Dimensional Data and Machine Learning: - Machine learning algorithms like neural networks are essentially compositions of simple analytic functions. Understanding their behavior leads to analytic questions (e.g., how does a loss function’s landscape look? can we prove convergence of gradient descent?). This spawned a field sometimes called “mathematical data science” where one often uses analysis (Fourier, optimization, probability) to understand high-dimensional phenomena like concentration of measure (Levina, Vershynin’s works etc.). - Compressed sensing (Donoho, Candès, Tao ~2004-2006): Provided that a signal has a sparse representation, one can recover it from surprisingly few measurements by solving an $\ell^1$-minimization (convex analysis problem). This result combined probability (random matrices), functional analysis (in $\ell^n$ spaces), and geometry (polytope geometry). It’s a showcase of modern analysis feeding directly into engineering (sensing, MRI reconstruction). - Random matrix theory and high-dimensional probability matured, finding use in statistics, quantum physics, etc. Wigner’s semicircle law (1950s) and later results like Voiculescu’s free probability (1990s) all are parts of analysis (functional analysis in operator algebras or probability theory). Terence Tao (Fields 2006) worked in this area as well as many others (from combinatorics to PDEs), embodying the breadth of modern analysis. - Graphical models and Markov chains on large state spaces: mixing times analysis uses Fourier analysis on groups or coupling arguments – these too are analytical at heart, albeit in a discrete context.

Interplay with Number Theory and Algebra: - The proof of Fermat’s Last Theorem (1994, Wiles) was mostly algebraic geometry, but subsequent proofs and generalizations (Modularity conjecture, etc.) occasionally used analytic input (like L-functions). The role of analysis in number theory persists: e.g., the Green-Tao theorem (primes contain arbitrarily long arithmetic progressions, 2004) used ergodic theory and Fourier analysis on finite groups, continuing the tradition of Hardy-Littlewood and Furstenberg. - Langlands Program: relates Galois representations and automorphic forms. Automorphic forms are highly analytic objects (solutions of certain PDEs on symmetric spaces, eigenfunctions of Laplacian on arithmetic manifolds). Langlands duality is partly established via trace formula (an analytic tool). This area sees deep collaboration of analysis and algebra.

Quantum and Applied Fields: - Quantum field theory (QFT) still lacks a complete analytic foundation (hence one of Clay’s Millennium Problems: Yang-Mills existence). But progress is made using analysis: constructive QFT in lower dimensions requires sobolev estimates and renormalization group (which is an analytic iterative scheme). - Quantum computing and information have spurred study of large unitary matrices and entropies – analysis helps characterize capabilities and limits of algorithms. - In biology and other complex systems, dynamical systems analysis (nonlinear ODE/PDE) is a staple to model and understand patterns (like Turing’s reaction-diffusion models for pattern formation, now studied with nonlinear analysis and PDE bifurcation theory).

Rhetoric – Rigor vs. Experiment: In an era of big data and computer experiment, one might ask: has the balance shifted away from rigor? To some extent, fields like AI often proceed heuristically. However, in response, there’s been a push for “explainable AI” and theoretical understanding – often turning to analysis (e.g., approximation theory to understand neural networks as function approximators, which connects to classical analysis results like universal approximation theorems, essentially an analysis question in $L^p$ spaces). So the rigor vs intuition debate continues: analysis represents the rigorous side ensuring that what we infer from machines is on solid ground.

Current Education and Identity: - Undergraduate: “Analysis” courses remain foundational (limits, metric spaces, etc.). Some curricula incorporate computational aspects (e.g. Fourier analysis in a class that might be cross-listed with signal processing). - Graduate: an analysis qualifier is required almost everywhere; it usually covers real and complex analysis core topics (Lebesgue integration, Banach/Hilbert space basics, analytic functions, etc.). Then specialized electives branch to PDE, functional analysis, harmonic analysis, etc. - Researchers identify with finer labels: e.g., “I’m a harmonic analyst focusing on PDE” or “I’m an algebraic geometer using analytic methods.” But departmental divisions rarely separate analysis further – usually one big “Analysis/PDE” group exists, since cross-talk is rich.

Public Perception: - In colloquial scientific press, you don’t hear “analyst solves X” as much as “mathematician solves X,” but within math, someone might be known as an “analyst” meaning they solve problems using analysis techniques as opposed to combinatorics or algebraic ones. - The term “analytic solution” still means a solution in closed-form expressions (contrasted with numeric simulation), which traces back to the old analysis vs synthesis idea of solving by algebra/calculus manipulation.

Summarizing, in the 21st century analysis is as much a methodology as a field: a vast collection of tools and viewpoints applicable across mathematics and science. It ranges from the very abstract (like category-theoretic analysis in derived categories – even integration cohomology theories) to the extremely applied (like tuning an FFT algorithm or analyzing an algorithm’s error). The boundaries between “analysis” and “other fields” have become productively blurred: analysis infiltrates combinatorics (through entropy and spectral methods), algebra (through analytic number theory and functional identities), geometry (through flows and metric measure spaces), and computer science (through continuous optimization and learning theory).

Yet at its heart, analysis remains the mathematics of the continuous – of limits, of smooth change, of breaking apart and reassembling – an ever-adapting foundation that, since Newton’s time, has been the language of mathematical science. Its semantics have broadened unimaginably from Newton’s “infinite series solving all problems”, but each new generation’s expansion – rigor, generality, abstraction, application – built on that core idea. Analysis today is not one field but many, yet all share the DNA of those ancestral ideas: analysis as the pursuit of solutions “by infinite equations” with ultimate rigor and generality.

  • Analysis (Ancient & Early Modern Sense): A method of problem-solving or discovery. Analysis meant starting from the desired conclusion and working backward to known truths. In geometry, it was opposed to synthesis (the forward, formal proof). This sense influenced Newton and Leibniz, who spoke of analysis as the “method of invention,” not necessarily as a specific subject.

  • Analysis (Modern Sense / Mathematical Analysis): The branch of pure mathematics dealing with limits and continuity, differentiation and integration, infinite series, and related constructions. Essentially synonymous with theory of calculus and its extensions. By the 19th century, “analysis” meant the rigorous study of real and complex-valued functions, sequences, series, and the structures built from them (functions spaces, integrals, etc.).

  • Infinitesimal Analysis: 18th-century term for calculus using infinitesimals. Leibniz and Euler often said “analysis of the infinitely small” to mean differential calculus, and “analysis of the infinitely great” for summing series to infinity. This contrasts with finite algebra or synthetic methods. After the 19th-century rigorization (epsilons and deltas replacing infinitesimals), the phrase fell out of use in mainstream texts, but the concept lives on via nonstandard analysis (which reintroduces infinitesimals rigorously).

  • Mathematical Analysis vs. Analytical Chemistry/Philosophy: Non-math fields use “analysis” to mean breaking things into components (chemical analysis, or logical analysis in philosophy). In the 19th century, one had to say “mathematical analysis” to distinguish. E.g., Whewell in 1830s referred to “the calculus, or mathematical analysis” to clarify.

  • Analytical Method: Often used historically to mean an algebraic or calculus-based approach (as opposed to a geometric one). For instance, Lagrange’s analytical mechanics means he used algebraic equations (energy, etc.) rather than geometric diagrams or synthetic proofs. Similarly, “analytic solution” today implies a closed-form solution via known functions/operations, not numerical approximation.

  • Analysis vs. Algebra vs. Geometry: Broad divisions of mathematics formalized in the 19th–20th centuries. Analysis concerns the continuum (real numbers, functions, limits). Algebra concerns discrete or symbolic structures (equations, groups, rings). Geometry concerns shapes, space, and their properties. However, these fields overlap heavily (e.g., analytic geometry uses algebra in geometry; algebraic geometry uses both; geometric analysis applies analysis to solve geometric problems). Over time, new fields like topology or combinatorics didn’t fit neatly into the old triple division, but the trio remains a common mental model.

  • Analytic Geometry: Introduced by Descartes (1637), meaning geometry done via algebraic equations (coordinates). “Analytic” here highlights use of the analytical (algebraic) method. Not to be confused with analytic function in complex analysis. By the 19th century, “analytic geometry” was standard in education (coordinate geometry). Today, analytic geometry is just part of algebra or geometry courses.

  • Analytic Function (complex analysis): A function that is complex-differentiable in a neighborhood of each point in its domain; equivalent to being representable by a convergent power series (holomorphic). The term was solidified by Cauchy/Weierstrass. Historically, “analytic function” sometimes meant any function given by an analytic expression (formula) or power series, even in real context (e.g., Fourier called some functions “analytic” if they had a convergent expansion). But modern usage restricts it to complex differentiability.

  • Analysis Situs: An old term (from Leibniz) for the “analysis of position,” essentially an early idea of topology. Poincaré around 1890 used it to mean what we call topology now. It fell out of use as “topology” became the term, but you’ll find 19th-century references classifying “analysis situs” separately from analysis proper.

  • Real Analysis: A subfield of analysis focusing on real numbers and real-valued functions – includes sequences, series, continuity, differentiation, Riemann/Lebesgue integration, measure theory, etc., on $\mathbb{R}$ (and $\mathbb{R}^n$). The term gained currency once complex analysis (theory of functions of a complex variable) distinguished itself. E.g., one takes “real analysis” and “complex analysis” as two core grad courses. Real analysis often encompasses measure and Lebesgue integration as well.

  • Complex Analysis (Theory of Functions): Analysis dealing with complex numbers and complex-valued functions. In 19th century, commonly called “Theory of Functions” (German: Funktionentheorie) without always specifying complex, since real-variable function theory was often just called “analysis.” Today, complex analysis refers specifically to holomorphic function theory.

  • Functional Analysis: Born in early 20th century – analysis of infinite-dimensional vector spaces and linear operators. Essentially, treating functions as points in spaces like $L^p$, $\ell^2$, $C(X)$, etc., and studying properties like completeness, compactness, spectra of operators. Name from considering “functionals” (linear functionals on function spaces) by Hadamard, Riesz, etc. It’s a major branch of analysis now.

  • Harmonic Analysis: Historically means Fourier analysis and generalizations. The term emphasizes the decomposition of functions into basic waves (“harmonics”). Originally about representing periodic functions by Fourier series, it now includes Fourier transforms on groups, singular integrals, and time-frequency analysis (wavelets). It’s a core area within analysis.

  • Numerical Analysis: The study of algorithms for approximately solving analytical problems (integration, differential equations, linear systems, etc.) with attention to error, stability, and efficiency. Though “analysis” is in the name, it’s considered a part of applied mathematics – but fundamentally uses analytical understanding (e.g., error analysis uses Taylor series, stability uses spectral radius etc.). It separated as a discipline post WWII with computing, but remains grounded in classical analysis.

  • Analyst (profession): Someone who specializes in mathematical analysis. Depending on context, could mean a pure mathematician in analysis or, outside academia, could confuse with data analyst or financial analyst (non-math usage). In a math department circa 1920, an “analyst” vs “algebraist” division would be clear.

  • Analytic vs. Synthetic (philosophy of math): Also ties into the debate on how we gain mathematical knowledge. Kant had said elementary geometry is synthetic a priori. 19th-century logicists like Frege tried to show arithmetic is analytic (in philosophical sense of truth by definitions). This is tangentially related to math: e.g., analytic propositions vs synthetic in logic – but not to be confused with mathematical analysis.

  • Analytic Solution / Closed-Form Solution: In applied contexts, an “analytic solution” means an exact expression in terms of elementary or well-understood functions, as opposed to a numerical approximation. The irony: one might need a lot of analysis to find an “analytic solution.” This use harks back to “analytic expression” meaning made of algebraic operations, exponentials, logs, etc. So here “analytic” is more like “symbolic.”

  • Analytic Number Theory: Branch of number theory that uses analysis (complex analysis, Fourier, etc.) to study integers (distribution of primes, etc.). Began mid-19th century (Dirichlet, Riemann). Distinguished from algebraic number theory which uses field theory, etc. For example, the Prime Number Theorem (1896) is an analytic NT result (proved via complex analysis of zeta function).

  • Analytic vs. Meromorphic vs. Entire: Terms in complex analysis – analytic means holomorphic on an open set; entire means analytic on all of $\mathbb{C}$; meromorphic means analytic except poles. Historically, Weierstrass called meromorphic functions “uniform analytic functions with only pole singularities.”

  • Non-Analytic (function): Could mean a function that isn’t given by a convergent power series (e.g., $e^{-1/x^2}$ at 0 is $C^\infty$ but not analytic at 0, so it’s a smooth but non-analytic function). Or simply not analytic in sense of not holomorphic or not expressible in closed form.

  • Bourbaki’s use of Analysis: Bourbaki’s Éléments series avoided the word in titles: they had “Functions of One Real Variable,” “Topological Vector Spaces,” “Integration,” etc. But collectively, those correspond to classical analysis topics. Bourbaki tended to use “analysis” informally to encompass calculus and differential equations, etc., but emphasized structure over classical problem-solving.

  • Analysis in Chemistry: Mentioning to avoid confusion – chemical analysis is determining chemical composition (qualitative or quantitative analysis), no relation to math analysis except metaphor of breaking into parts.

  • Analyst in archaic sense: Sometimes 18th-century writers used “analyst” simply to mean mathematician, especially one using calculus. e.g., Berkeley’s “The Analyst” (1734) was addressed to “an Analyst” meaning a proponent of calculus (in fact, Newton or his followers).

Prosopography: Key Figures in Analysis and How They Shaped its Meaning Link to heading

(Below we profile selected mathematicians, linking their careers to shifts in the concept of “analysis.”)

  • Isaac Newton (1642–1727): Co-inventor of calculus (fluxions). Early in his career, Newton championed analysis as infinite series and algebraic methods to solve problems – “the new analysis”. Later, he advocated geometric synthesis, but his legacy in analysis is profound: his Method of Fluxions and infinite series expansions expanded the domain of analysis to calculus of motion. Newton’s work exemplified analysis as power over problems – using infinite processes (series, fluxions) to “reach all problems”. He also inadvertently started the rigor quest: critics like Berkeley targeted Newton’s analytical foundations, prompting later analysts to firm them up.

  • Gottfried Wilhelm Leibniz (1646–1716): Co-inventor of calculus (differentials). Leibniz popularized the term analysis in math – for him it encompassed the blind symbolic calculation that frees the mind. He introduced notation ($\frac{dy}{dx}$) that became universal in analysis. Leibniz also dreamed of a generalized analysis situs (topology) and a characteristica universalis, reflecting a broad view of analysis as universal reasoning method. He founded academies and fostered a community (Bernoullis, Euler down the line) that made analysis mainstream in Europe. His work reinforced analysis as an algebraic, symbol-driven approach distinct from geometric styles.

  • Leonhard Euler (1707–1783): The most prolific analyst of the 18th century. Euler’s Introductio in analysin infinitorum (1748) essentially defined analysis as the study of functions and infinite processes. He treated trigonometric and exponential functions analytically and laid many foundations (introduced the notion of function, solved countless series and integrals). Euler blurred lines between discrete and continuous (using series for number theory, etc.). He had a somewhat formal approach (ingenious manipulations, sometimes nonrigorous), which later analysts had to justify. Euler’s career established analysis as the principal tool of mathematical physics and significantly expanded its repertoire (beta, gamma functions, etc.). He trained or influenced virtually all of Europe (through correspondence and students), making analysis the lingua franca of educated mathematicians.

  • Joseph-Louis Lagrange (1736–1813): A leading analytic mechanician. In his Mécanique Analytique (1788), Lagrange declared no diagrams needed, only algebraic equations – signaling analysis (calculus, series) had superseded geometric reasoning in mechanics. Lagrange also attempted to rid calculus of infinitesimals by founding it on series expansions (his Théorie des fonctions analytiques, 1797). Though that attempt failed (later supplanted by epsilon-delta), it shows how he saw analysis as algebra made infinite. Lagrange chaired mathematics in Paris and influenced the curriculum at École Polytechnique, ensuring analysis (in rigorous form) was taught to every future engineer and mathematician in France, thereby institutionalizing the subject.

  • Augustin-Louis Cauchy (1789–1857): The key reformer of rigor. As professor at École Polytechnique, he wrote Cours d’Analyse (1821) to give calculus a firm foundation[1]. Cauchy introduced precise definitions (limit, continuity, convergence) and proofs in analysis, effectively transforming it into a theorem-proof discipline on par with geometry in rigor. He taught generations (more indirectly, as he was eventually exiled for Royalist politics, but his books were widely used). Cauchy also expanded complex analysis (residue theorem). Owing to Cauchy, analysis by mid-19th century meant something more restrictive but also more respected: no longer formal symbol pushing, but a careful logical edifice. Many of his students and followers (e.g., Joseph Liouville) continued this rigor drive and problem-solving in analysis (Liouville founded a journal focusing on analysis).

  • Niels Henrik Abel (1802–1829): A short-lived but influential Norwegian analyst. Abel’s insistence on rigorous reasoning in infinite series (he proved binomial series convergence, studied the radius of convergence concept) influenced the direction of analysis towards rigor. His famous quote on divergent series (“the devil’s invention” – though context suggests a bit tongue-in-cheek) symbolized the caution analysts adopted. Abel solved the long-standing problem of the insolubility of the general quintic, using analysis and algebra. He and Carl Gustav Jacob Jacobi pioneered elliptic function theory (inverting elliptic integrals), broadening analysis into a new class of complex functions. Abel’s work and tragic early death spurred colleagues (like Cauchy, who published some of Abel’s results posthumously) to further pursue the rigorous development of power series and complex function theory.

  • Karl Weierstrass (1815–1897): Often called “the father of modern (rigorous) analysis.” Weierstrass systematically eliminated intuition from analysis: he formalized $\epsilon$-$\delta$ definitions across the board (though Cauchy had limits, Weierstrass gave precise definitions for concepts like uniform convergence, upper and lower limits, etc.). He constructed pathological examples (like nowhere-differentiable continuous functions) to show why rigor is necessary. As a professor in Berlin, his lectures trained a generation of analysts (including Sonya Kovalevskaya, H. A. Schwarz, Georg Cantor initially in number theory, etc.). Weierstrass’s insistence that analysis be built on arithmetic of real numbers led to the arithmetization program (his students’ works, e.g., Heine’s on uniform continuity, and influence on Dedekind, Cantor). He also contributed significantly to complex analysis (Weierstrass factorization theorem, theory of elliptic and Abelian functions) and the theory of differential equations (via power series). By his retirement, the image of an analyst was one of someone extremely precise and perhaps a bit abstract – thanks largely to Weierstrass.

  • Bernhard Riemann (1826–1866): Though he died young, Riemann’s contributions spanned analysis, geometry, and number theory, often blurring their boundaries. In analysis: he defined the Riemann integral (1854), introduced Riemann surfaces uniting multi-valued complex functions with topology, solved the Fourier series convergence problem for many cases, and posed the Riemann Hypothesis linking analysis and number theory. Riemann’s approach was often intuitive/geometric (like using Dirichlet’s principle in complex analysis), which left gaps that others (Weierstrass) filled with rigor. But Riemann greatly expanded the scope of analysis: he made complex analysis geometric, analysis situs (topology) analytic, and the study of $\zeta(s)$ (the Riemann zeta function) created a new direction in analytic number theory. His Habilitationsvortrag (1854) on the foundations of geometry, while mostly in differential geometry, was also a blueprint for what later became global analysis (analysis on manifolds). Riemann thus pulled analysis into new territory and inspired later blending of analysis with geometry and physics.

  • Georg Cantor (1845–1918): Creator of set theory, initially motivated by questions in Fourier analysis. Cantor’s work on point sets (1870s) – introducing countability, uncountability, derived sets, and the Cantor ternary set – fundamentally altered the logical framework of analysis. By showing there are different sizes of infinity and that real numbers were uncountable, he clarified the continuum’s nature. Cantor’s results like the Cantor set (nowhere dense, measure zero, uncountable) provided extreme examples for analysts, e.g., a set of points where a function could be continuous or differentiable or not, affecting integration theory and function theory. While Cantor’s set theory became its own branch of math (with contentious reception like Kronecker’s opposition), it was quickly integrated into analysis (via Borel, Lebesgue, etc.). Cantor himself was an “analyst” early in career (Fourier series papers). His legacy ensured that by 1900, any profound analyst also had to be something of a set theorist – the language of sets, ordinals, etc., became part of analysis foundations.

  • Émile Borel (1871–1956), René Baire (1874–1932), Henri Lebesgue (1875–1941): A trio of French analysts who built measure theory and topology out of analysis problems. Borel introduced countable additivity and “Borel sets” (1890s) in context of probability and measure. Baire classified functions by points of continuity (Baire classes) and came up with the Baire Category Theorem (1899) – an important tool in analysis (showing, e.g., “most” continuous functions are nowhere differentiable in the sense of category). Lebesgue crowned this development with his 1902 integral, solving the problem of integrating highly discontinuous functions and interchanging limits and integrals rigorously. They were all students or influenced by Henri Poincaré and others at Paris, and they collectively professionalized French analysis in the 20th century. Lebesgue’s students (like Fréchet) carried it on. These men redefined analysis to include measure and category – beyond sequences and series, now properties of sets themselves were part of analysis. Their work underpins modern real analysis education.

  • David Hilbert (1862–1943): A giant who contributed to many areas, Hilbert influenced analysis through his work on integral equations (origin of functional analysis), calculus of variations, and through his famous 23 problems (1900) which included several analysis problems (continuum hypothesis – though set-theoretic, it’s about the continuum; Riemann Hypothesis; equivalence of different definitions of integrals; axiomatization of physics which led to distribution theory, etc.). Hilbert’s work on Hilbert spaces (concept not named thus until later) around 1906 established a new paradigm: infinite-dimensional analysis with inner products, which later was crucial for quantum mechanics. As a teacher at Göttingen, he guided many in analysis (such as Schmidt, Weyl in early spectral theory, and Banach and others indirectly). Hilbert’s axiomatic method also inspired Nicolas Bourbaki in the 1930s, which in turn shaped how analysis was presented mid-century (structural, general). Hilbert blurred lines – e.g., using analysis to solve a number theory problem (the Waring problem via what we’d call analysis or combinatorics). He is emblematic of the idea that an analyst should be versatile and that analysis is central to solving problems across math.

  • Stefan Banach (1892–1945): A founder of modern functional analysis. Working in Lwów, Poland, Banach’s 1932 book defined normed linear spaces (Banach spaces)[2] and systematically studied linear operators on them. He proved key theorems (Hahn-Banach, Banach-Steinhaus uniform boundedness, Banach Fixed-Point which is a basis for metric space analysis). Banach’s school (including Hugo Steinhaus, Stanisław Mazur) applied functional analysis to solve concrete problems (Fourier series, integrals, etc.), showing the power of abstraction. Banach’s work solidified analysis as a structural field: no longer just about real or complex numbers but about abstract spaces of functions. He was also an applied analyst – his fixed-point theorem is fundamental in differential equation existence theory (Picard iterations). Banach’s life (tragically cut short in WWII) also highlights how analysis communities were affected by world events – Lwów’s famous “Scottish Café” where Banach’s group posed problems got disbanded due to war, dispersing analysts worldwide. Many problems from that café led to further research (e.g., the Banach-Tarski paradox proof).

  • Norbert Wiener (1894–1964): An American mathematician known for harmonic analysis and cybernetics. He proved the Wiener Tauberian Theorem (1932) in harmonic analysis, linking Fourier transforms and function approximation – a result with implications in number theory (density of primes) and signal processing. He also pioneered Brownian motion analysis and gave a rigorous construction of the Wiener measure (1923) on continuous paths. Wiener’s Cybernetics (1948) popularized the idea of feedback and control – essentially applied analysis of dynamical systems – influencing diverse fields. The Wiener filter (for signal noise reduction) was an early triumph of random signal analysis (1942). Wiener thus represents the broadening of analysis into engineering and biology – by mid-20th century, an analyst could well be solving problems in completely different domains under the same principles (Fourier analysis, stochastic processes). He also corresponded with and influenced complex analysts and even philosophers, bridging academic divides.

  • Laurent Schwartz (1915–2002): French mathematician who created distribution theory (generalized functions) around 1945. By providing rigorous meaning to objects like Dirac’s delta, he enabled the solution of differential equations that classical methods couldn’t handle. For this, he received a Fields Medal in 1950[3] – one of the first major recognitions of work squarely in analysis. Schwartz was also a prominent teacher (influencing Grothendieck among others) and wrote lucid texts that trained generations in modern analysis. His work changed the language of analysis: by 1960s, one talks about “tempered distributions,” “Fourier transforms of distributions,” etc., in solving PDEs. His contributions underscore analysis as the adaptable framework for new phenomena in math and physics (like quantum field singularities). He was also politically active, showing that many analysts of the 20th century were engaged citizens (analysis had grown so central that its practitioners often had broad influence).

  • Alexander Grothendieck (1928–2014): Though best known as an algebraic geometer, Grothendieck began in functional analysis (topological tensor products, nuclear spaces in the 1950s) making major contributions (his Thèse in 1953 solved problems in Schwartz’s nuclear spaces theory). He integrated categorical thinking into analysis. Later, his approach to algebraic geometry was very much influenced by analytic abstraction (cohomology theories had analytic analogues). In some sense, he represents the unity of math beyond fields: he used tools from analysis in algebraic contexts and vice versa. His early work extended analysis’ abstract side (Bourbaki style extreme: he even wrote a treatise “Topological Vector Spaces” as Bourbaki member). Grothendieck’s life also highlights the social side: he eventually left institutional math in part due to political convictions, showing an example of an analyst deeply concerned with society, perhaps echoing earlier analysts like Wiener or Schwartz.

  • Alan Turing (1912–1954) and John von Neumann (1903–1957): Represent a strand of analysts turned computing pioneers. Turing, though logician, solved PDEs related to biological morphogenesis (reaction-diffusion equations, 1952) – applying analysis to biology. Von Neumann, as discussed, brought functional analysis to quantum physics and spearheaded numerical analysis. They illustrate that by mid-20th century, analysis was foundational in the new field of computer science and physics models.

  • Terence Tao (1975– ): A contemporary figure who has made contributions across numerous analysis subfields: harmonic analysis (Tsao’s work on wavelet bases, restriction conjecture), PDE (Navier-Stokes partial results), combinatorics (Gowers norm and Szemerédi’s theorem proof with Gowers, Green-Tao primes in AP), even algebraic geometry (topic like expander graphs from linear groups). Tao’s breadth (and Fields Medal 2006) exemplifies the modern analyst who is not confined – he uses analysis as a toolkit to approach problems anywhere in math. Also, Tao’s public profile (blogging about current research, authoring problem-solving guides) has helped demystify analysis to broader audiences. He stands on the shoulders of those like Hardy and Littlewood (20th c. analysts who also ranged widely) but in an even more connected world of mathematics.

Each of these individuals (and many others omitted for brevity: e.g., Gauss, Hardy, Littlewood, Kolmogorov, Calderón, Zygmund, Stein, Lions, Fefferman, Villani, etc.) has left an imprint on what “analysis” means. Through them, we see analysis transform: from Newton’s personal tool, to Euler’s universal method, to Cauchy’s rigorous subject, to Hilbert’s structural foundation, to Schwartz’s generalized universe, and now to Tao’s cross-disciplinary playground. Their careers collectively chart the journey described in this report: how “analysis” grew from a technique to the central domain of modern mathematics, continually enriched and redefined by its practitioners.

Method Atlas: Key Analytical Techniques Across Eras Link to heading

This section provides an illustrated (conceptually, as text) guide to representative methods that have been emblematic of “analysis” in various periods. Each method is described in context, with a short example highlighting how it’s used and when it became prominent.

  • Infinite Series Expansion (17th–18th c.): Method: Represent functions or quantities as infinite sums of simpler terms (often powers). Signature use: Newton’s binomial series: $(1 + x)^{\alpha} = 1 + \alpha x + \frac{\alpha(\alpha - 1)}{2!}x^{2} + \cdots$. Historical impact: Allowed computation of approximate values (e.g. $\sin, \cos$ tables via series) and solving equations (Newton solved $e^x=2$ by series inversion). Example: Euler’s evaluation of $\sum_{n=1}^\infty \frac{1}{n^2}$ – by expanding $\sin x$ as power series and comparing coefficients, he found $\pi^2/6$. Evolution: Power series remain fundamental (analytic functions). Eventually, concerns about convergence led to precise radius of convergence tests (Abel, Cauchy).

  • Integral Calculus (quadrature) (17th–18th c.): Method: Finding areas under curves or solving differential equations via antiderivatives. Signature use: The Fundamental Theorem of Calculus: $\int_a^b f'(x)dx = f(b)-f(a)$. Historical impact: Solved classical geometry problems (area, volume) and enabled analysis of motions via solving $\int v(t)dt$ to get distance. Example: In 1761, Lagrange integrated the tautochrone problem differential equation to find the curve on which a bead slides under gravity to the bottom in equal time (result: cycloid). Evolution: Riemann formalized integration (partitions, sums), then Lebesgue generalized it to highly irregular functions. Modern integrals handle probabilities and divergences far beyond geometric areas.

  • Differential Equations & Series Solutions (18th–19th c.): Method: Formulate laws (in mechanics, etc.) as ODE/PDE and solve via power series or special functions. Signature use: Fourier’s heat equation solution by trigonometric series (1822) – representing temperature as an infinite sine series. Example: Bessel’s equation (from planetary motion problem) solved by series to define Bessel functions $J_\nu(x) = \sum_{m=0}^\infty \frac{(-1)^m}{m!\,\Gamma(m+\nu+1)}\left(\frac{x}{2}\right)^{2m+\nu}$. These functions became standard “analytical” tools. Evolution: Existence theory got systematic (Picard iteration, Peano existence in 1890s – early functional analysis). Later, qualitative methods (phase plane, stability – Poincaré, Lyapunov) complemented explicit solutions. Today, solving PDE often uses functional analysis (weak solutions, variational principles).

  • Fourier Analysis (19th c.): Method: Express periodic functions as sums of sines and cosines (or general functions via Fourier transform). Signature use: Fourier series $f(x) \sim \frac{a_0}{2} + \sum_{n=1}^\infty [a_n\cos(nx)+b_n\sin(nx)]$. Historical impact: Solved the heat equation; later used in signal processing (breaking signals into frequencies). Example: Dirichlet’s theorem: for a $2\pi$-periodic piecewise monotonic function, the Fourier series converges to $\frac{f(x+)+f(x-)}{2}$. Evolution: Riemann and Lebesgue clarified convergence with integrals. In 20th c., generalized to non-periodic via Fourier transform $\hat f(\xi)=\int_{-\infty}^\infty f(x)e^{-ix\xi}dx$. Fourier methods now solve PDEs (via separation of variables, spectral analysis) and analyze systems (filtering, convolution). Modern offshoots include wavelet analysis (localized Fourier) and time-frequency distributions.

  • Complex Contour Integration (19th c.): Method: Evaluate real integrals or sums using complex integrals and residues. Signature use: Cauchy’s Residue Theorem: $\oint_{\gamma} f(z)dz = 2\pi i \sum \operatorname{Res}(f, a_k)$ (sum of residues at poles inside $\gamma$). Historical impact: Revolutionized evaluation of definite integrals that were intractable by real methods, and solved summation problems (via contour summation). Example: To sum $\sum_{n=-\infty}^{\infty}\frac{1}{1+n^2}$, one uses a contour integral of $\pi\cot(\pi z)$ which has simple poles at integers, to get the sum equals $\pi\coth(\pi)$. Evolution: Complex analysis became indispensable in number theory (Tauberian theorems, analyticity of zeta function), in differential equations (monodromy, Laplace transform), and remains a key tool in mathematical physics (evaluating integrals in quantum field theory via contour deformation).

  • Epsilon-Delta Rigor (19th c.): Method: Formal definition of limits: $\lim_{x\to a}f(x)=L$ if for every $\epsilon>0$ there exists $\delta>0$ such that $0<|x-a|<\delta$ implies $|f(x)-L|<\epsilon$. Signature use: Weierstrass’s formal proofs that every continuous function on [a,b] achieves a maximum (using $\epsilon$-$\delta$ and sequential compactness arguments). Impact: Provided a foundation that resolved paradoxes and allowed confident extension of analysis to new contexts (like infinite-dimensional spaces later). Example: Heine (1872) gave formal $\epsilon$-$\delta$ definition of uniform continuity and proved that a continuous function on [a,b] is uniformly continuous (the Heine-Cantor theorem). Evolution: Epsilon-delta is still taught as basis of analysis, though advanced work often uses equivalent notions (open sets, sequences). The spirit persists: any new analytical concept (limit in metric spaces, convergence in distribution, etc.) uses analogous quantification for rigor.

  • Set-Theoretic Topology & Measure (Late 19th–20th c.): Method: Using sets and topology to generalize continuity, convergence, measure of size. Signature use: Borel $\sigma$-algebra and Lebesgue measure – measure is defined for all Borel sets with desired completeness properties. Impact: Extended analysis to highly irregular sets and functions; enabled probability theory to be rigorous. Example: A non-measurable set existence (Vitali 1903) – by assuming a choice of representatives mod 1 in [0,1), one forms a subset that no consistent measure can assign a value to. This showed the necessity of restricting to measurable sets in Lebesgue theory. Evolution: Today topology and measure theory are foundational in analysis: e.g. every functional analyst uses Borel sets; every discussion of continuity can be in terms of open sets. New notions like fractal dimension (Hausdorff measure) were created, bridging analysis and geometry.

  • Functional Analysis & Operator Theory (20th c.): Method: Treat functions as points in Banach/Hilbert spaces; apply linear algebra intuition to infinite dimensions. Signature use: The Spectral Theorem: a self-adjoint compact operator on a Hilbert space has an orthonormal basis of eigenvectors with real eigenvalues (Hilbert-Schmidt theory). Impact: Underlies quantum mechanics (operators as observables), stability analysis (spectrum of linearized operators), and many PDE solution methods (eigenfunction expansion). Example: Solving Laplace’s equation on a domain via eigenfunction expansion – by viewing Laplacian as an operator on $L^2$ of domain, find its eigenvalues $\lambda_n$ and eigenfunctions $\phi_n(x)$, then represent any solution as $u(x)=\sum c_n \phi_n(x)$. This approach generalizes Fourier series to irregular domains. Evolution: Functional analysis now includes Banach space geometry (isomorphic theory), $C^*$-algebras, etc. Techniques like Hahn-Banach (extension of functionals) and fixed-point theorems (Banach, Schauder) are staple tools to prove existence theorems in analysis.

  • Distribution Theory (mid-20th c.): Method: Extend notion of function to generalized functionals that allow differentiation and Fourier transform without classical convergence issues. Signature use: $\delta$ “function” satisfying $\int_{-\infty}^{\infty}\delta(x)\varphi(x)dx = \varphi(0)$ for all test functions $\varphi$. Differentiation: $\delta'(\varphi) = -\varphi'(0)$. Impact: Made it possible to solve PDEs like $u''=f$ even when $f$ is a distribution (modeling concentrated sources) – crucial in physics (Green’s functions, fundamental solutions). Example: The Heaviside step $H(x)$ is not differentiable in classical sense; in distributions, $H'(x) = \delta(x)$. This formalizes what engineers used informally. Evolution: Distribution theory is now standard in advanced analysis and PDE courses. It influenced other fields – e.g., number theory’s use of distributions in explicit formulas linking zeros of zeta to prime distributions, and geometry (currents in geometric measure theory, a generalization of distributions to vector-valued forms).

  • Singular Integrals & PDE (mid-late 20th c.): Method: Estimate convolution-type operators with kernels having singularity at 0 (e.g., Hilbert transform, Newtonian potential). Signature use: Calderón-Zygmund decomposition lemma to control such integrals on $L^p$. Impact: Provided robust a priori estimates for PDE solutions (ensuring boundedness of operators like Riesz transforms associated with Laplacian), and solved problems like Lusin’s conjecture on differentiability a.e. of integral functions. Example: Hilbert transform $Hf(x) = \text{p.v.}\frac{1}{\pi}\int f(y)\frac{1}{x-y}dy$ is bounded on $L^p(\mathbb{R})$ for $1<p<\infty$ – a nontrivial result from Calderón & Zygmund (1952) enabling one to infer, e.g., if $f\in L^p$, then $f$ has a well-defined “conjugate function” $Hf$. Evolution: This theory became part of standard harmonic analysis. It extended to non-Euclidean spaces (e.g., Stein’s work on Lie groups) and to nonlinear analogs. It remains active (recent progress on endpoints cases, etc.).

  • Microlocal & Fourier Integral Methods (late 20th c.): Method: Analyze functions in both space and frequency, tracking how singularities propagate. Signature use: Wave front set $\operatorname{WF}(u)$, which tells where and in what directions a distribution $u$ fails to be smooth. Impact: Revolutionized understanding of linear and nonlinear PDEs (especially hyperbolic equations). E.g., one can prove propagation of singularities along characteristic curves of a PDE (Hörmander 1971). Example: In tomography, microlocal analysis explains which singular features of an object can be reconstructed from X-ray projections – the foundation of CT scan mathematics. It uses that the Radon transform is a Fourier integral operator that sends singularities to singularities. Evolution: Microlocal analysis now enters many fields (semi-classical analysis in quantum chaos, inverse problems, even analytic number theory via the study of exponential sums as Fourier integral analogs).

  • Iterative Approximation & Computational Analysis (various eras): Method: Use successive approximations to reach a desired solution within an error tolerance. Historical examples: Newton’s method (17th c.) for solving equations by iterating $x_{n+1}=x_n - f(x_n)/f'(x_n)$. 20th c. example: Krylov subspace methods for large linear systems (like Conjugate Gradient, 1952) – iteratively minimize error. Impact: Though less glamorous in pure math, iterative methods enabled solving problems at scale in science and engineering. Analysis provides the convergence proofs and error bounds. Example: The Picard iteration for $y'=f(y)$: set $y_{n+1}(t)=y_0 + \int_0^t f(y_n(s))ds$. If $f$ is Lipschitz, $y_n$ converges to the unique solution. This uses Banach’s fixed-point theorem. Evolution: These methods are at the core of numerical analysis. Modern research ensures stability of algorithms (floating point rounding analysis is a form of analysis). With computers, “analysis” also comes to mean sensitivity analysis – how changes in input affect output, which is essentially derivative estimation in high dimensions (common in robust design, deep learning training which uses gradient descent, etc.).

Each method above not only solved certain problems but also shaped the perception of analysis in its time: e.g., power series gave analysis an algebraic flavor, epsilon-delta gave it a rigorous reputation, functional analysis gave it abstract power, and computational methods show its practicality.

Images one could associate (conceptually, since this is text-only): - Newton’s binomial series expansion graphically approximating $(1+x)^\alpha$. - A Cauchy epsilon-delta diagram on a limit. - A Fourier series approximating a square wave (Gibbs phenomenon). - A complex integral path bypassing poles for residue calculation. - A Cantor set construction (demonstrating a pathological set important in measure/category). - A visual of wavelet basis functions at different scales. - A phase space diagram illustrating chaos (analysis in dynamics). - These serve to illustrate the breadth of analysis techniques.

Syllabi & Textbook Dossier: Landmark Courses and Texts in Analysis Link to heading

Mathematical analysis has been transmitted through teaching as much as through research. Here we compare a selection of influential textbooks and curricula across eras, to see what “analysis” meant to each generation of students:

  • Euler’s Introductio in Analysin Infinitorum (1748): Content: Defined concept of function, covered infinite series (exponential, logarithmic, trigonometric in series form), introduced Euler’s formula $e^{ix}=\cos x + i\sin x$, and infinite product expansions (like $\sin x = x\prod (1-\frac{x^2}{n^2\pi^2})$). Pedagogical style: Heuristic, example-driven. No $\epsilon$-$\delta$; freely swapped series and limits. Significance: It essentially set the syllabus for “analysis” for a century: expansions, introduction of new transcendental functions, and techniques of series and product manipulation. It assumed algebra and geometry basics, pushing into new territory of the infinite.

  • Lacroix’s Calculus Texts (1797–1800): (Sylvestre Lacroix published comprehensive textbooks on differential and integral calculus, taught at École Polytechnique). Content: Synthesis of Euler, Lagrange, etc. Still used infinitesimals but more systematically; included many applications to geometry and mechanics. Significance: Trained the likes of Cauchy, Fourier in their student days. Represented the peak of pre-rigorous analysis pedagogy – very exhaustive and formal but not yet “rigorous” by later standards.

  • Cauchy’s Cours d’Analyse (1821): Content: Built from first principles: defined real numbers informally, rigorously defined limit, continuity[7], derivative, series convergence, etc. Excluded geometry or physical intuition, focusing purely on analysis concepts (the subtitle “Analyse Algébrique” indicates it was part I, dealing with series and algebraic analysis). Exercises: Provided in “Exercises de mathématiques” separate volumes – included counterexamples to show necessity of hypotheses (e.g., series rearrangement can diverge). Impact on curriculum: Established the template for analysis courses focusing on fundamental definitions and proofs. French and later global curricula adopted Cauchy’s rigor (though British lagged until later). It was a difficult book for students initially (breaking with intuitive teaching), but seminal in creating the notion of an “analysis course” as we know it.

  • William Thomas Brinkley’s Calculus (1850s, Cambridge) vs. Continental texts: British texts mid-19th century were still largely non-rigorous, focusing on calculation techniques and applications to geometry, with fluxion-style or limit intuitive arguments. In contrast, say, Heinrich Eduard Heine’s Vorlesungen über die Kugelfunctionen (1878) which included a solid chapter on Fourier series convergence (Heine’s theorem). This dichotomy shows by 1850, analysis curricula in Europe diverged: France/Germany emphasizing rigor and theory, England emphasizing applied computation. This gap closed after 1870s (with teachers like Hardy bringing rigorous analysis to Cambridge).

  • Weierstrass’s Lectures (1860s Berlin, transcribed by students): Not published officially until much later, but students’ notes circulated. Content: Epsilon-delta definitions from ground up, arithmetical theory of irrationals (though Dedekind published cuts first), heavy focus on power series and their radius, careful development of complex function theory. Classroom style: Legendary for clarity and rigor. Students like Mittag-Leffler, Hurwitz, etc., spread his approach in new courses across Europe (Mittag-Leffler in Sweden, for example, established modern analysis teaching there). Essentially, by 1880, any advanced analysis course would follow Weierstrass’s style: start with real number properties, sequences, series, then differentiation, etc., all rigorously.

  • G. H. Hardy’s A Course of Pure Mathematics (1908, UK): Content: First rigorous analysis text in English for undergrads. Covers limits, continuity, differentiability, Riemann integration, series, and touches on functions of several variables and the convergence of improper integrals. Style: Engaging and witty, but firmly epsilon-delta. Hardy explicitly aimed to align British teaching with continental rigor, down to quoting $\epsilon$-$\delta$ arguments. Influence: It revolutionized analysis teaching in the UK and Commonwealth, being the primary text for decades, and is still praised for its clear exposition. It basically created the template for a “first analysis course” in English (similar to what Courant did in the US slightly later with What is Mathematics? which included some analysis, and then others like Spivak or Apostol in mid 20th c.).

  • Emil Borel’s Leçons sur la théorie des fonctions (1898) & Lebesgue’s Leçons sur l’Integration (1904): These French course notes introduced measure and integrals shortly after their discovery. Borel’s lessons taught power series, analytic continuation, etc., but also first treatments of point set measure-zero ideas. Lebesgue’s text taught Lebesgue integration to a broader audience. These quickly made it into curricula as graduate or advanced undergraduate topics, starting in Paris and then worldwide as translation or adoption (e.g., Young’s 1910 English treatise on Lebesgue integration). They show how new research (measure theory) entered teaching within a decade – reflecting analysis’ fast consolidation of fundamentals.

  • Courant & John’s Introduction to Calculus and Analysis (1965, based on earlier German editions by Courant): Content: Balanced rigor and intuition. Covered basic analysis with applications to geometry and physics (Courant was an applied analyst as well). It didn’t emphasize measure theory (for basic calculus, it stuck to Riemann integrals but carefully), but included proofs of key theorems. Philosophy: “The purpose of mathematical analysis is not only to prove theorems, but also to solve problems” (Courant’s ethos). Impact: In the US, it provided an alternative to purely epsilon-delta texts by showing usefulness – influenced textbooks that followed in blending rigor with application (e.g., Apostol’s Calculus (1967) included linear algebra and some proofs, or later texts like Strang’s more applied calculus).

  • Walter Rudin’s Principles of Mathematical Analysis (1st ed. 1953): Content: Succinct, axiomatic development of $\mathbb{R}$ (via completeness axiom), sequences, series, continuity, differentiation, Riemann-Stieltjes integration, sequences of functions (uniform convergence), multivariable analysis and implicit function theorem, and an introduction to metric spaces. Style: Very concise and rigorous (nicknamed “Baby Rudin”). Influence: Became a standard in many US universities for first-year graduate analysis or strong undergrad courses. It epitomized mid-20th-century Bourbaki-influenced pedagogy – theorem-proof style with minimal examples. While sometimes criticized for being too abstract for beginners, it trained countless analysts.

  • Nicolas Bourbaki’s Éléments (especially the books on Topology, Integration, Functions of Real Variable, 1940s–60s): Content: Extremely abstract and general – for example, integration theory for abstract measure spaces, not just $\mathbb{R}^n$; general topological vector spaces. Use in curricula: Rarely used directly as textbooks, but influenced the authors of textbooks (like Rudin, Dieudonné). Bourbaki made analysis structural: e.g., filters and nets instead of sequences, a heavy categorical tone. Result: A generation of teachers in France and elsewhere taught analysis more rigorously and abstractly than before. In the long term, Bourbaki’s approach was moderated by more example-driven texts, but the emphasis on structure (e.g., metric space generalizations in undergrad analysis) stuck.

  • Elias Stein & Rami Shakarchi’s Princeton Lectures in Analysis (2003): Four-volume series covering Fourier analysis, complex analysis, real analysis (measure theory), and functional analysis, aimed at advanced undergrads/beginning grads. Content: Modern, integrating things like distribution theory earlier than usual, and many connections (e.g., probability in measure theory volume). Pedagogy: Clear proofs but also many examples and problem sets bridging theory and application. Significance: Representative of a 21st-century approach – not simplifying the content but making it accessible and showing unity among subfields. It’s somewhat a response to earlier fragmentation (by having one coordinated series).

  • Online and Active Learning (2020s): Today, many courses use open resources like Terrence Tao’s Analysis I,II notes (which are free online, covering standard analysis with Tao’s problem-solving insight) or video lectures (MIT’s OCW for Real Analysis following Rudin or Strang’s Applied Math perspective). The trend is blending computational examples (plotting Fourier series on computer, etc.) to deepen understanding. The core content is still what Cauchy/Weierstrass codified, plus Lebesgue. One new component is more linear algebra and multi-variable emphasis early, reflecting the needs of applied fields (this was anticipated by Apostol and others who integrated linear algebra into calculus).

In sum, the curriculum of analysis evolved from intuitive problem-solving (18th c.) to rigorous epsilon-delta foundations (19th c.) to abstract general theory (mid-20th c.) and now to a balanced approach highlighting both theory and myriad applications (21st c.). Textbooks have both driven and reflected this evolution, each era’s seminal books shaping how analysis was understood by the new generation.

Institutional Map: How “Analysis” Became a Defined Domain Link to heading

Finally, we consider the institutional frameworks – universities, departments, journals, and societies – that have carved out “analysis” as a distinct domain, and sometimes policed its boundaries:

  • Academies and Early Journals (18th–19th c.): The rise of national science academies (Paris, St. Petersburg, Berlin) in the 18th century gave analysts patronage and outlets. Euler’s many papers in the Petersburg Academy Memoirs spread analytic results widely. The founding of journals like Crelle’s Journal (Journal für die reine und angewandte Mathematik, 1826) specifically encouraged analysis papers: its first volumes feature Abel (analysis of series) and Dirichlet (analytic number theory), etc. The name “pure and applied” already implies analysis as central (applied analysis was basically applied math/physics).

  • University Chairs: In the 19th century, universities began having chairs labeled with subjects. For example:

  • The École Polytechnique in Paris had a chair in “Analysis and Mechanics” (Fourier held it, then Coriolis, etc.), distinctly separate from “Geometry” or “Astronomy.”

  • Cambridge’s Sadleirian Chair of Pure Mathematics (est. 1860) was first held by Arthur Cayley – his work was more algebraic, but under him and later G. H. Hardy (1931–1942) it turned into an analysis-focused position (Hardy championed rigorous analysis in Cambridge).

  • In Germany, by late 19th c., one sees positions of “Professor of Higher Analysis” or similar, especially at technical universities.

These chairs indicate analysis was seen as a distinct subject to be taught and advanced.

  • Seminars and Schools:
  • Berlin University (Weierstrass’ seminar): Weierstrass held a famed analysis seminar in the 1860s–80s that drew students internationally (Mittag-Leffler from Sweden, etc.). Berlin became a mecca for analysis (with Weierstrass, Kummer, and Kronecker all there – though Kronecker was anti-Cantor, showing internal disputes).
  • Göttingen (Hilbert’s school): Around 1900, Göttingen was top in many fields. Hilbert’s analysis seminar and lectures on integral equations (1905–1910) propelled functional analysis. His collaboration with Richard Courant led to the influential Methods of Mathematical Physics text bridging pure and applied analysis.

Other notable hubs: - Paris (École Normale/College de France): Had Hadamard, Montel, Lebesgue, etc. teaching analysis. The Seminaire Hadamard in 1920s was a big deal. - Warsaw/Lwów (Poland between wars): Lwów’s Scottish Café (Banach, Ulam) was informal but highly productive in functional analysis. - Moscow (Luzin’s school): In early 20th c., Nikolai Luzin led the “Luzitania” group, developing descriptive set theory and real function theory (the Russian school of analysis including Sobolev later). They had Moscow university seminars, out of which came major contributions (e.g., Sobolev spaces, the Lebesgue differentiation theorem proved by his student Lebesgue independently, etc.).

  • Departmental Splits:

  • In the 20th c., some universities split pure vs applied math. For example, in the US, an "Applied Mathematics" division might handle numerical analysis, whereas “Mathematics” dept covers pure analysis. However, many applied analysis folks remained integrated in math departments.

  • Some places split by subject: e.g., separate “Department of Computational and Applied Math” (like at some tech institutes).

  • But analysis broadly understood often spans these divisions; e.g., an applied math program might still rely on functional analysis (for PDE), while a pure math program might collaborate with an engineering one on wavelet research.

  • Professional Societies and Conferences:

  • International Congress of Mathematicians (ICM): From its start (1897), analysis topics have been prominent. E.g., 1908 ICM: Had keynote talks by Pringsheim on divergent series (analysis) and by Émile Picard on entire functions (analysis). Fields medals related to analysis have been frequent (some listed earlier: Schwartz, Hörmander, Caffarelli recently, etc.).

  • Societies: The London Mathematical Society (founded 1865) was dominated by analysts in early decades (since British algebraists were weaker then), publishing many calculus and differential eqn works. The American Mathematical Society (founded 1888) similarly had analysis heavy content in Transactions. Over time, they formed special interest groups: e.g., AMS and SIAM both have “Analysis of PDE” or “Orthogonal Polynomials” conferences – showing sub-areas in analysis coalescing.

  • Journals Specializing: By mid-20th century, the proliferation of journals mirrored specialization.

  • Annals of Mathematics (Princeton) often published analysis breakthroughs (e.g., Carleson’s theorem on Fourier series).

  • Journal d’Analyse Mathématique (Jerusalem, founded 1951 by Gelbart, Piatetski-Shapiro, etc.) explicitly for analysis.

  • Journal of Functional Analysis (1967, R. S. Phillips first editor) carved out functional analysis as a distinct area requiring its own journal.

  • SIAM Journal on Mathematical Analysis (1968) aimed at applied analysis.

  • Inventiones Mathematicae often carried major analysis papers (Fefferman’s work, etc.) along with algebraic ones.

This journal landscape shows analysis both broad and subdivided; still, generalist journals like Annals, Acta, etc., have a significant fraction of content that’s analysis-related, reflecting analysis as central.

  • Bourbaki and Institutional Influence: Bourbaki (mostly French analysts like Dieudonné, Schwartz in early membership) had a big say in postwar curriculum worldwide. They helped set standards: e.g. recommending measure theory be taught before Riemann integration (influencing texts like Dieudonné’s Foundations of Modern Analysis, 1960). Some universities (especially in Europe) adopted a very Bourbaki style in the 50s–70s (abstract first). By late 20th c., curricula softened to include more concrete context.

  • Cross-Disciplinary Institutes:

  • The Courant Institute at NYU (founded 1935 by Richard Courant) explicitly bridged pure and applied analysis focusing on PDE, mathematical physics, and computation. It became a model of an institutional home for analysis with real-world ties (Courant alumni include Peter Lax, Louis Nirenberg).

  • The Institute for Advanced Study (IAS) in Princeton often had analysis programs (with figures like Atle Selberg or Elias Stein guiding).

  • MSRI in Berkeley held semester programs like “Harmonic Analysis and Applications” (2004) etc., strengthening networks among analysts and between analysis and other areas (like computer science in cryptography or string theory in physics, which have analysis content).

  • In the Soviet Union, “V.A. Steklov Institute of Mathematics” in Moscow and similar institutes in Leningrad were hubs where analysis, particularly in PDE and mathematical physics, was heavily pursued.

  • Analytical vs. Abstract Culture: Historically, some universities became known for emphasis in certain approaches. For instance:

  • Paris (École Normale) under Hadamard was known for classical analysis and PDE (Hadamard solved PDE, created method of characteristics).

  • Cambridge, until Hardy, was more applied (the Tripos problems were computational), after Hardy became very rigorous.

  • Chicago under Zygmund and later Stein was harmonic analysis powerhouse.

  • Leningrad (St. Petersburg) under Tikhonov and Sobolev had a strong applied analysis/numerical PDE focus.

  • UCLA and Texas nowadays are known for nonlinear PDE and applied analysis groups (with Fields medalists like Tao, or experts like Luis Caffarelli at Texas).

These “cultures” shape how analysis is defined locally – e.g., an “analysis qualifying exam” at one place might lean to real analysis measure theory, another might include complex, another might include functional analysis basics.

  • Interdisciplinary boundaries: There's sometimes friendly tension: for example, number theorists vs. analysts regarding approaches (an analytic number theorist bridging both, versus a pure algebraic number theorist who might consider analysis an ancillary tool). Or combinatorics vs. analysis (like Szemerédi’s theorem had both an ergodic proof and a purely combinatorial proof by Gowers; which is considered more enlightening can depend on departmental tradition). But increasingly, these boundaries are porous, with analysis techniques infiltrating many fields.

  • Funding and influence: In mid-20th-century Cold War, analysis (especially applied) got big government support due to its role in computing, physics, engineering – e.g., RAND Corporation employed analysts for fluid dynamics, etc. The Clay Millennium Prizes announced in 2000 include analytic problems (Navier-Stokes, Riemann Hypothesis) showing that the top unsolved problems have analysis at their core, guiding funding to those areas.

In conclusion, the institutional “carving” of analysis shows it growing from an aspect of mathematics everyone used (Euler’s time) to a well-defined scholarly specialty taught via dedicated courses and advanced via specialized seminars and journals. Yet, analysis always maintained bridges outward – consistent with its foundational nature. Many institutions foster interplay (e.g., an analysis seminar might welcome a logic talk if it involves computable analysis, or a probability talk on stochastic PDE). The “policing” of analysis is usually only in ensuring standards of rigor or style within analysis (like journals expecting certain conditions in theorems, seminars focusing on typical analysis topics), but the field’s interfaces are generally open.

Bibliography Link to heading

(Below we separate primary sources (historical documents by mathematicians) and secondary sources (scholarly analyses, histories) that informed this account, with brief annotations.)

Primary Sources: Link to heading

  • Newton, I. (1671, pub. 1711). Analysis per Æquationes Numero Terminorum Infinitas. – Newton’s tract on infinite series (in Latin). Gave early usage of “analysis” in title. Showcases Newton’s infinite series methods.

  • Leibniz, G.W. (1684). Nova Methodus pro Maximis et Minimis. – Leibniz’s first calculus publication (in Latin, Acta Eruditorum). Does not explicitly say “analysis,” but epitomizes infinitesimal analysis introduction to Europe.

  • Euler, L. (1748). Introductio in Analysin Infinitorum. – In Latin; English translation by J. Blanton (1988). Euler defines analysis as study of the infinite and of functions, presents fundamental expansions.

  • Lagrange, J-L. (1788). Mécanique Analytique. – French. No figures approach to mechanics. Preface boasts of using “analytical methods” exclusively.

  • Cauchy, A.-L. (1821). Cours d’Analyse de l’École Royale Polytechnique. – French. First pages define continuity and limit[1]. A foundational primary document for rigorous analysis.

  • Dirichlet, P.G. (1829). Sur la convergence des séries trigonométriques... – In French. Landmark memoir proving conditions for Fourier series convergence, with the modern notion of function (each $x$ has a unique $y$).

  • Bolzano, B. (1817, pub. 1848). Rein analytischer Beweis... (Pure analytic proof of the theorem that between any two values with opposite signs there lies at least one root). – German. Anticipated $\epsilon$-$\delta$ arguments and gave the intermediate value property proof, unknown until later.

  • Weierstrass, K. (Lectures, 1860s, published in Mathematische Werke). – German. Vorlesungen über die Theorie der Abelschen Transcendenten (complex analysis) and notes on real analysis by students (e.g., Thomae, Grundlagen der ebenen AN). These contain formal $\epsilon$-$\delta$ definitions as remembered by pupils.

  • Cantor, G. (1874). Über eine Eigenschaft des Inbegriffes aller reellen algebraischen Zahlen. – German. Proved $\mathbb{R}$ is uncountable using a diagonal argument. Genesis of set theory from analysis context.

  • Dedekind, R. (1872). Stetigkeit und irrationale Zahlen. – German. Laid out Dedekind cuts for reals. Ensured completeness axiom for analysis.

  • Lebesgue, H. (1902). Intégrale, longueur, aire. – French thesis. Defines measure and integral. Later expanded in his 1904 book Leçons.... A primary source establishing modern integration.

  • Hilbert, D. (1900). “Mathematical Problems”. – Lecture transcribed in English in Bull. AMS (1902). Problems 1 (continuum hypothesis), 6 (axiomatize physics), 7 (irrationality/transcendence), 8 (Riemann Hypothesis) all relate to analysis or its foundation. Shows what counted as core math at 1900 included many analytical questions.

  • Frechet, M. (1906). Sur quelques points du calcul fonctionnel. – French. Defines metric spaces abstractly[6]. Beginning of general topology and functional analysis.

  • Hardy, G.H. (1908). A Course of Pure Mathematics. – English textbook, multiple eds. The preface and intro lay out his vision of rigor and purity in analysis for beginners. Shaped English education in analysis.

  • Bourbaki, N. (1939–1969). Éléments de Mathématique, several volumes (Topologie, Fonctions d’une variable réelle, Intégration, Espaces vectoriels topologiques). – French. A collective primary source for mid-century formalization. E.g., Intégration (1965) systematically builds Lebesgue integration abstractly, EVT (Topological Vector Spaces, 1953) for functional analysis.

  • Schwartz, L. (1945). Théorie des distributions (published 1950). – French. First comprehensive treatise of distribution theory by its creator. Fields medal lecture (1950) also outlines motivation. This allowed analysis to incorporate generalized functions rigorously.

  • Kolmogorov, A.N. (1933). Grundbegriffe der Wahrscheinlichkeitsrechnung. – German. Axiomatic foundation of probability. A primary reference for measure-theoretic probability in analysis.

  • Calderón, A.P., & Zygmund, A. (1952). “On the Existence of Certain Singular Integrals”. – Acta Math. 88:85-139. Solves $L^2$ boundedness of singular integrals. Key paper in modern harmonic analysis.

  • Hörmander, L. (1965). “Linear differential operators”, and (1971) “Fourier integral operators I”. – Developed pseudo-differential and Fourier integral operator theory for PDE. Primary sources for microlocal ideas.

(Also relevant primary works: Riemann’s 1859 paper on primes, Fourier’s 1822 book, etc., though due to space I have listed more representative ones.)

Secondary Sources (Histories & Analyses): Link to heading

  • Boyer, C. (1959). The History of the Calculus and Its Conceptual Development. – Classic account of ideas from Greek “method of exhaustion” to Cauchy’s rigor. Emphasizes evolving meaning of infinitesimal analysis.

  • Grabiner, J. (1981). The Origins of Cauchy’s Rigorous Calculus. – Analyzes how Cauchy’s approach arose from prior work (Lagrange, Euler) and its reception[1].

  • Hawkins, T. (1975). Lebesgue’s Theory of Integration: Its Origins and Development. – Details the path from Riemann to Borel to Lebesgue, covering the measure theory genesis.

  • Kline, M. (1972). Mathematical Thought from Ancient to Modern Times. – Broad history; chapters on “The rise of analysis” (Newton to 18th c.), “The drive toward rigor” (19th c.) give context to analysis vs other fields and cultural shifts.

  • Katz, V. (1986). A History of Mathematics: An Introduction. – A well-regarded textbook-like history. The sections on 19th-century analysis and 20th-century (Hilbert to present) are concise and informative, connecting developments to education.

  • Dauben, J. (1979). Georg Cantor: His Mathematics and Philosophy of the Infinite. – Deep dive into Cantor’s motivations (Fourier problems) and how set theory changed analysis.

  • Rodriguez, C., ed. (1999). History of Functional Analysis. – Collection of articles on the emergence of functional analysis (e.g., role of Hilbert, Banach). Useful for institutional side of things (Lwów school etc.).

  • Siegmund-Schultze, R. (2003). Mathematicians Fleeing from Nazi Germany: Individual Fates and Global Impact. – Contains case studies (Courant, von Neumann, Noether) illustrating the transplantation of European analysis tradition to the US, influencing war and postwar development (like at Courant Institute).

  • Steele, J. (2005). The Cauchy-Schwarz Master Class. – While about an inequality, it gives historical commentary on analysis techniques (integral inequalities from Cauchy to modern times) – reflects how a concept traveled across subfields.

  • Nachmanson, E., et al. (2002). “Who Gave You the Epsilon? Cauchy and Weierstrass” (UMAP Journal). – A pedagogical article exploring how rigor developed, explaining differences between Cauchy and Weierstrass approaches clearly, with original quotes.

  • Aczel, A. (2007). The Artist and the Mathematician. – Semi-popular but historically informative account of Bourbaki’s influence on modern math, including analysis education mid-20th century.

  • Siegel, D. (2004). “Rigor and Expediency: Applied Mathematics in the Joseph Henry Era”. – Discusses 19th-century American math, showing analysis considered less “pure” at times in US until later. Good for analysis vs applied context in institutions.

  • Jahnke, H. N. (ed.) (2003). A History of Analysis. – A comprehensive scholarly collection covering analysis from ancient Greeks to 20th century, with chapters by experts (e.g., Klein on 19th c rigor, Siegmund-Schultze on 20th c functional analysis). A key secondary source that aligns with our periodization.

  • Manders, K., & Moktefi, A. (2017). “Descartes, Leibniz, and Newton on Analysis and Synthesis.” – In Traditions of Analysis and Synthesis (Springer). Illuminates how each early modern figure interpreted analysis.

  • Grattan-Guinness, I. (1990). The Rainbow of Mathematics. – A broad history; has insightful parts on analysis in context of other fields and society (e.g., how analysis was taught in different countries).

(Together, these sources underpin the narrative and analysis given in this report, evidencing the evolving semantics and institutionalization of “analysis.”)


[1] [7] Cours d'analyse - Wikipedia

https://en.wikipedia.org/wiki/Cours_d%27analyse

[2] Mathematical analysis - Wikipedia

https://en.wikipedia.org/wiki/Mathematical_analysis

[3] Laurent Schwartz - Wikipedia

https://en.wikipedia.org/wiki/Laurent_Schwartz

[4] [6] Mathematical Analysis at University | SpringerLink

https://link.springer.com/chapter/10.1007/978-3-030-76791-4_23?error=cookies_not_supported&code=e7fa3e38-daaa-41d4-8dc4-ce542a7b0d4d

[5] Descartes, Leibniz, and Newton on Analysis and Synthesis | SpringerLink

https://link.springer.com/chapter/10.1007/978-3-031-76398-4_7