Graph theory

From Wikipedia, the free encyclopedia

Contents

1

Computer science

1

1.1

History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

1

1.1.1

Contributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

4

Philosophy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

5

1.2.1

Name of the ﬁeld . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

5

Areas of computer science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

6

1.3.1

Theoretical computer science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

7

1.3.2

Applied computer science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

8

1.4

The great insights of computer science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

10

1.5

Academia . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

11

1.5.1

Conferences . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

11

1.5.2

Journals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

11

1.6

Education . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

11

1.7

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

12

1.8

Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

13

1.9

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

13

1.10 Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

15

1.11 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

16

Discrete mathematics

18

2.1

Grand challenges, past and present . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

19

2.2

Topics in discrete mathematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

19

2.2.1

Theoretical computer science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

19

2.2.2

Information theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

21

2.2.3

Logic . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

21

2.2.4

Set theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

21

2.2.5

Combinatorics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

23

2.2.6

Graph theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

23

2.2.7

Probability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

24

2.2.8

Number theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

24

2.2.9

Algebra . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

25

2.2.10 Calculus of ﬁnite diﬀerences, discrete calculus or discrete analysis . . . . . . . . . . . . . .

25

2.2.11 Geometry . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

25

1.2

1.3

2

i

ii

3

CONTENTS

2.2.12 Topology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

25

2.2.13 Operations research . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

25

2.2.14 Game theory, decision theory, utility theory, social choice theory . . . . . . . . . . . . . .

26

2.2.15 Discretization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

27

2.2.16 Discrete analogues of continuous mathematics . . . . . . . . . . . . . . . . . . . . . . . .

27

2.2.17 Hybrid discrete and continuous mathematics . . . . . . . . . . . . . . . . . . . . . . . . .

27

2.3

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

28

2.4

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

28

2.5

Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

28

2.6

External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

29

Glossary of graph theory

30

3.1

Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

30

3.1.1

Subgraphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

32

3.1.2

Walks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

33

3.1.3

Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

34

3.1.4

Cliques . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

34

3.1.5

Strongly connected component . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

35

3.1.6

Hypercubes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

36

3.1.7

Knots . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

36

3.1.8

Minors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

36

3.1.9

Embedding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

37

Adjacency and degree . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

37

3.2.1

Independence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

38

3.3

Complexity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

38

3.4

Connectivity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

38

3.5

Distance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

39

3.6

Genus . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

39

3.7

Weighted graphs and networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

40

3.8

Direction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

40

3.8.1

Directed acyclic graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

41

Colouring . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

41

3.10 Various . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

42

3.11 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

42

3.12 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

42

Graph (mathematics)

44

4.1

Deﬁnitions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

45

4.1.1

Graph . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

45

4.1.2

Adjacency relation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

45

Types of graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

45

4.2.1

45

3.2

3.9

4

4.2

Distinction in terms of the main deﬁnition . . . . . . . . . . . . . . . . . . . . . . . . . .

CONTENTS

4.2.2

5

iii

Important graph classes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

47

4.3

Properties of graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

48

4.4

Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

49

4.5

Important graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

50

4.6

Operations on graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

50

4.7

Generalizations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

51

4.8

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

51

4.9

Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

52

4.10 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

52

4.11 Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

53

4.12 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

53

Graph theory

54

5.1

Deﬁnitions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

55

5.1.1

Graph . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

55

5.2

Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

55

5.3

History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

57

5.4

Graph drawing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

58

5.5

Graph-theoretic data structures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

58

5.6

Problems in graph theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

59

5.6.1

Enumeration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

59

5.6.2

Subgraphs, induced subgraphs, and minors . . . . . . . . . . . . . . . . . . . . . . . . . .

59

5.6.3

Graph coloring . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

59

5.6.4

Subsumption and uniﬁcation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

60

5.6.5

Route problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

60

5.6.6

Network ﬂow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

60

5.6.7

Visibility problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

60

5.6.8

Covering problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

60

5.6.9

Decomposition problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

61

5.6.10 Graph classes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

61

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

61

5.7.1

Related topics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

61

5.7.2

Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

62

5.7.3

Subareas . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

63

5.7.4

Related areas of mathematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

63

5.7.5

Generalizations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

63

5.7.6

Prominent graph theorists . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

63

5.8

Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

64

5.9

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

65

5.10 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

65

5.10.1 Online textbooks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

65

5.7

iv

6

7

CONTENTS

Loop (graph theory)

66

6.1

Degree . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

67

6.2

Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

67

6.3

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

67

6.4

External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

67

6.5

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

67

Mathematics

69

7.1

History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

70

7.1.1

Evolution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

70

7.1.2

Etymology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

72

Deﬁnitions of mathematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

73

7.2.1

Mathematics as science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

73

7.3

Inspiration, pure and applied mathematics, and aesthetics . . . . . . . . . . . . . . . . . . . . . . .

76

7.4

Notation, language, and rigor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

77

7.5

Fields of mathematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

77

7.5.1

Foundations and philosophy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

78

7.5.2

Pure mathematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

79

7.5.3

Applied mathematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

81

7.6

Mathematical awards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

81

7.7

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

82

7.8

Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

82

7.9

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

84

7.10 Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

85

7.11 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

85

Matrix (mathematics)

87

8.1

Deﬁnition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

88

8.1.1

Size . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

88

8.2

Notation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

89

8.3

Basic operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

89

8.3.1

Addition, scalar multiplication and transposition . . . . . . . . . . . . . . . . . . . . . . .

89

8.3.2

Matrix multiplication . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

90

8.3.3

Row operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

91

8.3.4

Submatrix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

91

8.4

Linear equations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

92

8.5

Linear transformations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

92

8.6

Square matrices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

92

8.6.1

Main types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

93

8.6.2

Main operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

95

8.7

Computational aspects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

96

8.8

Decomposition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

97

7.2

8

CONTENTS

v

8.9

Abstract algebraic aspects and generalizations . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

98

8.9.1

Matrices with more general entries . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

98

8.9.2

Relationship to linear maps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

99

8.9.3

Matrix groups . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

99

8.9.4

Inﬁnite matrices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100

8.9.5

Empty matrices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100

8.10 Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101

8.10.1 Graph theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101

8.10.2 Analysis and geometry . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101

8.10.3 Probability theory and statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102

8.10.4 Symmetries and transformations in physics . . . . . . . . . . . . . . . . . . . . . . . . . . 103

8.10.5 Linear combinations of quantum states . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104

8.10.6 Normal modes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104

8.10.7 Geometrical optics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105

8.10.8 Electronics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105

8.11 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105

8.11.1 Other historical usages of the word “matrix” in mathematics . . . . . . . . . . . . . . . . . 106

8.12 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106

8.13 Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107

8.14 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110

8.14.1 Physics references . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112

8.14.2 Historical references . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113

8.15 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113

9

Vertex (graph theory)

115

9.1

Types of vertices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116

9.2

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116

9.3

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116

9.4

External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116

9.5

Text and image sources, contributors, and licenses . . . . . . . . . . . . . . . . . . . . . . . . . . 117

9.5.1

Text . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117

9.5.2

Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122

9.5.3

Content license . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128

Chapter 1

Computer science

Computer science deals with the theoretical foundations of information and computation, together with practical

techniques for the implementation and application of these foundations

Computer science is the scientiﬁc and practical approach to computation and its applications. It is the systematic

study of the feasibility, structure, expression, and mechanization of the methodical procedures (or algorithms) that

underlie the acquisition, representation, processing, storage, communication of, and access to information, whether

such information is encoded as bits in a computer memory or transcribed in genes and protein structures in a biological

cell.[1] An alternate, more succinct deﬁnition of computer science is the study of automating algorithmic processes

that scale. A computer scientist specializes in the theory of computation and the design of computational systems.[2]

Its subﬁelds can be divided into a variety of theoretical and practical disciplines. Some ﬁelds, such as computational

complexity theory (which explores the fundamental properties of computational and intractable problems), are highly

abstract, while ﬁelds such as computer graphics emphasize real-world visual applications. Still other ﬁelds focus on the

challenges in implementing computation. For example, programming language theory considers various approaches

to the description of computation, while the study of computer programming itself investigates various aspects of

the use of programming language and complex systems. Human–computer interaction considers the challenges in

making computers and computations useful, usable, and universally accessible to humans.

1.1 History

Main article: History of computer science

The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating ﬁxed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division. Further, algorithms for performing computations have existed since

antiquity, even before sophisticated computing equipment were created. The ancient Sanskrit treatise Shulba Sutras,

or “Rules of the Chord”, is a book of algorithms written in 800 BCE for constructing geometric objects like altars

using a peg and chord, an early precursor of the modern ﬁeld of computational geometry.

Blaise Pascal designed and constructed the ﬁrst working mechanical calculator, Pascal’s calculator, in 1642.[3] In

1673 Gottfried Leibniz demonstrated a digital mechanical calculator, called the 'Stepped Reckoner'.[4] He may be

considered the ﬁrst computer scientist and information theorist, for, among other reasons, documenting the binary

number system. In 1820, Thomas de Colmar launched the mechanical calculator industry[note 1] when he released

his simpliﬁed arithmometer, which was the ﬁrst calculating machine strong enough and reliable enough to be used

daily in an oﬃce environment. Charles Babbage started the design of the ﬁrst automatic mechanical calculator, his

diﬀerence engine, in 1822, which eventually gave him the idea of the ﬁrst programmable mechanical calculator, his

Analytical Engine.[5] He started developing this machine in 1834 and “in less than two years he had sketched out

many of the salient features of the modern computer. A crucial step was the adoption of a punched card system

derived from the Jacquard loom”[6] making it inﬁnitely programmable.[note 2] In 1843, during the translation of a

French article on the analytical engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm

to compute the Bernoulli numbers, which is considered to be the ﬁrst computer program.[7] Around 1885, Herman

Hollerith invented the tabulator, which used punched cards to process statistical information; eventually his company

1

2

CHAPTER 1. COMPUTER SCIENCE

Charles Babbage is credited with inventing the ﬁrst mechanical computer.

became part of IBM. In 1937, one hundred years after Babbage’s impossible dream, Howard Aiken convinced IBM,

which was making all kinds of punched card equipment and was also in the calculator business[8] to develop his giant

programmable calculator, the ASCC/Harvard Mark I, based on Babbage’s analytical engine, which itself used cards

and a central computing unit. When the machine was ﬁnished, some hailed it as “Babbage’s dream come true”.[9]

During the 1940s, as new and more powerful computing machines were developed, the term computer came to

refer to the machines rather than their human predecessors.[10] As it became clear that computers could be used for

more than just mathematical calculations, the ﬁeld of computer science broadened to study computation in general.

Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s.[11][12] The

world’s ﬁrst computer science degree program, the Cambridge Diploma in Computer Science, began at the University

of Cambridge Computer Laboratory in 1953. The ﬁrst computer science degree program in the United States was

formed at Purdue University in 1962.[13] Since practical computers became available, many applications of computing

have become distinct areas of study in their own rights.

Although many initially believed it was impossible that computers themselves could actually be a scientiﬁc ﬁeld of

study, in the late ﬁfties it gradually became accepted among the greater academic population.[14][15] It is the now wellknown IBM brand that formed part of the computer science revolution during this time. IBM (short for International

Business Machines) released the IBM 704[16] and later the IBM 709[17] computers, which were widely used during the

exploration period of such devices. “Still, working with the IBM [computer] was frustrating ... if you had misplaced

as much as one letter in one instruction, the program would crash, and you would have to start the whole process

1.1. HISTORY

3

Ada Lovelace is credited with writing the ﬁrst algorithm intended for processing on a computer.

over again”.[14] During the late 1950s, the computer science discipline was very much in its developmental stages,

and such issues were commonplace.[15]

Time has seen signiﬁcant improvements in the usability and eﬀectiveness of computing technology. Modern society

has seen a signiﬁcant shift in the users of computer technology, from usage only by experts and professionals, to

a near-ubiquitous user base. Initially, computers were quite costly, and some degree of human aid was needed for

4

CHAPTER 1. COMPUTER SCIENCE

eﬃcient use - in part from professional computer operators. As computer adoption became more widespread and

aﬀordable, less human assistance was needed for common usage.

1.1.1

Contributions

The German military used the Enigma machine (shown here) during World War II for communications they wanted kept secret. The

large-scale decryption of Enigma traﬃc at Bletchley Park was an important factor that contributed to Allied victory in WWII.[18]

1.2. PHILOSOPHY

5

Despite its short history as a formal academic discipline, computer science has made a number of fundamental

contributions to science and society - in fact, along with electronics, it is a founding science of the current epoch of

human history called the Information Age and a driver of the Information Revolution, seen as the third major leap

in human technological progress after the Industrial Revolution (1750-1850 CE) and the Agricultural Revolution

(8000-5000 BCE).

These contributions include:

• The start of the "digital revolution", which includes the current Information Age and the Internet.[19]

• A formal deﬁnition of computation and computability, and proof that there are computationally unsolvable and

intractable problems.[20]

• The concept of a programming language, a tool for the precise expression of methodological information at

various levels of abstraction.[21]

• In cryptography, breaking the Enigma code was an important factor contributing to the Allied victory in World

War II.[18]

• Scientiﬁc computing enabled practical evaluation of processes and situations of great complexity, as well as

experimentation entirely by software. It also enabled advanced study of the mind, and mapping of the human genome became possible with the Human Genome Project.[19] Distributed computing projects such as

Folding@home explore protein folding.

• Algorithmic trading has increased the eﬃciency and liquidity of ﬁnancial markets by using artiﬁcial intelligence,

machine learning, and other statistical and numerical techniques on a large scale.[22] High frequency algorithmic

trading can also exacerbate volatility.[23]

• Computer graphics and computer-generated imagery have become ubiquitous in modern entertainment, particularly in television, cinema, advertising, animation and video games. Even ﬁlms that feature no explicit CGI

are usually “ﬁlmed” now on digital cameras, or edited or postprocessed using a digital video editor.

• Simulation of various processes, including computational ﬂuid dynamics, physical, electrical, and electronic

systems and circuits, as well as societies and social situations (notably war games) along with their habitats,

among many others. Modern computers enable optimization of such designs as complete aircraft. Notable

in electrical and electronic circuit design are SPICE, as well as software for physical realization of new (or

modiﬁed) designs. The latter includes essential design software for integrated circuits.

• Artiﬁcial intelligence is becoming increasingly important as it gets more eﬃcient and complex. There are many

applications of AI, some of which can be seen at home, such as robotic vacuum cleaners. It is also present in

video games and on the modern battleﬁeld in drones, anti-missile systems, and squad support robots.

1.2 Philosophy

Main article: Philosophy of computer science

A number of computer scientists have argued for the distinction of three separate paradigms in computer science.

Peter Wegner argued that those paradigms are science, technology, and mathematics.[24] Peter Denning's working

group argued that they are theory, abstraction (modeling), and design.[25] Amnon H. Eden described them as the

“rationalist paradigm” (which treats computer science as a branch of mathematics, which is prevalent in theoretical

computer science, and mainly employs deductive reasoning), the “technocratic paradigm” (which might be found in

engineering approaches, most prominently in software engineering), and the “scientiﬁc paradigm” (which approaches

computer-related artifacts from the empirical perspective of natural sciences, identiﬁable in some branches of artiﬁcial

intelligence).[26]

1.2.1

Name of the ﬁeld

Although ﬁrst proposed in 1956,[15] the term “computer science” appears in a 1959 article in Communications of the

ACM,[27] in which Louis Fein argues for the creation of a Graduate School in Computer Sciences analogous to the

6

CHAPTER 1. COMPUTER SCIENCE

creation of Harvard Business School in 1921,[28] justifying the name by arguing that, like management science, the

subject is applied and interdisciplinary in nature, while having the characteristics typical of an academic discipline.[27]

His eﬀorts, and those of others such as numerical analyst George Forsythe, were rewarded: universities went on to

create such programs, starting with Purdue in 1962.[29] Despite its name, a signiﬁcant amount of computer science

does not involve the study of computers themselves. Because of this, several alternative names have been proposed.[30]

Certain departments of major universities prefer the term computing science, to emphasize precisely that diﬀerence.

Danish scientist Peter Naur suggested the term datalogy,[31] to reﬂect the fact that the scientiﬁc discipline revolves

around data and data treatment, while not necessarily involving computers. The ﬁrst scientiﬁc institution to use the

term was the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur being

the ﬁrst professor in datalogy. The term is used mainly in the Scandinavian countries. Also, in the early days of

computing, a number of terms for the practitioners of the ﬁeld of computing were suggested in the Communications

of the ACM – turingineer, turologist, ﬂow-charts-man, applied meta-mathematician, and applied epistemologist.[32]

Three months later in the same journal, comptologist was suggested, followed next year by hypologist.[33] The term

computics has also been suggested.[34] In Europe, terms derived from contracted translations of the expression “automatic information” (e.g. “informazione automatica” in Italian) or “information and mathematics” are often used,

e.g. informatique (French), Informatik (German), informatica (Italy, The Netherlands), informática (Spain, Portugal), informatika (Slavic languages and Hungarian) or pliroforiki (πληροφορική, which means informatics) in Greek.

Similar words have also been adopted in the UK (as in the School of Informatics of the University of Edinburgh).[35]

A folkloric quotation, often attributed to—but almost certainly not ﬁrst formulated by—Edsger Dijkstra, states that

“computer science is no more about computers than astronomy is about telescopes.”[note 3] The design and deployment of computers and computer systems is generally considered the province of disciplines other than computer

science. For example, the study of computer hardware is usually considered part of computer engineering, while the

study of commercial computer systems and their deployment is often called information technology or information

systems. However, there has been much cross-fertilization of ideas between the various computer-related disciplines.

Computer science research also often intersects other disciplines, such as philosophy, cognitive science, linguistics,

mathematics, physics, biology, statistics, and logic.

Computer science is considered by some to have a much closer relationship with mathematics than many scientiﬁc

disciplines, with some observers saying that computing is a mathematical science.[11] Early computer science was

strongly inﬂuenced by the work of mathematicians such as Kurt Gödel and Alan Turing, and there continues to be

a useful interchange of ideas between the two ﬁelds in areas such as mathematical logic, category theory, domain

theory, and algebra.[15]

The relationship between computer science and software engineering is a contentious issue, which is further muddied by disputes over what the term “software engineering” means, and how computer science is deﬁned.[36] David

Parnas, taking a cue from the relationship between other engineering and science disciplines, has claimed that the

principal focus of computer science is studying the properties of computation in general, while the principal focus of

software engineering is the design of speciﬁc computations to achieve practical goals, making the two separate but

complementary disciplines.[37]

The academic, political, and funding aspects of computer science tend to depend on whether a department formed

with a mathematical emphasis or with an engineering emphasis. Computer science departments with a mathematics

emphasis and with a numerical orientation consider alignment with computational science. Both types of departments

tend to make eﬀorts to bridge the ﬁeld educationally if not across all research.

1.3 Areas of computer science

As a discipline, computer science spans a range of topics from theoretical studies of algorithms and the limits of

computation to the practical issues of implementing computing systems in hardware and software.[38][39] CSAB,

formerly called Computing Sciences Accreditation Board – which is made up of representatives of the Association for

Computing Machinery (ACM), and the IEEE Computer Society (IEEE-CS)[40] – identiﬁes four areas that it considers

crucial to the discipline of computer science: theory of computation, algorithms and data structures, programming

methodology and languages, and computer elements and architecture. In addition to these four areas, CSAB also

identiﬁes ﬁelds such as software engineering, artiﬁcial intelligence, computer networking and telecommunications,

database systems, parallel computation, distributed computation, computer-human interaction, computer graphics,

operating systems, and numerical and symbolic computation as being important areas of computer science.[38]

1.3. AREAS OF COMPUTER SCIENCE

1.3.1

7

Theoretical computer science

Main article: Theoretical computer science

The broader ﬁeld of theoretical computer science encompasses both the classical theory of computation and a wide

range of other topics that focus on the more abstract, logical, and mathematical aspects of computing.

Theory of computation

Main article: Theory of computation

According to Peter J. Denning, the fundamental question underlying computer science is, “What can be (eﬃciently)

automated?" [11] The study of the theory of computation is focused on answering fundamental questions about what

can be computed and what amount of resources are required to perform those computations. In an eﬀort to answer

the ﬁrst question, computability theory examines which computational problems are solvable on various theoretical

models of computation. The second question is addressed by computational complexity theory, which studies the

time and space costs associated with diﬀerent approaches to solving a multitude of computational problems.

The famous "P=NP?" problem, one of the Millennium Prize Problems,[41] is an open problem in the theory of

computation.

Information and coding theory

Main articles: Information theory and Coding theory

Information theory is related to the quantiﬁcation of information. This was developed by Claude E. Shannon to ﬁnd

fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data.[42] Coding theory is the study of the properties of codes (systems for converting information from one

form to another) and their ﬁtness for a speciﬁc application. Codes are used for data compression, cryptography, error

detection and correction, and more recently also for network coding. Codes are studied for the purpose of designing

eﬃcient and reliable data transmission methods.

Algorithms and data structures

Algorithms and data structures is the study of commonly used computational methods and their computational eﬃciency.

Programming language theory

Main article: Programming language theory

Programming language theory is a branch of computer science that deals with the design, implementation, analysis,

characterization, and classiﬁcation of programming languages and their individual features. It falls within the discipline of computer science, both depending on and aﬀecting mathematics, software engineering and linguistics. It is

an active research area, with numerous dedicated academic journals.

Formal methods

Main article: Formal methods

Formal methods are a particular kind of mathematically based technique for the speciﬁcation, development and

veriﬁcation of software and hardware systems. The use of formal methods for software and hardware design is

motivated by the expectation that, as in other engineering disciplines, performing appropriate mathematical analysis

can contribute to the reliability and robustness of a design. They form an important theoretical underpinning for

8

CHAPTER 1. COMPUTER SCIENCE

software engineering, especially where safety or security is involved. Formal methods are a useful adjunct to software

testing since they help avoid errors and can also give a framework for testing. For industrial use, tool support is

required. However, the high cost of using formal methods means that they are usually only used in the development

of high-integrity and life-critical systems, where safety or security is of utmost importance. Formal methods are best

described as the application of a fairly broad variety of theoretical computer science fundamentals, in particular logic

calculi, formal languages, automata theory, and program semantics, but also type systems and algebraic data types to

problems in software and hardware speciﬁcation and veriﬁcation.

1.3.2

Applied computer science

Applied computer science aims at identifying certain computer science concepts that can be used directly in solving

real world problems.

Artiﬁcial intelligence

Main article: Artiﬁcial intelligence

This branch of computer science aims to or is required to synthesise goal-orientated processes such as problemsolving, decision-making, environmental adaptation, learning and communication found in humans and animals.

From its origins in cybernetics and in the Dartmouth Conference (1956), artiﬁcial intelligence (AI) research has been

necessarily cross-disciplinary, drawing on areas of expertise such as applied mathematics, symbolic logic, semiotics,

electrical engineering, philosophy of mind, neurophysiology, and social intelligence. AI is associated in the popular

mind with robotic development, but the main ﬁeld of practical application has been as an embedded component in

areas of software development, which require computational understanding and modeling such as ﬁnance and economics, data mining and the physical sciences. The starting-point in the late 1940s was Alan Turing's question “Can

computers think?", and the question remains eﬀectively unanswered although the "Turing test" is still used to assess

computer output on the scale of human intelligence. But the automation of evaluative and predictive tasks has been

increasingly successful as a substitute for human monitoring and intervention in domains of computer application

involving complex real-world data.

Computer architecture and engineering

Main articles: Computer architecture and Computer engineering

Computer architecture, or digital computer organization, is the conceptual design and fundamental operational structure of a computer system. It focuses largely on the way by which the central processing unit performs internally and

accesses addresses in memory.[43] The ﬁeld often involves disciplines of computer engineering and electrical engineering, selecting and interconnecting hardware components to create computers that meet functional, performance,

and cost goals.

Computer performance analysis

Main article: Computer performance

Computer performance analysis is the study of work ﬂowing through computers with the general goals of improving

throughput, controlling response time, using resources eﬃciently, eliminating bottlenecks, and predicting performance under anticipated peak loads.[44]

Computer graphics and visualization

Main article: Computer graphics (computer science)

1.3. AREAS OF COMPUTER SCIENCE

9

Computer graphics is the study of digital visual contents, and involves synthese and manipulations of image data.

The study is connected to many other ﬁelds in computer science, including computer vision, image processing, and

computational geometry, and is heavily applied in the ﬁelds of special eﬀects and video games.

Computer security and cryptography

Main articles: Computer security and Cryptography

Computer security is a branch of computer technology, whose objective includes protection of information from

unauthorized access, disruption, or modiﬁcation while maintaining the accessibility and usability of the system for

its intended users. Cryptography is the practice and study of hiding (encryption) and therefore deciphering (decryption) information. Modern cryptography is largely related to computer science, for many encryption and decryption

algorithms are based on their computational complexity.

Computational science

Computational science (or scientiﬁc computing) is the ﬁeld of study concerned with constructing mathematical models

and quantitative analysis techniques and using computers to analyze and solve scientiﬁc problems. In practical use, it

is typically the application of computer simulation and other forms of computation to problems in various scientiﬁc

disciplines.

Computer networks

Main article: Computer network

This branch of computer science aims to manage networks between computers worldwide.

Concurrent, parallel and distributed systems

Main articles: Concurrency (computer science) and Distributed computing

Concurrency is a property of systems in which several computations are executing simultaneously, and potentially

interacting with each other. A number of mathematical models have been developed for general concurrent computation including Petri nets, process calculi and the Parallel Random Access Machine model. A distributed system

extends the idea of concurrency onto multiple computers connected through a network. Computers within the same

distributed system have their own private memory, and information is often exchanged among themselves to achieve

a common goal.

Databases

Main article: Database

A database is intended to organize, store, and retrieve large amounts of data easily. Digital databases are managed

using database management systems to store, create, maintain, and search data, through database models and query

languages.

Health informatics

Main article: Health informatics

Health Informatics in computer science deals with computational techniques for solving problems in health care.

10

CHAPTER 1. COMPUTER SCIENCE

Information science

Main article: Information science

Software engineering

Main article: Software engineering

Software engineering is the study of designing, implementing, and modifying software in order to ensure it is of

high quality, aﬀordable, maintainable, and fast to build. It is a systematic approach to software design, involving the

application of engineering practices to software. Software engineering deals with the organizing and analyzing of

software— it doesn't just deal with the creation or manufacture of new software, but its internal maintenance and

arrangement. Both computer applications software engineers and computer systems software engineers are projected

to be among the fastest growing occupations from 2008 and 2018.

See also: Computer programming

1.4 The great insights of computer science

The philosopher of computing Bill Rapaport noted three Great Insights of Computer Science[45]

• Leibniz's, Boole's, Alan Turing's, Shannon's, & Morse's insight: There are only two objects that a computer

has to deal with in order to represent “anything”

All the information about any computable problem can be represented using only 0 and 1 (or any other

bistable pair that can ﬂip-ﬂop between two easily distinguishable states, such as “on"/"oﬀ”, “magnetized/demagnetized”, “high-voltage/low-voltage”, etc.).

See also: Digital physics

• Alan Turing's insight: There are only ﬁve actions that a computer has to perform in order to do “anything”

Every algorithm can be expressed in a language for a computer consisting of only ﬁve basic instructions:

* move left one location

* move right one location

* read symbol at current location

* print 0 at current location

* print 1 at current location

See also: Turing machine

• Böhm and Jacopini's insight: There are only three ways of combining these actions (into more complex ones)

that are needed in order for a computer to do “anything”

Only three rules are needed to combine any set of basic instructions into more complex ones:

sequence:

ﬁrst do this; then do that

selection :

1.5. ACADEMIA

11

IF such-and-such is the case,

THEN do this

ELSE do that

repetition:

WHILE such-and-such is the case DO this

Note that the three rules of Boehm’s and Jacopini’s insight can be further simpliﬁed with the use of goto (which

means it is more elementary than structured programming).

See also: Elementary function arithmetic § Friedman’s grand conjecture

1.5 Academia

1.5.1

Conferences

Further information: List of computer science conferences

Conferences are strategic events of the Academic Research in computer science. During those conferences, researchers from the public and private sectors present their recent work and meet. Proceedings of these conferences

are an important part of the computer science literature.

1.5.2

Journals

Further information: Category:Computer science journals

1.6 Education

Academic curricula in computer science include the following areas of study:

1. Structured and Object-oriented programming[46]

2.

3. Data structures[47]

4.

5. Analysis of Algorithms[48]

6.

7. Formal languages[49] and compiler construction[50]

8.

9. Computer Graphics Algorithms[51]

10.

11. Numerical Methods,[52] Optimization and Statistics[53]

12.

13. Artiﬁcial Intelligence[54] and Machine Learning[55]

12

CHAPTER 1. COMPUTER SCIENCE

14.

Some universities teach computer science as a theoretical study of computation and algorithmic reasoning. These

programs often feature the theory of computation, analysis of algorithms, formal methods, concurrency theory,

databases, computer graphics, and systems analysis, among others. They typically also teach computer programming, but treat it as a vessel for the support of other ﬁelds of computer science rather than a central focus of high-level

study. The ACM/IEEE-CS Joint Curriculum Task Force “Computing Curriculum 2005” (and 2008 update)[56] gives

a guideline for university curriculum.

Other colleges and universities, as well as secondary schools and vocational programs that teach computer science,

emphasize the practice of advanced programming rather than the theory of algorithms and computation in their

computer science curricula. Such curricula tend to focus on those skills that are important to workers entering the

software industry. The process aspects of computer programming are often referred to as software engineering.

While computer science professions increasingly drive the U.S. economy, computer science education is absent in

most American K-12 curricula. A report entitled “Running on Empty: The Failure to Teach K-12 Computer Science

in the Digital Age” was released in October 2010 by Association for Computing Machinery (ACM) and Computer

Science Teachers Association (CSTA), and revealed that only 14 states have adopted signiﬁcant education standards

for high school computer science. The report also found that only nine states count high school computer science

courses as a core academic subject in their graduation requirements. In tandem with “Running on Empty”, a new

non-partisan advocacy coalition - Computing in the Core (CinC) - was founded to inﬂuence federal and state policy,

such as the Computer Science Education Act, which calls for grants to states to develop plans for improving computer

science education and supporting computer science teachers.

Within the United States a gender gap in computer science education has been observed as well. Research conducted

by the WGBH Educational Foundation and the Association for Computing Machinery (ACM) revealed that more

than twice as many high school boys considered computer science to be a “very good” or “good” college major

than high school girls.[57] In addition, the high school Advanced Placement (AP) exam for computer science has

displayed a disparity in gender. Compared to other AP subjects it has the lowest number of female participants,

with a composition of about 15 percent women.[58] This gender gap in computer science is further witnessed at the

college level, where 31 percent of undergraduate computer science degrees are earned by women and only 8 percent

of computer science faculty consists of women.[59] According to an article published by the Epistemic Games Group

in August 2012, the number of women graduates in the computer science ﬁeld has declined to 13 percent.[60]

A 2014 Mother Jones article, “We Can Code It”, advocates for adding computer literacy and coding to the K-12

curriculum in the United States, and notes that computer science is not incorporated into the requirements for the

Common Core State Standards Initiative.[61] In fact, there has been a trend in the direction of removing advanced

placement tests and classes in American schools.[62][63]

1.7 See also

Main article: Outline of computer science

• Academic genealogy of computer scientists

• Informatics (academic ﬁeld)

• List of academic computer science departments

• List of computer science conferences

• List of computer scientists

• List of publications in computer science

• List of pioneers in computer science

• Technology transfer in computer science

• List of software engineering topics

1.8. NOTES

13

• List of unsolved problems in computer science

• Turing Award

• Women in computing

Computer science – Wikipedia book

1.8 Notes

[1] In 1851

[2] “The introduction of punched cards into the new engine was important not only as a more convenient form of control than

the drums, or because programs could now be of unlimited extent, and could be stored and repeated without the danger of

introducing errors in setting the machine by hand; it was important also because it served to crystallize Babbage’s feeling

that he had invented something really new, something much more than a sophisticated calculating machine.” Bruce Collier,

1970

[3] See the entry "Computer science" on Wikiquote for the history of this quotation.

1.9 References

[1] “What is Computer Science?" (PDF). Boston University Department of Computer Science. Spring 2003. Retrieved December 12, 2014.

[2] “WordNet Search - 3.1”. Wordnetweb.princeton.edu. Retrieved 2012-05-14.

[3] “Blaise Pascal”. School of Mathematics and Statistics University of St Andrews, Scotland.

[4] “A Brief History of Computing”.

[5] “Science Museum - Introduction to Babbage”. Archived from the original on 2006-09-08. Retrieved 2006-09-24.

[6] Anthony Hyman (1982). Charles Babbage, pioneer of the computer.

[7] “A Selection and Adaptation From Ada’s Notes found in Ada, The Enchantress of Numbers,” by Betty Alexandra Toole

Ed.D. Strawberry Press, Mill Valley, CA”. Retrieved 2006-05-04.

[8] “In this sense Aiken needed IBM, whose technology included the use of punched cards, the accumulation of numerical

data, and the transfer of numerical data from one register to another”, Bernard Cohen, p.44 (2000)

[9] Brian Randell, p. 187, 1975

[10] The Association for Computing Machinery (ACM) was founded in 1947.

[11] Denning, P.J. (2000). “Computer Science: The Discipline” (PDF). Encyclopedia of Computer Science. Archived from the

original (PDF) on 2006-05-25.

[12] “Some EDSAC statistics”. Cl.cam.ac.uk. Retrieved 2011-11-19.

[13] “Computer science pioneer Samuel D. Conte dies at 85”. Purdue Computer Science. July 1, 2002. Retrieved December

12, 2014.

[14] Levy, Steven (1984). Hackers: Heroes of the Computer Revolution. Doubleday. ISBN 0-385-19195-2.

[15] Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. Taylor and Francis / CRC Press.

[16] “IBM 704 Electronic Data Processing System - CHM Revolution”. Computerhistory.org. Retrieved 2013-07-07.

[17] “IBM 709: a powerful new data processing system” (PDF). Computer History Museum. Retrieved December 12, 2014.

[18] David Kahn, The Codebreakers, 1967, ISBN 0-684-83130-9.

[19] http://www.cis.cornell.edu/Dean/Presentations/Slides/bgu.pdf[]

[20] Constable, R. L. (March 2000). “Computer Science: Achievements and Challenges circa 2000” (PDF).

14

CHAPTER 1. COMPUTER SCIENCE

[21] Abelson, H.; G.J. Sussman with J. Sussman (1996). Structure and Interpretation of Computer Programs (2nd ed.). MIT

Press. ISBN 0-262-01153-0. The computer revolution is a revolution in the way we think and in the way we express what

we think. The essence of this change is the emergence of what might best be called procedural epistemology — the study

of the structure of knowledge from an imperative point of view, as opposed to the more declarative point of view taken by

classical mathematical subjects.

[22] “Black box traders are on the march”. The Telegraph. August 26, 2006.

[23] “The Impact of High Frequency Trading on an Electronic Market”. Papers.ssrn.com. doi:10.2139/ssrn.1686004. Retrieved

2012-05-14.

[24] Wegner, P. (October 13–15, 1976). Proceedings of the 2nd international Conference on Software Engineering. San Francisco, California, United States: IEEE Computer Society Press, Los Alamitos, CA.

[25] Denning, P. J.; Comer, D. E.; Gries, D.; Mulder, M. C.; Tucker, A.; Turner, A. J.; Young, P. R. (Jan 1989). “Computing

as a discipline”. Communications of the ACM 32: 9–23. doi:10.1145/63238.63239.

[26] Eden, A. H. (2007). “Three Paradigms of Computer Science” (PDF). Minds and Machines 17 (2): 135–167. doi:10.1007/s11023007-9060-8.

[27] Louis Fine (1959). “The Role of the University in Computers, Data Processing, and Related Fields”. Communications of

the ACM 2 (9): 7–14. doi:10.1145/368424.368427.

[28] “Stanford University Oral History”. Stanford University. Retrieved May 30, 2013.

[29] Donald Knuth (1972). “George Forsythe and the Development of Computer Science”. Comms. ACM.

[30] Matti Tedre (2006). “The Development of Computer Science: A Sociocultural Perspective” (PDF). p. 260. Retrieved

December 12, 2014.

[31] Peter Naur (1966). “The science of datalogy”. Communications of the ACM 9 (7): 485. doi:10.1145/365719.366510.

[32] “Communications of the ACM”. Communications of the ACM 1 (4): 6.

[33] Communications of the ACM 2(1):p.4

[34] IEEE Computer 28(12):p.136

[35] P. Mounier-Kuhn, L'Informatique en France, de la seconde guerre mondiale au Plan Calcul. L'émergence d'une science,

Paris, PUPS, 2010, ch. 3 & 4.

[36] Tedre, M. (2011). “Computing as a Science: A Survey of Competing Viewpoints”. Minds and Machines 21 (3): 361–387.

doi:10.1007/s11023-011-9240-4.

[37] Parnas, D. L. (1998). “Software engineering programmes are not computer science programmes”. Annals of Software

Engineering 6: 19–37. doi:10.1023/A:1018949113292., p. 19: “Rather than treat software engineering as a subﬁeld of

computer science, I treat it as an element of the set, Civil Engineering, Mechanical Engineering, Chemical Engineering,

Electrical Engineering, [...]"

[38] Computing Sciences Accreditation Board (May 28, 1997). “Computer Science as a Profession”. Archived from the original

on 2008-06-17. Retrieved 2010-05-23.

[39] Committee on the Fundamentals of Computer Science: Challenges and Opportunities, National Research Council (2004).

Computer Science: Reﬂections on the Field, Reﬂections from the Field. National Academies Press. ISBN 978-0-309-093019.

[40] “CSAB Leading Computer Education”. CSAB. 2011-08-03. Retrieved 2011-11-19.

[41] Clay Mathematics Institute P=NP

[42] P. Collins, Graham (October 14, 2002). “Claude E. Shannon: Founder of Information Theory”. Scientiﬁc American.

Retrieved December 12, 2014.

[43] A. Thisted, Ronald (April 7, 1997). “Computer Architecture” (PDF). The University of Chicago.

[44] Wescott, Bob (2013). The Every Computer Performance Book, Chapter 3: Useful laws. CreateSpace. ISBN 1482657759.

[45] “What Is Computation?". buﬀalo.edu.

[46] Booch, Grady (1997). Object-Oriented Analysis and Design with Applications. Addison-Wesley.

1.10. FURTHER READING

15

[47] Peter Brass. (2008) Advanced Data Structures, Cambridge University Press

[48] Cormen, Thomas H.; Leiserson, Charles E.; Rivest, Ronald L. & Stein, Cliﬀord. (2001) Introduction to Algorithms, MIT

Press and McGraw-Hill.

[49] Hopcroft, John E. and Jeﬀrey D. Ullman, (1979) Introduction to Automata Theory, Languages, and Computation

[50] Aho, Alfred V., Sethi, Ravi, and Ullman, Jeﬀrey D. (1988). Compilers — Principles, Techniques, and Tools. AddisonWesley.

[51] Shirley, Peter. (2009) Fundamentals of Computer Graphics - 3rd edition

[52] Press, William H., Saul A. Teukolsky, William T. Vetterling, Brian P. Flannery. (2007) Numerical Recipes 3rd Edition:

The Art of Scientiﬁc Computing

[53] Baron, Michael. (2006) Probability and Statistics for Computer Scientists

[54] Russell, Stuart. (2009) Artiﬁcial Intelligence: A Modern Approach (3rd Edition)

[55] Mitchell, Tom. (1997) Machine Learning.

[56] “ACM Curricula Recommendations”. Retrieved 2012-11-18.

[57] “New Image for Computing Report on Market Research” (PDF). WGBH Educational Foundation and the Association for

Computing Machinery (ACM). April 2009. Retrieved December 12, 2014.

[58] Gilbert, Alorie. “Newsmaker: Computer science’s gender gap”. CNET News.

[59] Dovzan, Nicole. “Examining the Gender Gap in Technology”. University of Michigan.

[60] “Encouraging the next generation of women in computing”. Microsoft Research Connections Team. Retrieved September

3, 2013.

[61] Raja, Tasneem (August 2014). “Is Coding the New Literacy?". Mother Jones. Retrieved 2014-06-21.

[62] http://arstechnica.com/business/2014/12/to-address-techs-diversity-woes-start-with-the-vanishing-comp-sci-classroom/

Casey Johnston. Ars Technica. Dec 4 2014.

[63] http://apcentral.collegeboard.com/apc/members/exam/exam_information/1999.html

“Computer Software Engineer”. U.S. Bureau of Labor Statistics. U.S. Bureau of Labor Statistics, n.d. Web. February

5, 2013.

1.10 Further reading

Overview

• Tucker, Allen B. (2004). Computer Science Handbook (2nd ed.). Chapman and Hall/CRC. ISBN 1-58488360-X.

• “Within more than 70 chapters, every one new or signiﬁcantly revised, one can ﬁnd any kind of information and references about computer science one can imagine. [...] all in all, there is absolute nothing

about Computer Science that can not be found in the 2.5 kilogram-encyclopaedia with its 110 survey

articles [...].” (Christoph Meinel, Zentralblatt MATH)

• van Leeuwen, Jan (1994). Handbook of Theoretical Computer Science. The MIT Press. ISBN 0-262-72020-5.

• "[...] this set is the most unique and possibly the most useful to the [theoretical computer science] community, in support both of teaching and research [...]. The books can be used by anyone wanting simply

to gain an understanding of one of these areas, or by someone desiring to be in research in a topic, or by

instructors wishing to ﬁnd timely information on a subject they are teaching outside their major areas of

expertise.” (Rocky Ross, SIGACT News)

• Ralston, Anthony; Reilly, Edwin D.; Hemmendinger, David (2000). Encyclopedia of Computer Science (4th

ed.). Grove’s Dictionaries. ISBN 1-56159-248-X.

16

CHAPTER 1. COMPUTER SCIENCE

• “Since 1976, this has been the deﬁnitive reference work on computer, computing, and computer science. [...] Alphabetically arranged and classiﬁed into broad subject areas, the entries cover hardware,

computer systems, information and data, software, the mathematics of computing, theory of computation, methodologies, applications, and computing milieu. The editors have done a commendable job of

blending historical perspective and practical reference information. The encyclopedia remains essential

for most public and academic library reference collections.” (Joe Accardin, Northeastern Illinois Univ.,

Chicago)

• Edwin D. Reilly (2003). Milestones in Computer Science and Information Technology. Greenwood Publishing

Group. ISBN 978-1-57356-521-9.

Selected papers

• Knuth, Donald E. (1996). Selected Papers on Computer Science. CSLI Publications, Cambridge University

Press.

• Collier, Bruce. The little engine that could've: The calculating machines of Charles Babbage. Garland Publishing Inc. ISBN 0-8240-0043-9.

• Cohen, Bernard (2000). Howard Aiken, Portrait of a computer pioneer. The MIT press. ISBN 978-0-26253179-5.

• Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. CRC Press, Taylor & Francis.

• Randell, Brian (1973). The origins of Digital computers, Selected Papers. Springer-Verlag. ISBN 3-540-06169X.

• “Covering a period from 1966 to 1993, its interest lies not only in the content of each of these papers

— still timely today — but also in their being put together so that ideas expressed at diﬀerent times

complement each other nicely.” (N. Bernard, Zentralblatt MATH)

Articles

• Peter J. Denning. Is computer science science?, Communications of the ACM, April 2005.

• Peter J. Denning, Great principles in computing curricula, Technical Symposium on Computer Science Education, 2004.

• Research evaluation for computer science, Informatics Europe report. Shorter journal version: Bertrand

Meyer, Christine Choppy, Jan van Leeuwen and Jorgen Staunstrup, Research evaluation for computer science,

in Communications of the ACM, vol. 52, no. 4, pp. 31–34, April 2009.

Curriculum and classiﬁcation

• Association for Computing Machinery. 1998 ACM Computing Classiﬁcation System. 1998.

• Joint Task Force of Association for Computing Machinery (ACM), Association for Information Systems (AIS)

and IEEE Computer Society (IEEE-CS). Computing Curricula 2005: The Overview Report. September 30,

2005.

• Norman Gibbs, Allen Tucker. “A model curriculum for a liberal arts degree in computer science”. Communications of the ACM, Volume 29 Issue 3, March 1986.

1.11 External links

• Computer science at DMOZ

• Scholarly Societies in Computer Science

• Best Papers Awards in Computer Science since 1996

1.11. EXTERNAL LINKS

17

• Photographs of computer scientists by Bertrand Meyer

• EECS.berkeley.edu

Bibliography and academic search engines

• CiteSeerx (article): search engine, digital library and repository for scientiﬁc and academic papers with a focus

on computer and information science.

• DBLP Computer Science Bibliography (article): computer science bibliography website hosted at Universität

Trier, in Germany.

• The Collection of Computer Science Bibliographies (article)

Professional organizations

• Association for Computing Machinery

• IEEE Computer Society

• Informatics Europe

• AAAI

• AAAS Computer Science

Misc

• Computer Science - Stack Exchange: a community-run question-and-answer site for computer science

• What is computer science

• Is computer science science?

Chapter 2

Discrete mathematics

For the mathematics journal, see Discrete Mathematics (journal).

Discrete mathematics is the study of mathematical structures that are fundamentally discrete rather than continuous.

6

5

4

1

2

3

Graphs like this are among the objects studied by discrete mathematics, for their interesting mathematical properties, their usefulness

as models of real-world problems, and their importance in developing computer algorithms.

In contrast to real numbers that have the property of varying “smoothly”, the objects studied in discrete mathematics

– such as integers, graphs, and statements in logic[1] – do not vary smoothly in this way, but have distinct, separated

values.[2] Discrete mathematics therefore excludes topics in “continuous mathematics” such as calculus and analysis.

Discrete objects can often be enumerated by integers. More formally, discrete mathematics has been characterized

as the branch of mathematics dealing with countable sets[3] (sets that have the same cardinality as subsets of the

natural numbers, including rational numbers but not real numbers). However, there is no exact deﬁnition of the term

“discrete mathematics.”[4] Indeed, discrete mathematics is described less by what is included than by what is excluded:

continuously varying quantities and related notions.

The set of objects studied in discrete mathematics can be ﬁnite or inﬁnite. The term ﬁnite mathematics is sometimes

applied to parts of the ﬁeld of discrete mathematics that deals with ﬁnite sets, particularly those areas relevant to

business.

18

2.1. GRAND CHALLENGES, PAST AND PRESENT

19

Research in discrete mathematics increased in the latter half of the twentieth century partly due to the development

of digital computers which operate in discrete steps and store data in discrete bits. Concepts and notations from

discrete mathematics are useful in studying and describing objects and problems in branches of computer science,

such as computer algorithms, programming languages, cryptography, automated theorem proving, and software development. Conversely, computer implementations are signiﬁcant in applying ideas from discrete mathematics to

real-world problems, such as in operations research.

Although the main objects of study in discrete mathematics are discrete objects, analytic methods from continuous

mathematics are often employed as well.

In the university curricula, “Discrete Mathematics” appeared in the 1980s, initially as a computer science support

course; its contents was somewhat haphazard at the time. The curriculum has thereafter developed in conjunction to

eﬀorts by ACM and MAA into a course that is basically intended to develop mathematical maturity in freshmen; as

such it is nowadays a prerequisite for mathematics majors in some universities as well.[5][6] Some high-school-level

discrete mathematics textbooks have appeared as well.[7] At this level, discrete mathematics it is sometimes seen a

preparatory course, not unlike precalculus in this respect.[8]

The Fulkerson Prize is awarded for outstanding papers in discrete mathematics.

2.1 Grand challenges, past and present

The history of discrete mathematics has involved a number of challenging problems which have focused attention

within areas of the ﬁeld. In graph theory, much research was motivated by attempts to prove the four color theorem,

ﬁrst stated in 1852, but not proved until 1976 (by Kenneth Appel and Wolfgang Haken, using substantial computer

assistance).[9]

In logic, the second problem on David Hilbert's list of open problems presented in 1900 was to prove that the axioms

of arithmetic are consistent. Gödel’s second incompleteness theorem, proved in 1931, showed that this was not

possible – at least not within arithmetic itself. Hilbert’s tenth problem was to determine whether a given polynomial

Diophantine equation with integer coeﬃcients has an integer solution. In 1970, Yuri Matiyasevich proved that this

could not be done.

The need to break German codes in World War II led to advances in cryptography and theoretical computer science,

with the ﬁrst programmable digital electronic computer being developed at England’s Bletchley Park with the guidance of Alan Turing and his seminal work, On Computable Numbers.[10] At the same time, military requirements

motivated advances in operations research. The Cold War meant that cryptography remained important, with fundamental advances such as public-key cryptography being developed in the following decades. Operations research

remained important as a tool in business and project management, with the critical path method being developed

in the 1950s. The telecommunication industry has also motivated advances in discrete mathematics, particularly

in graph theory and information theory. Formal veriﬁcation of statements in logic has been necessary for software

development of safety-critical systems, and advances in automated theorem proving have been driven by this need.

Computational geometry has been an important part of the computer graphics incorporated into modern video games

and computer-aided design tools.

Several ﬁelds of discrete mathematics, particularly theoretical computer science, graph theory, and combinatorics,

are important in addressing the challenging bioinformatics problems associated with understanding the tree of life.[11]

Currently, one of the most famous open problems in theoretical computer science is the P = NP problem, which

involves the relationship between the complexity classes P and NP. The Clay Mathematics Institute has oﬀered a $1

million USD prize for the ﬁrst correct proof, along with prizes for six other mathematical problems.[12]

2.2 Topics in discrete mathematics

2.2.1

Theoretical computer science

Main article: Theoretical computer science

Theoretical computer science includes areas of discrete mathematics relevant to computing. It draws heavily on

graph theory and mathematical logic. Included within theoretical computer science is the study of algorithms for

computing mathematical results. Computability studies what can be computed in principle, and has close ties to logic,

20

CHAPTER 2. DISCRETE MATHEMATICS

Much research in graph theory was motivated by attempts to prove that all maps, like this one, could be colored using only four colors

so that no areas of the same color touched. Kenneth Appel and Wolfgang Haken proved this in 1976.[9]

while complexity studies the time taken by computations. Automata theory and formal language theory are closely

related to computability. Petri nets and process algebras are used to model computer systems, and methods from

discrete mathematics are used in analyzing VLSI electronic circuits. Computational geometry applies algorithms

to geometrical problems, while computer image analysis applies them to representations of images. Theoretical

computer science also includes the study of various continuous computational topics.

2.2. TOPICS IN DISCRETE MATHEMATICS

21

Complexity studies the time taken by algorithms, such as this sorting routine.

2.2.2

Information theory

Main article: Information theory

Information theory involves the quantiﬁcation of information. Closely related is coding theory which is used to design

eﬃcient and reliable data transmission and storage methods. Information theory also includes continuous topics such

as: analog signals, analog coding, analog encryption.

2.2.3

Logic

Main article: Mathematical logic

Logic is the study of the principles of valid reasoning and inference, as well as of consistency, soundness, and

completeness. For example, in most systems of logic (but not in intuitionistic logic) Peirce’s law (((P→Q)→P)→P)

is a theorem. For classical logic, it can be easily veriﬁed with a truth table. The study of mathematical proof is particularly important in logic, and has applications to automated theorem proving and formal veriﬁcation of software.

Logical formulas are discrete structures, as are proofs, which form ﬁnite trees[13] or, more generally, directed acyclic

graph structures[14][15] (with each inference step combining one or more premise branches to give a single conclusion).

The truth values of logical formulas usually form a ﬁnite set, generally restricted to two values: true and false, but

logic can also be continuous-valued, e.g., fuzzy logic. Concepts such as inﬁnite proof trees or inﬁnite derivation trees

have also been studied,[16] e.g. inﬁnitary logic.

2.2.4

Set theory

Main article: Set theory

22

CHAPTER 2. DISCRETE MATHEMATICS

101

110

110

110

111

110

110

110

110

0111

1001

1011

1001

0000

0101

0100

1001

0001

Wikipedia

The ASCII codes for the word “Wikipedia”, given here in binary, provide a way of representing the word in information theory, as

well as for information-processing algorithms.

Set theory is the branch of mathematics that studies sets, which are collections of objects, such as {blue, white, red}

or the (inﬁnite) set of all prime numbers. Partially ordered sets and sets with other relations have applications in

several areas.

In discrete mathematics, countable sets (including ﬁnite sets) are the main focus. The beginning of set theory as a

branch of mathematics is usually marked by Georg Cantor's work distinguishing between diﬀerent kinds of inﬁnite

set, motivated by the study of trigonometric series, and further development of the theory of inﬁnite sets is outside

the scope of discrete mathematics. Indeed, contemporary work in descriptive set theory makes extensive use of

traditional continuous mathematics.

2.2. TOPICS IN DISCRETE MATHEMATICS

2.2.5

23

Combinatorics

Main article: Combinatorics

Combinatorics studies the way in which discrete structures can be combined or arranged. Enumerative combinatorics

concentrates on counting the number of certain combinatorial objects - e.g. the twelvefold way provides a uniﬁed

framework for counting permutations, combinations and partitions. Analytic combinatorics concerns the enumeration (i.e., determining the number) of combinatorial structures using tools from complex analysis and probability

theory. In contrast with enumerative combinatorics which uses explicit combinatorial formulae and generating functions to describe the results, analytic combinatorics aims at obtaining asymptotic formulae. Design theory is a study

of combinatorial designs, which are collections of subsets with certain intersection properties. Partition theory studies

various enumeration and asymptotic problems related to integer partitions, and is closely related to q-series, special

functions and orthogonal polynomials. Originally a part of number theory and analysis, partition theory is now considered a part of combinatorics or an independent ﬁeld. Order theory is the study of partially ordered sets, both ﬁnite

and inﬁnite.

2.2.6

Graph theory

Main article: Graph theory

Graph theory, the study of graphs and networks, is often considered part of combinatorics, but has grown large enough

Graph theory has close links to group theory. This truncated tetrahedron graph is related to the alternating group A4 .

and distinct enough, with its own kind of problems, to be regarded as a subject in its own right.[17] Graphs are one of

the prime objects of study in discrete mathematics. They are among the most ubiquitous models of both natural and

human-made structures. They can model many types of relations and process dynamics in physical, biological and

social systems. In computer science, they can represent networks of communication, data organization, computational

devices, the ﬂow of computation, etc. In mathematics, they are useful in geometry and certain parts of topology, e.g.

knot theory. Algebraic graph theory has close links with group theory. There are also continuous graphs, however

24

CHAPTER 2. DISCRETE MATHEMATICS

for the most part research in graph theory falls within the domain of discrete mathematics.

2.2.7

Probability

Main article: Discrete probability theory

Discrete probability theory deals with events that occur in countable sample spaces. For example, count observations

such as the numbers of birds in ﬂocks comprise only natural number values {0, 1, 2, ...}. On the other hand, continuous

observations such as the weights of birds comprise real number values and would typically be modeled by a continuous

probability distribution such as the normal. Discrete probability distributions can be used to approximate continuous

ones and vice versa. For highly constrained situations such as throwing dice or experiments with decks of cards,

calculating the probability of events is basically enumerative combinatorics.

2.2.8

Number theory

The Ulam spiral of numbers, with black pixels showing prime numbers. This diagram hints at patterns in the distribution of prime

numbers.

Main article: Number theory

2.2. TOPICS IN DISCRETE MATHEMATICS

25

Number theory is concerned with the properties of numbers in general, particularly integers. It has applications to

cryptography, cryptanalysis, and cryptology, particularly with regard to modular arithmetic, diophantine equations,

linear and quadratic congruences, prime numbers and primality testing. Other discrete aspects of number theory

include geometry of numbers. In analytic number theory, techniques from continuous mathematics are also used.

Topics that go beyond discrete objects include transcendental numbers, diophantine approximation, p-adic analysis

and function ﬁelds.

2.2.9

Algebra

Main article: Abstract algebra

Algebraic structures occur as both discrete examples and continuous examples. Discrete algebras include: boolean

algebra used in logic gates and programming; relational algebra used in databases; discrete and ﬁnite versions of

groups, rings and ﬁelds are important in algebraic coding theory; discrete semigroups and monoids appear in the

theory of formal languages.

2.2.10

Calculus of ﬁnite diﬀerences, discrete calculus or discrete analysis

Main article: ﬁnite diﬀerence

A function deﬁned on an interval of the integers is usually called a sequence. A sequence could be a ﬁnite sequence

from a data source or an inﬁnite sequence from a discrete dynamical system. Such a discrete function could be deﬁned

explicitly by a list (if its domain is ﬁnite), or by a formula for its general term, or it could be given implicitly by a

recurrence relation or diﬀerence equation. Diﬀerence equations are similar to a diﬀerential equations, but replace

diﬀerentiation by taking the diﬀerence between adjacent terms; they can be used to approximate diﬀerential equations

or (more often) studied in their own right. Many questions and methods concerning diﬀerential equations have

counterparts for diﬀerence equations. For instance where there are integral transforms in harmonic analysis for

studying continuous functions or analog signals, there are discrete transforms for discrete functions or digital signals.

As well as the discrete metric there are more general discrete or ﬁnite metric spaces and ﬁnite topological spaces.

2.2.11

Geometry

Main articles: discrete geometry and computational geometry

Discrete geometry and combinatorial geometry are about combinatorial properties of discrete collections of geometrical objects. A long-standing topic in discrete geometry is tiling of the plane. Computational geometry applies

algorithms to geometrical problems.

2.2.12

Topology

Although topology is the ﬁeld of mathematics that formalizes and generalizes the intuitive notion of “continuous deformation” of objects, it gives rise to many discrete topics; this can be attributed in part to the focus on topological invariants, which themselves usually take discrete values. See combinatorial topology, topological graph theory, topological

combinatorics, computational topology, discrete topological space, ﬁnite topological space, topology (chemistry).

2.2.13

Operations research

Main article: Operations research

Operations research provides techniques for solving practical problems in business and other ﬁelds — problems such

as allocating resources to maximize proﬁt, or scheduling project activities to minimize risk. Operations research

techniques include linear programming and other areas of optimization, queuing theory, scheduling theory, network

26

CHAPTER 2. DISCRETE MATHEMATICS

Computational geometry applies computer algorithms to representations of geometrical objects.

theory. Operations research also includes continuous topics such as continuous-time Markov process, continuoustime martingales, process optimization, and continuous and hybrid control theory.

2.2.14

Game theory, decision theory, utility theory, social choice theory

Decision theory is concerned with identifying the values, uncertainties and other issues relevant in a given decision,

its rationality, and the resulting optimal decision.

Utility theory is about measures of the relative economic satisfaction from, or desirability of, consumption of various

goods and services.

Social choice theory is about voting. A more puzzle-based approach to voting is ballot theory.

Game theory deals with situations where success depends on the choices of others, which makes choosing the best

course of action more complex. There are even continuous games, see diﬀerential game. Topics include auction

theory and fair division.

2.2. TOPICS IN DISCRETE MATHEMATICS

27

PERT charts like this provide a business management technique based on graph theory.

2.2.15

Discretization

Main article: Discretization

Discretization concerns the process of transferring continuous models and equations into discrete counterparts, often

for the purposes of making calculations easier by using approximations. Numerical analysis provides an important

example.

2.2.16

Discrete analogues of continuous mathematics

There are many concepts in continuous mathematics which have discrete versions, such as discrete calculus, discrete

probability distributions, discrete Fourier transforms, discrete geometry, discrete logarithms, discrete diﬀerential

geometry, discrete exterior calculus, discrete Morse theory, diﬀerence equations, discrete dynamical systems, and

discrete vector measures.

In applied mathematics, discrete modelling is the discrete analogue of continuous modelling. In discrete modelling,

discrete formulae are ﬁt to data. A common method in this form of modelling is to use recurrence relation.

In algebraic geometry, the concept of a curve can be extended to discrete geometries by taking the spectra of

polynomial rings over ﬁnite ﬁelds to be models of the aﬃne spaces over that ﬁeld, and letting subvarieties or spectra of

other rings provide the curves that lie in that space. Although the space in which the curves appear has a ﬁnite number

of points, the curves are not so much sets of points as analogues of curves in continuous settings. For example, every

point of the form V (x − c) ⊂ Spec K[x] = A1 for K a ﬁeld can be studied either as Spec K[x]/(x − c) ∼

= Spec K

, a point, or as the spectrum Spec K[x](x−c) of the local ring at (x-c), a point together with a neighborhood around

it. Algebraic varieties also have a well-deﬁned notion of tangent space called the Zariski tangent space, making many

features of calculus applicable even in ﬁnite settings.

2.2.17

Hybrid discrete and continuous mathematics

The time scale calculus is a uniﬁcation of the theory of diﬀerence equations with that of diﬀerential equations, which

has applications to ﬁelds requiring simultaneous modelling of discrete and continuous data. Another way of modeling

such a situation is the notion of hybrid dynamical system.

28

CHAPTER 2. DISCRETE MATHEMATICS

2.3 See also

• Outline of discrete mathematics

• CyberChase, a show that teaches Discrete Mathematics to children

2.4 References

[1] Richard Johnsonbaugh, Discrete Mathematics, Prentice Hall, 2008.

[2] Weisstein, Eric W., “Discrete mathematics”, MathWorld.

[3] Biggs, Norman L. (2002), Discrete mathematics, Oxford Science Publications (2nd ed.), New York: The Clarendon Press

Oxford University Press, p. 89, ISBN 9780198507178, MR 1078626, Discrete Mathematics is the branch of Mathematics

in which we deal with questions involving ﬁnite or countably inﬁnite sets.

[4] Brian Hopkins, Resources for Teaching Discrete Mathematics, Mathematical Association of America, 2008.

[5] Ken Levasseur; Al Doerr. Applied Discrete Structures. p. 8.

[6] Albert Geoﬀrey Howson, ed. (1988). Mathematics as a Service Subject. Cambridge University Press. pp. 77–78. ISBN

978-0-521-35395-3.

[7] Joseph G. Rosenstein. Discrete Mathematics in the Schools. American Mathematical Soc. p. 323. ISBN 978-0-82188578-9.

[8] http://ucsmp.uchicago.edu/secondary/curriculum/precalculus-discrete/

[9] Wilson, Robin (2002). Four Colors Suﬃce. London: Penguin Books. ISBN 978-0-691-11533-7.

[10] Hodges, Andrew. Alan Turing: the enigma. Random House, 1992.

[11] Trevor R. Hodkinson; John A. N. Parnell (2007). Reconstruction the Tree of Life: Taxonomy And Systematics of Large And

Species Rich Taxa. CRC PressINC. p. 97. ISBN 978-0-8493-9579-6.

[12] “Millennium Prize Problems”. 2000-05-24. Retrieved 2008-01-12.

[13] A. S. Troelstra; H. Schwichtenberg (2000-07-27). Basic Proof Theory. Cambridge University Press. p. 186. ISBN

978-0-521-77911-1.

[14] Samuel R. Buss (1998). Handbook of Proof Theory. Elsevier. p. 13. ISBN 978-0-444-89840-1.

[15] Franz Baader; Gerhard Brewka; Thomas Eiter (2001-10-16). KI 2001: Advances in Artiﬁcial Intelligence: Joint German/Austrian Conference on AI, Vienna, Austria, September 19-21, 2001. Proceedings. Springer. p. 325. ISBN 978-3540-42612-7.

[16] Brotherston, J.; Bornat, R.; Calcagno, C. (January 2008). “Cyclic proofs of program termination in separation logic”. ACM

SIGPLAN Notices 43 (1). CiteSeerX: 10.1.1.111.1105.

[17] Graphs on Surfaces, Bojan Mohar and Carsten Thomassen, Johns Hopkins University press, 2001

2.5 Further reading

• Norman L. Biggs (2002-12-19). Discrete Mathematics. Oxford University Press. ISBN 978-0-19-850717-8.

• John Dwyer (2010). An Introduction to Discrete Mathematics for Business & Computing. ISBN 978-1-90793400-1.

• Susanna S. Epp (2010-08-04). Discrete Mathematics With Applications. Thomson Brooks/Cole. ISBN 978-0495-39132-6.

• Ronald Graham, Donald E. Knuth, Oren Patashnik, Concrete Mathematics.

• Ralph P. Grimaldi (2004). Discrete and Combinatorial Mathematics: An Applied Introduction. Addison Wesley.

ISBN 978-0-201-72634-3.

2.6. EXTERNAL LINKS

29

• Donald E. Knuth (2011-03-03). The Art of Computer Programming, Volumes 1-4a Boxed Set. Addison-Wesley

Professional. ISBN 978-0-321-75104-1.

• Jiří Matoušek; Jaroslav Nešetřil (1998). Discrete Mathematics. Oxford University Press. ISBN 978-0-19850208-1.

• Obrenic, Bojana (2003-01-29). Practice Problems in Discrete Mathematics. Prentice Hall. ISBN 978-0-13045803-2.

• Kenneth H. Rosen; John G. Michaels (2000). Hand Book of Discrete and Combinatorial Mathematics. CRC

PressI Llc. ISBN 978-0-8493-0149-0.

• Kenneth H. Rosen (2007). Discrete Mathematics: And Its Applications. McGraw-Hill College. ISBN 978-007-288008-3.

• Andrew Simpson (2002). Discrete Mathematics by Example. McGraw-Hill Incorporated. ISBN 978-0-07709840-7.

• Veerarajan, T.(2007), Discrete mathematics with graph theory and combinatorics, Tata Mcgraw Hill

2.6 External links

• Discrete mathematics at the utk.edu Mathematics Archives, providiing links to syllabi, tutorials, programs, etc.

• Iowa Central: Electrical Technologies Program Discrete mathematics for Electrical engineering.

Chapter 3

Glossary of graph theory

Graph theory is a growing area in mathematical research, and has a large specialized vocabulary. Some authors use

the same word with diﬀerent meanings. Some authors use diﬀerent words to mean the same thing. This page attempts

to describe the majority of current usage.

3.1 Basics

A graph G consists of two types of elements, namely vertices and edges. Every edge has two endpoints in the set of

vertices, and is said to connect or join the two endpoints. An edge can thus be deﬁned as a set of two vertices (or an

ordered pair, in the case of a directed graph - see Section Direction). The two endpoints of an edge are also said to

be adjacent to each other.

Alternative models of graphs exist; e.g., a graph may be thought of as a Boolean binary function over the set of

vertices or as a square (0,1)-matrix.

A vertex is simply drawn as a node or a dot. The vertex set of G is usually denoted by V(G), or V when there is no

danger of confusion. The order of a graph is the number of its vertices, i.e. |V(G)|.

An edge (a set of two elements) is drawn as a line connecting two vertices, called endpoints or end vertices or

endvertices. An edge with endvertices x and y is denoted by xy (without any symbol in between). The edge set of

G is usually denoted by E(G), or E when there is no danger of confusion. An edge xy is called incident to a vertex

when this vertex is one of the endpoints x or y.

The size of a graph is the number of its edges, i.e. |E(G)|.[1]

A loop is an edge whose endpoints are the same vertex. A link has two distinct endvertices. An edge is multiple if

there is another edge with the same endvertices; otherwise it is simple. The multiplicity of an edge is the number

of multiple edges sharing the same end vertices; the multiplicity of a graph, the maximum multiplicity of its edges.

A graph is a simple graph if it has no multiple edges or loops, a multigraph if it has multiple edges, but no loops,

and a multigraph or pseudograph if it contains both multiple edges and loops (the literature is highly inconsistent).

When stated without any qualiﬁcation, a graph is usually assumed to be simple, except in the literature of category

theory, where it refers to a quiver.

Graphs whose edges or vertices have names or labels are known as labeled, those without as unlabeled. Graphs with

labeled vertices only are vertex-labeled, those with labeled edges only are edge-labeled. The diﬀerence between a

labeled and an unlabeled graph is that the latter has no speciﬁc set of vertices or edges; it is regarded as another way

to look upon an isomorphism type of graphs. (Thus, this usage distinguishes between graphs with identiﬁable vertex

or edge sets on the one hand, and isomorphism types or classes of graphs on the other.)

(Graph labeling usually refers to the assignment of labels (usually natural numbers, usually distinct) to the edges and

vertices of a graph, subject to certain rules depending on the situation. This should not be confused with a graph’s

merely having distinct labels or names on the vertices.)

A hyperedge is an edge that is allowed to take on any number of vertices, possibly more than 2. A graph that allows

any hyperedge is called a hypergraph. A simple graph can be considered a special case of the hypergraph, namely

the 2-uniform hypergraph. However, when stated without any qualiﬁcation, an edge is always assumed to consist of

30

3.1. BASICS

31

In this pseudograph the blue edges are loops and the red edges are multiple edges of multiplicity 2 and 3. The multiplicity of the

graph is 3.

at most 2 vertices, and a graph is never confused with a hypergraph.

A non-edge (or anti-edge) is an edge that is not present in the graph. More formally, for two vertices u and v , {u, v}

is a non-edge in a graph G whenever {u, v} is not an edge in G . This means that there is either no edge between the

two vertices or (for directed graphs) at most one of (u, v) and (v, u) from v is an arc in G.

Occasionally the term cotriangle or anti-triangle is used for a set of three vertices none of which are connected.

¯ of a graph G is a graph with the same vertex set as G but with an edge set such that xy is an edge

The complement G

¯ if and only if xy is not an edge in G.

in G

An edgeless graph or empty graph or null graph is a graph with zero or more vertices, but no edges. The empty

graph or null graph may also be the graph with no vertices and no edges. If it is a graph with no edges and any

number n of vertices, it may be called the null graph on n vertices. (There is no consistency at all in the literature.)

A graph is inﬁnite if it has inﬁnitely many vertices or edges or both; otherwise the graph is ﬁnite. An inﬁnite graph

where every vertex has ﬁnite degree is called locally ﬁnite. When stated without any qualiﬁcation, a graph is usually

assumed to be ﬁnite. See also continuous graph.

Two graphs G and H are said to be isomorphic, denoted by G ~ H, if there is a one-to-one correspondence, called

an isomorphism, between the vertices of the graph such that two vertices are adjacent in G if and only if their

32

CHAPTER 3. GLOSSARY OF GRAPH THEORY

6

4

3

5

1

2

A labeled simple graph with vertex set V = {1, 2, 3, 4, 5, 6} and edge set E = {{1,2}, {1,5}, {2,3}, {2,5}, {3,4}, {4,5}, {4,6}}.

corresponding vertices are adjacent in H. Likewise, a graph G is said to be homomorphic to a graph H if there

is a mapping, called a homomorphism, from V(G) to V(H) such that if two vertices are adjacent in G then their

corresponding vertices are adjacent in H.

3.1.1

Subgraphs

A subgraph, H, of a graph, G, is a graph whose vertices are a subset of the vertex set of G, and whose edges are a

subset of the edge set of G. In reverse, a supergraph of a graph G is a graph of which G is a subgraph. A graph, G,

contains a graph, H, if H is a subgraph of, or is isomorphic to G.

A subgraph, H, spans a graph, G, and is a spanning subgraph, or factor of G, if it has the same vertex set as G.

A subgraph, H, of a graph, G, is said to be induced (or full) if, for every pair of vertices x and y of H, xy is an edge

of H if and only if xy is an edge of G. In other words, H is an induced subgraph of G if it has exactly the edges that

appear in G over the same vertex set. If the vertex set of H is the subset S of V(G), then H can be written as G[S]

and is said to be induced by S.

A graph, G, is minimal with some property, P, provided that G has property P and no proper subgraph of G has

property P. In this deﬁnition, the term subgraph is usually understood to mean induced subgraph. The notion of

maximality is deﬁned dually: G is maximal with P provided that P(G) and G has no proper supergraph H such that

P(H).

A graph that does not contain H as an induced subgraph is said to be H-free, and more generally if F is a family of

graphs then the graphs that do not contain any induced subgraph isomorphic to a member of F are called F -free.[2]

For example the triangle-free graphs are the graphs that do not have a triangle graph as an induced subgraph. Many

important classes of graphs can be deﬁned by sets of forbidden subgraphs, the graphs that are not in the class and are

minimal with respect to subgraphs, induced subgraphs, or graph minors.

A universal graph in a class K of graphs is a simple graph in which every element in K can be embedded as a

subgraph.

3.1. BASICS

3.1.2

33

Walks

A walk is a sequence of vertices and edges, where each edge’s endpoints are the preceding and following vertices in

the sequence. A walk is closed if its ﬁrst and last vertices are the same, and open if they are diﬀerent.

The length l of a walk is the number of edges that it uses. For an open walk, l = n–1, where n is the number of

vertices visited (a vertex is counted each time it is visited). For a closed walk, l = n (the start/end vertex is listed

twice, but is not counted twice). In the example labeled simple graph, (1, 2, 5, 1, 2, 3) is an open walk with length 5,

and (4, 5, 2, 1, 5, 4) is a closed walk of length 5.

A trail is a walk in which all the edges are distinct. A closed trail has been called a tour or circuit, but these are not

universal, and the latter is often reserved for a regular subgraph of degree two.

A directed tour. This is not a simple cycle, since the blue vertices are used twice.

Traditionally, a path referred to what is now usually known as an open walk. Nowadays, when stated without any

qualiﬁcation, a path is usually understood to be simple, meaning that no vertices (and thus no edges) are repeated.

(The term chain has also been used to refer to a walk in which all vertices and edges are distinct.) In the example

labeled simple graph, (5, 2, 1) is a path of length 2. The closed equivalent to this type of walk, a walk that starts

and ends at the same vertex but otherwise has no repeated vertices or edges, is called a cycle. Like path, this term

traditionally referred to any closed walk, but now is usually understood to be simple by deﬁnition. In the example

labeled simple graph, (1, 5, 2, 1) is a cycle of length 3. (A cycle, unlike a path, is not allowed to have length 0.) Paths

and cycles of n vertices are often denoted by Pn and Cn, respectively. (Some authors use the length instead of the

number of vertices, however.)

C 1 is a loop, C 2 is a digon (a pair of parallel undirected edges in a multigraph, or a pair of antiparallel edges in a

34

CHAPTER 3. GLOSSARY OF GRAPH THEORY

directed graph), and C 3 is called a triangle.

A cycle that has odd length is an odd cycle; otherwise it is an even cycle. One theorem is that a graph is bipartite if

and only if it contains no odd cycles. (See complete bipartite graph.)

A graph is acyclic if it contains no cycles; unicyclic if it contains exactly one cycle; and pancyclic if it contains cycles

of every possible length (from 3 to the order of the graph).

A wheel graph is a graph with n vertices (n ≥ 4), formed by connecting a single vertex to all vertices of C -₁.

The girth of a graph is the length of a shortest (simple) cycle in the graph; and the circumference, the length of a

longest (simple) cycle. The girth and circumference of an acyclic graph are deﬁned to be inﬁnity ∞.

A path or cycle is Hamiltonian (or spanning) if it uses all vertices exactly once. A graph that contains a Hamiltonian path is traceable; and one that contains a Hamiltonian path for any given pair of (distinct) end vertices is a

Hamiltonian connected graph. A graph that contains a Hamiltonian cycle is a Hamiltonian graph.

A trail or circuit (or cycle) is Eulerian if it uses all edges precisely once. A graph that contains an Eulerian trail is

traversable. A graph that contains an Eulerian circuit is an Eulerian graph.

Two paths are internally disjoint (some people call it independent) if they do not have any vertex in common, except

the ﬁrst and last ones.

A theta graph is the union of three internally disjoint (simple) paths that have the same two distinct end vertices.[3]

A theta0 graph has seven vertices and eight edges that can be drawn as the perimeter and one diameter of a regular

hexagon. (The seventh vertex splits the diameter into two edges.) The smallest, excluding multigraphs, topological

minor of a theta0 graph consists of a square plus one of its diagonals.

3.1.3

Trees

A tree is a connected acyclic simple graph. For directed graphs, each vertex has at most one incoming edge. A vertex

of degree 1 is called a leaf, or pendant vertex. An edge incident to a leaf is a leaf edge, or pendant edge. (Some

people deﬁne a leaf edge as a leaf and then deﬁne a leaf vertex on top of it. These two sets of deﬁnitions are often

used interchangeably.) A non-leaf vertex is an internal vertex. Sometimes, one vertex of the tree is distinguished,

and called the root; in this case, the tree is called rooted. Rooted trees are often treated as directed acyclic graphs

with the edges pointing away from the root.

A subtree of the tree T is a connected subgraph of T.

A forest is an acyclic simple graph. For directed graphs, each vertex has at most one incoming edge. (That is, a tree

with the connectivity requirement removed; a graph containing multiple disconnected trees.)

A subforest of the forest F is a subgraph of F.

A spanning tree is a spanning subgraph that is a tree. Every graph has a spanning forest. But only a connected graph

has a spanning tree.

A special kind of tree called a star is K₁,k. An induced star with 3 edges is a claw.

A caterpillar is a tree in which all non-leaf nodes form a single path.

A k-ary tree is a rooted tree in which every internal vertex has no more than k children. A 1-ary tree is just a path.

A 2-ary tree is also called a binary tree.

3.1.4

Cliques

The complete graph Kn of order n is a simple graph with n vertices in which every vertex is adjacent to every other.

The pentagon-shaped graph to the right is complete. The complete graph on n vertices is often denoted by Kn. It has

n(n−1)/2 edges (corresponding to all possible choices of pairs of vertices).

A clique in a graph is a set of pairwise adjacent vertices. Since any subgraph induced by a clique is a complete

subgraph, the two terms and their notations are usually used interchangeably. A k-clique is a clique of order k. In the

example labeled simple graph above, vertices 1, 2 and 5 form a 3-clique, or a triangle. A maximal clique is a clique

that is not a subset of any other clique (some authors reserve the term clique for maximal cliques).

The clique number ω(G) of a graph G is the order of a largest clique in G.

3.1. BASICS

35

2

1

3

4

5

6

A labeled tree with 6 vertices and 5 edges. Nodes 1, 2, 3, and 6 are leaves, while 4 and 5 are internal vertices.

3.1.5

Strongly connected component

A related but weaker concept is that of a strongly connected component. Informally, a strongly connected component

of a directed graph is a subgraph where all nodes in the subgraph are reachable by all other nodes in the subgraph.

Reachability between nodes is established by the existence of a path between the nodes.

A directed graph can be decomposed into strongly connected components by running the depth-ﬁrst search (DFS)

algorithm twice: ﬁrst, on the graph itself and next on the transpose graph in decreasing order of the ﬁnishing times

of the ﬁrst DFS. Given a directed graph G, the transpose GT is the graph G with all the edge directions reversed.

36

CHAPTER 3. GLOSSARY OF GRAPH THEORY

K5 , a complete graph. If a subgraph looks like this, the vertices in that subgraph form a clique of size 5.

3.1.6

Hypercubes

A hypercube graph Qn is a regular graph with 2n vertices, 2n−1 n edges, and n edges touching each vertex. It can be

obtained as the one-dimensional skeleton of the geometric hypercube.

3.1.7

Knots

A knot in a directed graph is a collection of vertices and edges with the property that every vertex in the knot has

outgoing edges, and all outgoing edges from vertices in the knot terminate at other vertices in the knot. Thus it is

impossible to leave the knot while following the directions of the edges.

3.1.8

Minors

A minor G2 = (V2 , E2 ) of G1 = (V1 , E1 ) is an injection from V2 to V1 such that every edge in E2 corresponds to

a path (disjoint from all other such paths) in G1 such that every vertex in V1 is in one or more paths, or is part of

the injection from V2 to V1 . This can alternatively be phrased in terms of contractions, which are operations which

collapse a path and all vertices on it into a single edge (see Minor (graph theory)).

3.2. ADJACENCY AND DEGREE

3.1.9

37

Embedding

An embedding G2 = (V2 , E2 ) of G1 = (V1 , E1 ) is an injection from V2 to V1 such that every edge in E2 corresponds

to a path in G1 .[4]

3.2 Adjacency and degree

In graph theory, degree, especially that of a vertex, is usually a measure of immediate adjacency.

An edge connects two vertices; these two vertices are said to be incident to that edge, or, equivalently, that edge

incident to those two vertices. All degree-related concepts have to do with adjacency or incidence.

The degree, or valency, dG(v) of a vertex v in a graph G is the number of edges incident to v, with loops being

counted twice. A vertex of degree 0 is an isolated vertex. A vertex of degree 1 is a leaf. In the example labeled

simple graph, vertices 1 and 3 have a degree of 2, vertices 2, 4 and 5 have a degree of 3, and vertex 6 has a degree of

1. If E is ﬁnite, then the total sum of vertex degrees is equal to twice the number of edges.

The total degree of a graph is the sum of the degrees of all its vertices. Thus, for a graph without loops, it is equal to

the number of incidences between vertices and edges. The handshaking lemma states that the total degree is always

equal to two times the number of edges, loops included. This means that for a simple graph with 3 vertices with each

vertex having a degree of two (i.e. a triangle) the total degree would be six (e.g. 3 x 2 = 6).

A degree sequence is a list of degrees of a graph in non-increasing order (e.g. d1 ≥ d2 ≥ … ≥ dn). A sequence of

non-increasing integers is realizable if it is a degree sequence of some graph.

Two vertices u and v are called adjacent if an edge exists between them. We denote this by u ~ v or u ↓ v. In the

above graph, vertices 1 and 2 are adjacent, but vertices 2 and 4 are not. The set of neighbors of v, that is, vertices

adjacent to v not including v itself, forms an induced subgraph called the (open) neighborhood of v and denoted

NG(v). When v is also included, it is called a closed neighborhood and denoted by NG[v]. When stated without

any qualiﬁcation, a neighborhood is assumed to be open. The subscript G is usually dropped when there is no danger

of confusion; the same neighborhood notation may also be used to refer to sets of adjacent vertices rather than the

corresponding induced subgraphs. In the example labeled simple graph, vertex 1 has two neighbors: vertices 2 and

5. For a simple graph, the number of neighbors that a vertex has coincides with its degree.

A dominating set of a graph is a vertex subset whose closed neighborhood includes all vertices of the graph. A vertex

v dominates another vertex u if there is an edge from v to u. A vertex subset V dominates another vertex subset

U if every vertex in U is adjacent to some vertex in V. The minimum size of a dominating set is the domination

number γ(G).

In computers, a ﬁnite, directed or undirected graph (with n vertices, say) is often represented by its adjacency matrix:

an n-by-n matrix whose entry in row i and column j gives the number of edges from the i-th to the j-th vertex.

Spectral graph theory studies relationships between the properties of a graph and its adjacency matrix or other

matrices associated with the graph.

The maximum degree Δ(G) of a graph G is the largest degree over all vertices; the minimum degree δ(G), the

smallest.

A graph in which every vertex has the same degree is regular. It is k-regular if every vertex has degree k. A 0regular graph is an independent set. A 1-regular graph is a matching. A 2-regular graph is a vertex disjoint union of

cycles. A 3-regular graph is said to be cubic, or trivalent.

A k-factor is a k-regular spanning subgraph. A 1-factor is a perfect matching. A partition of edges of a graph into

k-factors is called a k-factorization. A k-factorable graph is a graph that admits a k-factorization.

A graph is biregular if it has unequal maximum and minimum degrees and every vertex has one of those two degrees.

A strongly regular graph is a regular graph such that any adjacent vertices have the same number of common

neighbors as other adjacent pairs and that any nonadjacent vertices have the same number of common neighbors as

other nonadjacent pairs.

38

3.2.1

CHAPTER 3. GLOSSARY OF GRAPH THEORY

Independence

In graph theory, the word independent usually carries the connotation of pairwise disjoint or mutually nonadjacent.

In this sense, independence is a form of immediate nonadjacency. An isolated vertex is a vertex not incident to any

edges. An independent set, or coclique, or stable set, is a set of vertices of which no pair is adjacent. Since the graph

induced by any independent set is an empty graph, the two terms are usually used interchangeably. In the example

labeled simple graph at the top of this page, vertices 1, 3, and 6 form an independent set; and 2 and 4 form another

one.

Two subgraphs are edge disjoint if they have no edges in common. Similarly, two subgraphs are vertex disjoint if

they have no vertices (and thus, also no edges) in common. Unless speciﬁed otherwise, a set of disjoint subgraphs

are assumed to be pairwise vertex disjoint.

The independence number α(G) of a graph G is the size of the largest independent set of G.

A graph can be decomposed into independent sets in the sense that the entire vertex set of the graph can be partitioned

into pairwise disjoint independent subsets. Such independent subsets are called partite sets, or simply parts.

A graph that can be decomposed into two partite sets bipartite; three sets, tripartite; k sets, k-partite; and an

unknown number of sets, multipartite. An 1-partite graph is the same as an independent set, or an empty graph. A

2-partite graph is the same as a bipartite graph. A graph that can be decomposed into k partite sets is also said to be

k-colourable.

A complete multipartite graph is a graph in which vertices are adjacent if and only if they belong to diﬀerent partite

sets. A complete bipartite graph is also referred to as a biclique; if its partite sets contain n and m vertices, respectively,

then the graph is denoted Kn,m.

A k-partite graph is semiregular if each of its partite sets has a uniform degree; equipartite if each partite set has

the same size; and balanced k-partite if each partite set diﬀers in size by at most 1 with any other.

The matching number α′ (G) of a graph G is the size of a largest matching, or pairwise vertex disjoint edges, of G.

A spanning matching, also called a perfect matching is a matching that covers all vertices of a graph.

3.3 Complexity

Complexity of a graph denotes the quantity of information that a graph contained, and can be measured in several

ways. For example, by counting the number of its spanning trees, or the value of a certain formula involving the

number of vertices, edges, and proper paths in a graph. [5]

3.4 Connectivity

Connectivity extends the concept of adjacency and is essentially a form (and measure) of concatenated adjacency.

If it is possible to establish a path from any vertex to any other vertex of a graph, the graph is said to be connected;

otherwise, the graph is disconnected. A graph is totally disconnected if there is no path connecting any pair of

vertices. This is just another name to describe an empty graph or independent set.

A cut vertex, or articulation point, is a vertex whose removal disconnects the remaining subgraph. A cut set, or

vertex cut or separating set, is a set of vertices whose removal disconnects the remaining subgraph. A bridge is an

analogous edge (see below).

If it is always possible to establish a path from any vertex to every other even after removing any k - 1 vertices,

then the graph is said to be k-vertex-connected or k-connected. Note that a graph is k-connected if and only if it

contains k internally disjoint paths between any two vertices. The example labeled simple graph above is connected

(and therefore 1-connected), but not 2-connected. The vertex connectivity or connectivity κ(G) of a graph G is the

minimum number of vertices that need to be removed to disconnect G. The complete graph Kn has connectivity n 1 for n > 1; and a disconnected graph has connectivity 0.

In network theory, a giant component is a connected subgraph that contains a majority of the entire graph’s nodes.

A bridge, or cut edge or isthmus, is an edge whose removal disconnects a graph. (For example, all the edges in a tree

are bridges.) A cut vertex is an analogous vertex (see above). A disconnecting set is a set of edges whose removal

3.5. DISTANCE

39

increases the number of components. An edge cut is the set of all edges which have one vertex in some proper vertex

subset S and the other vertex in V(G)\S. Edges of K 3 form a disconnecting set but not an edge cut. Any two edges

of K 3 form a minimal disconnecting set as well as an edge cut. An edge cut is necessarily a disconnecting set; and a

minimal disconnecting set of a nonempty graph is necessarily an edge cut. A bond is a minimal (but not necessarily

minimum), nonempty set of edges whose removal disconnects a graph.

A graph is k-edge-connected if any subgraph formed by removing any k - 1 edges is still connected. The edge

connectivity κ'(G) of a graph G is the minimum number of edges needed to disconnect G. One well-known result is

that κ(G) ≤ κ'(G) ≤ δ(G).

A component is a maximally connected subgraph. A block is either a maximally 2-connected subgraph, a bridge

(together with its vertices), or an isolated vertex. A biconnected component is a 2-connected component.

An articulation point (also known as a separating vertex) of a graph is a vertex whose removal from the graph

increases its number of connected components. A biconnected component can be deﬁned as a subgraph induced by

a maximal set of nodes that has no separating vertex.

3.5 Distance

The distance dG(u, v) between two (not necessary distinct) vertices u and v in a graph G is the length of a shortest

path (also called a graph geodesic) between them. The subscript G is usually dropped when there is no danger of

confusion. When u and v are identical, their distance is 0. When u and v are unreachable from each other, their

distance is deﬁned to be inﬁnity ∞.

The eccentricity εG(v) of a vertex v in a graph G is the maximum distance from v to any other vertex. The diameter

diam(G) of a graph G is the maximum eccentricity over all vertices in a graph; and the radius rad(G), the minimum.

When there are two components in G, diam(G) and rad(G) deﬁned to be inﬁnity ∞. Trivially, diam(G) ≤ 2 rad(G).

Vertices with maximum eccentricity are called peripheral vertices. Vertices of minimum eccentricity form the

center. A tree has at most two center vertices.

The Wiener index of a vertex v in a graph G, denoted by WG(v) is the sum of distances between v and all others.

The Wiener index of a graph G, denoted by W(G), is the sum of distances over all pairs of vertices. An undirected

graph’s Wiener polynomial is deﬁned to be Σ qd(u,v) over all unordered pairs of vertices u and v. Wiener index and

Wiener polynomial are of particular interest to mathematical chemists.

The k-th power Gk of a graph G is a supergraph formed by adding an edge between all pairs of vertices of G with

distance at most k. A second power of a graph is also called a square.

A k-spanner is a spanning subgraph, S, in which every two vertices are at most k times as far apart on S than on G.

The number k is the dilation. k-spanner is used for studying geometric network optimization.

3.6 Genus

A crossing is a pair of intersecting edges. A graph is embeddable on a surface if its vertices and edges can be

arranged on it without any crossing. The genus of a graph is the lowest genus of any surface on which the graph can

embed.

A planar graph is one which can be drawn on the (Euclidean) plane without any crossing; and a plane graph, one

which is drawn in such fashion. In other words, a planar graph is a graph of genus 0. The example labeled simple

graph is planar; the complete graph on n vertices, for n> 4, is not planar. Also, a tree is necessarily a planar graph.

When a graph is drawn without any crossing, any cycle that surrounds a region without any edges reaching from the

cycle into the region forms a face. Two faces on a plane graph are adjacent if they share a common edge. A dual,

or planar dual when the context needs to be clariﬁed, G* of a plane graph G is a graph whose vertices represent the

faces, including any outerface, of G and are adjacent in G* if and only if their corresponding faces are adjacent in G.

The dual of a planar graph is always a planar pseudograph (e.g. consider the dual of a triangle). In the familiar case

of a 3-connected simple planar graph G (isomorphic to a convex polyhedron P), the dual G* is also a 3-connected

simple planar graph (and isomorphic to the dual polyhedron P * ).

Furthermore, since we can establish a sense of “inside” and “outside” on a plane, we can identify an “outermost”

region that contains the entire graph if the graph does not cover the entire plane. Such outermost region is called

40

CHAPTER 3. GLOSSARY OF GRAPH THEORY

an outer face. An outerplanar graph is one which can be drawn in the planar fashion such that its vertices are all

adjacent to the outer face; and an outerplane graph, one which is drawn in such fashion.

The minimum number of crossings that must appear when a graph is drawn on a plane is called the crossing number.

The minimum number of planar graphs needed to cover a graph is the thickness of the graph.

3.7 Weighted graphs and networks

A weighted graph associates a label (weight) with every edge in the graph. Weights are usually real numbers. They

may be restricted to rational numbers or integers. Certain algorithms require further restrictions on weights; for

instance, Dijkstra’s algorithm works properly only for positive weights. The weight of a path or the weight of a

tree in a weighted graph is the sum of the weights of the selected edges. Sometimes a non-edge (a vertex pair with

no connecting edge) is indicated by labeling it with a special weight representing inﬁnity. Sometimes the word cost

is used instead of weight. When stated without any qualiﬁcation, a graph is always assumed to be unweighted. In

some writing on graph theory the term network is a synonym for a weighted graph. A network may be directed or

undirected, it may contain special vertices (nodes), such as source or sink. The classical network problems include:

• minimum cost spanning tree,

• shortest paths,

• maximal ﬂow (and the max-ﬂow min-cut theorem)

3.8 Direction

Main article: Digraph (mathematics)

A directed arc, or directed edge, is an ordered pair of endvertices that can be represented graphically as an arrow

drawn between the endvertices. In such an ordered pair the ﬁrst vertex is called the initial vertex or tail; the second

one is called the terminal vertex or head (because it appears at the arrow head). An undirected edge disregards

any sense of direction and treats both endvertices interchangeably. A loop in a digraph, however, keeps a sense of

direction and treats both head and tail identically. A set of arcs are multiple, or parallel, if they share the same head

and the same tail. A pair of arcs are anti-parallel if one’s head/tail is the other’s tail/head. A digraph, or directed

graph, or oriented graph, is analogous to an undirected graph except that it contains only arcs. A mixed graph may

contain both directed and undirected edges; it generalizes both directed and undirected graphs. When stated without

any qualiﬁcation, a graph is almost always assumed to be undirected.

A digraph is called simple if it has no loops and at most one arc between any pair of vertices. When stated without

any qualiﬁcation, a digraph is usually assumed to be simple. A quiver is a directed graph which is speciﬁcally allowed,

but not required, to have loops and more than one arc between any pair of vertices.

In a digraph Γ, we distinguish the out degree dΓ+ (v), the number of edges leaving a vertex v, and the in degree

dΓ− (v), the number of edges entering a vertex v. If the graph is oriented, the degree dΓ(v) of a vertex v is equal to the

sum of its out- and in- degrees. When the context is clear, the subscript Γ can be dropped. Maximum and minimum

out degrees are denoted by Δ+ (Γ) and δ+ (Γ); and maximum and minimum in degrees, Δ− (Γ) and δ− (Γ).

An out-neighborhood, or successor set, N + Γ(v) of a vertex v is the set of heads of arcs going from v. Likewise, an

in-neighborhood, or predecessor set, N − Γ(v) of a vertex v is the set of tails of arcs going into v.

A source is a vertex with 0 in-degree; and a sink, 0 out-degree.

A vertex v dominates another vertex u if there is an arc from v to u. A vertex subset S is out-dominating if every

vertex not in S is dominated by some vertex in S; and in-dominating if every vertex in S is dominated by some vertex

not in S.

A kernel in a (possibly directed) graph G is an independent set S such that every vertex in V(G) \ S dominates

some vertex in S. In undirected graphs, kernels are maximal independent sets.[6] A digraph is kernel perfect if every

induced sub-digraph has a kernel.[7]

An Eulerian digraph is a digraph with equal in- and out-degrees at every vertex.

3.9. COLOURING

41

The zweieck of an undirected edge e = (u, v) is the pair of diedges (u, v) and (v, u) which form the simple dicircuit.

An orientation is an assignment of directions to the edges of an undirected or partially directed graph. When

stated without any qualiﬁcation, it is usually assumed that all undirected edges are replaced by a directed one in an

orientation. Also, the underlying graph is usually assumed to be undirected and simple.

A tournament is a digraph in which each pair of vertices is connected by exactly one arc. In other words, it is an

oriented complete graph.

A directed path, or just a path when the context is clear, is an oriented simple path such that all arcs go the same

direction, meaning all internal vertices have in- and out-degrees 1. A vertex v is reachable from another vertex u if

there is a directed path that starts from u and ends at v. Note that in general the condition that u is reachable from v

does not imply that v is also reachable from u.

If v is reachable from u, then u is a predecessor of v and v is a successor of u. If there is an arc from u to v, then u

is a direct predecessor of v, and v is a direct successor of u.

A digraph is strongly connected if every vertex is reachable from every other following the directions of the arcs.

On the contrary, a digraph is weakly connected if its underlying undirected graph is connected. A weakly connected

graph can be thought of as a digraph in which every vertex is “reachable” from every other but not necessarily following

the directions of the arcs. A strong orientation is an orientation that produces a strongly connected digraph.

A directed cycle, or just a cycle when the context is clear, is an oriented simple cycle such that all arcs go the same

direction, meaning all vertices have in- and out-degrees 1. A digraph is acyclic if it does not contain any directed

cycle. A ﬁnite, acyclic digraph with no isolated vertices necessarily contains at least one source and at least one sink.

An arborescence, or out-tree or branching, is an oriented tree in which all vertices are reachable from a single vertex.

Likewise, an in-tree is an oriented tree in which a single vertex is reachable from every other one.

3.8.1

Directed acyclic graphs

Main article: directed acyclic graph

The partial order structure of directed acyclic graphs (or DAGs) gives them their own terminology.

If there is a directed edge from u to v, then we say u is a parent of v and v is a child of u. If there is a directed path

from u to v, we say u is an ancestor of v and v is a descendant of u.

The moral graph of a DAG is the undirected graph created by adding an (undirected) edge between all parents of

the same node (sometimes called marrying), and then replacing all directed edges by undirected edges. A DAG is

perfect if, for each node, the set of parents is complete (i.e. no new edges need to be added when forming the moral

graph).

3.9 Colouring

Main article: Graph colouring

Vertices in graphs can be given colours to identify or label them. Although they may actually be rendered in diagrams

in diﬀerent colours, working mathematicians generally pencil in numbers or letters (usually numbers) to represent the

colours.

Given a graph G (V,E) a k-colouring of G is a map ϕ : V → {1, ..., k} with the property that (u, v) ∈ E ⇒ ϕ(u) ≠

ϕ(v) - in other words, every vertex is assigned a colour with the condition that adjacent vertices cannot be assigned

the same colour.

The chromatic number χ(G) is the smallest k for which G has a k-colouring.

Given a graph and a colouring, the colour classes of the graph are the sets of vertices given the same colour.

A graph is called k-critical if its chromatic number is k but all of its proper subgraphs have chromatic number less

than k. An odd cycle is 3-critical, and the complete graph on k vertices is k-critical.

42

CHAPTER 3. GLOSSARY OF GRAPH THEORY

This graph is an example of a 4-critical graph. Its chromatic number is 4 but all of its proper subgraphs have a chromatic number

less than 4. This graph is also planar

3.10 Various

A graph invariant is a property of a graph G, usually a number or a polynomial, that depends only on the isomorphism

class of G. Examples are the order, genus, chromatic number, and chromatic polynomial of a graph.

3.11 See also

• Graph (mathematics)

• List of graph theory topics

3.12 References

[1] Harris, John M. (2000). Combinatorics and Graph Theory. New York: Springer-Verlag. p. 5. ISBN 0-387-98736-3.

[2] Brandstädt, Andreas; Le, Van Bang; Spinrad, Jeremy (1999), “Chapter 7: Forbidden Subgraph”, Graph Classes: A Survey,

SIAM Monographs on Discrete Mathematics and Applications, pp. 105–121, ISBN 0-89871-432-X.

[3] Mitchem, John (1969), “Hypo-properties in graphs”, The Many Facets of Graph Theory (Proc. Conf., Western Mich. Univ.,

Kalamazoo, Mich., 1968), Springer, pp. 223–230, doi:10.1007/BFb0060121, MR 0253932; Bondy, J. A. (1972), “The

“graph theory” of the Greek alphabet”, Graph theory and applications (Proc. Conf., Western Michigan Univ., Kalamazoo,

Mich., 1972; dedicated to the memory of J. W. T. Youngs), Lecture Notes in Mathematics 303, Springer, pp. 43–54,

doi:10.1007/BFb0067356, MR 0335362.

[4] Rosenberg, Arnold L. and Heath, Lenwood S. (2001). Graph separators with applications. (1st edition ed.). Kluwer. ISBN

978-0-306-46464-5.

3.12. REFERENCES

43

[5] Neel, David L. (2006), “The linear complexity of a graph”, The electronic journal of combinatorics

[6] Bondy, J.A., Murty, U.S.R., Graph Theory, p. 298

[7] Béla Bollobás, Modern Graph theory, p. 298

• Bollobás, Béla (1998). Modern Graph Theory. Graduate Texts in Mathematics 184. New York: SpringerVerlag. ISBN 0-387-98488-7. Zbl 0902.05016.. [Packed with advanced topics followed by a historical

overview at the end of each chapter.]

• Brandstädt, Andreas; Le, Van Bang; Spinrad, Jeremy P. (1999). Graph classes: a survey. SIAM Monographs

on Discrete Mathematics. and Applications 3. Philadelphia, PA: Society for Industrial and Applied Mathematics. ISBN 978-0-898714-32-6. Zbl 0919.05001.

• Diestel, Reinhard (2010). Graph Theory. Graduate Texts in Mathematics 173 (4th ed.). Springer-Verlag.

ISBN 978-3-642-14278-9. Zbl 1204.05001. [Standard textbook, most basic material and some deeper results,

exercises of various diﬃculty and notes at the end of each chapter; known for being quasi error-free.]

• West, Douglas B. (2001). Introduction to Graph Theory (2ed). Upper Saddle River: Prentice Hall. ISBN

0-13-014400-2. [Tons of illustrations, references, and exercises. The most complete introductory guide to the

subject.]

• Weisstein, Eric W., “Graph”, MathWorld.

• Zaslavsky, Thomas. Glossary of signed and gain graphs and allied areas. Electronic Journal of Combinatorics,

Dynamic Surveys in Combinatorics, # DS 8. http://www.combinatorics.org/Surveys/

Chapter 4

Graph (mathematics)

This article is about sets of vertices connected by edges. For graphs of mathematical functions, see Graph of a

function. For other uses, see Graph (disambiguation).

In mathematics, and more speciﬁcally in graph theory, a graph is a representation of a set of objects where some

6

5

4

1

2

3

A drawing of a labeled graph on 6 vertices and 7 edges.

pairs of objects are connected by links. The interconnected objects are represented by mathematical abstractions

called vertices, and the links that connect some pairs of vertices are called edges.[1] Typically, a graph is depicted in

diagrammatic form as a set of dots for the vertices, joined by lines or curves for the edges. Graphs are one of the

objects of study in discrete mathematics.

The edges may be directed or undirected. For example, if the vertices represent people at a party, and there is an

edge between two people if they shake hands, then this is an undirected graph, because if person A shook hands with

person B, then person B also shook hands with person A. In contrast, if there is an edge from person A to person B

when person A knows of person B, then this graph is directed, because knowledge of someone is not necessarily a

symmetric relation (that is, one person knowing another person does not necessarily imply the reverse; for example,

many fans may know of a celebrity, but the celebrity is unlikely to know of all their fans). This latter type of graph

is called a directed graph and the edges are called directed edges or arcs.

44

4.1. DEFINITIONS

45

Vertices are also called nodes or points, and edges are also called arcs or lines. Graphs are the basic subject studied

by graph theory. The word “graph” was ﬁrst used in this sense by J.J. Sylvester in 1878.[2][3]

4.1 Deﬁnitions

Deﬁnitions in graph theory vary. The following are some of the more basic ways of deﬁning graphs and related

mathematical structures.

4.1.1

Graph

In the most common sense of the term,[4] a graph is an ordered pair G = (V, E) comprising a set V of vertices or

nodes together with a set E of edges or links, which are 2-element subsets of V (i.e., an edge is related with two

vertices, and the relation is represented as an unordered pair of the vertices with respect to the particular edge). To

avoid ambiguity, this type of graph may be described precisely as undirected and simple.

Other senses of graph stem from diﬀerent conceptions of the edge set. In one more generalized notion,[5] E is a set

together with a relation of incidence that associates with each edge two vertices. In another generalized notion, E is

a multiset of unordered pairs of (not necessarily distinct) vertices. Many authors call this type of object a multigraph

or pseudograph.

All of these variants and others are described more fully below.

The vertices belonging to an edge are called the ends, endpoints, or end vertices of the edge. A vertex may exist in

a graph and not belong to an edge.

V and E are usually taken to be ﬁnite, and many of the well-known results are not true (or are rather diﬀerent) for

inﬁnite graphs because many of the arguments fail in the inﬁnite case. Moreover, V is often assumed to be nonempty, but E is allowed to be the empty set. The order of a graph is |V | (the number of vertices). A graph’s size

is |E| , the number of edges. The degree of a vertex is the number of edges that connect to it, where an edge that

connects to the vertex at both ends (a loop) is counted twice.

For an edge {u, v}, graph theorists usually use the somewhat shorter notation uv.

4.1.2

Adjacency relation

The edges E of an undirected graph G induce a symmetric binary relation ~ on V that is called the adjacency relation

of G. Speciﬁcally, for each edge {u, v} the vertices u and v are said to be adjacent to one another, which is denoted

u ~ v.

4.2 Types of graphs

4.2.1

Distinction in terms of the main deﬁnition

As stated above, in diﬀerent contexts it may be useful to reﬁne the term graph with diﬀerent degrees of generality.

Whenever it is necessary to draw a strict distinction, the following terms are used. Most commonly, in modern texts

in graph theory, unless stated otherwise, graph means “undirected simple ﬁnite graph” (see the deﬁnitions below).

A directed graph.

46

CHAPTER 4. GRAPH (MATHEMATICS)

A simple undirected graph with three vertices and three edges. Each vertex has degree two, so this is also a regular

graph.

Undirected graph

An undirected graph is one in which edges have no orientation. The edge (a, b) is identical to the edge (b, a), i.e., they

are not ordered pairs, but sets {u, v} (or 2-multisets) of vertices. The maximum number of edges in an undirected

graph without a self-loop is n(n - 1)/2.

Directed graph

Main article: Directed graph

A directed graph or digraph is an ordered pair D = (V, A) with

• V a set whose elements are called vertices or nodes, and

• A a set of ordered pairs of vertices, called arcs, directed edges, or arrows.

An arc a = (x, y) is considered to be directed from x to y; y is called the head and x is called the tail of the arc; y is

said to be a direct successor of x, and x is said to be a direct predecessor of y. If a path leads from x to y, then y is

said to be a successor of x and reachable from x, and x is said to be a predecessor of y. The arc (y, x) is called the

arc (x, y) inverted.

A directed graph D is called symmetric if, for every arc in D, the corresponding inverted arc also belongs to D. A

symmetric loopless directed graph D = (V, A) is equivalent to a simple undirected graph G = (V, E), where the pairs

of inverse arcs in A correspond 1-to-1 with the edges in E; thus the edges in G number |E| = |A|/2, or half the number

of arcs in D.

An oriented graph is a directed graph in which at most one of (x, y) and (y, x) may be arcs.

Mixed graph

Main article: Mixed graph

A mixed graph G is a graph in which some edges may be directed and some may be undirected. It is written as an

ordered triple G = (V, E, A) with V, E, and A deﬁned as above. Directed and undirected graphs are special cases.

Multigraph

A loop is an edge (directed or undirected) which starts and ends on the same vertex; these may be permitted or not

permitted according to the application. In this context, an edge with two diﬀerent ends is called a link.

The term "multigraph" is generally understood to mean that multiple edges (and sometimes loops) are allowed. Where

graphs are deﬁned so as to allow loops and multiple edges, a multigraph is often deﬁned to mean a graph without

loops,[6] however, where graphs are deﬁned so as to disallow loops and multiple edges, the term is often deﬁned to

mean a “graph” which can have both multiple edges and loops,[7] although many use the term "pseudograph" for this

meaning.[8]

4.2. TYPES OF GRAPHS

47

Quiver

A quiver or “multidigraph” is a directed graph which may have more than one arrow from a given source to a given

target. A quiver may also have directed loops in it.

Simple graph

As opposed to a multigraph, a simple graph is an undirected graph that has no loops (edges connected at both ends

to the same vertex) and no more than one edge between any two diﬀerent vertices. In a simple graph the edges of the

graph form a set (rather than a multiset) and each edge is a pair of distinct vertices. In a simple graph with n vertices,

the degree of every vertex is at most n-1.

Weighted graph

A graph is a weighted graph if a number (weight) is assigned to each edge.[9] Such weights might represent, for

example, costs, lengths or capacities, etc. depending on the problem at hand. Some authors call such a graph a

network.[10] Weighted correlation networks can be deﬁned by soft-thresholding the pairwise correlations among

variables (e.g. gene measurements).

Half-edges, loose edges

In certain situations it can be helpful to allow edges with only one end, called half-edges, or no ends (loose edges);

see for example signed graphs and biased graphs.

4.2.2

Important graph classes

Regular graph

Main article: Regular graph

A regular graph is a graph where each vertex has the same number of neighbours, i.e., every vertex has the same

degree or valency. A regular graph with vertices of degree k is called a k‑regular graph or regular graph of degree k.

Complete graph

Main article: Complete graph

Complete graphs have the feature that each pair of vertices has an edge connecting them.

Finite and inﬁnite graphs

A ﬁnite graph is a graph G = (V, E) such that V and E are ﬁnite sets. An inﬁnite graph is one with an inﬁnite set of

vertices or edges or both.

Most commonly in graph theory it is implied that the graphs discussed are ﬁnite. If the graphs are inﬁnite, that is

usually speciﬁcally stated.

Graph classes in terms of connectivity

Main article: Connectivity (graph theory)

48

CHAPTER 4. GRAPH (MATHEMATICS)

A complete graph with 5 vertices. Each vertex has an edge to every other vertex.

In an undirected graph G, two vertices u and v are called connected if G contains a path from u to v. Otherwise,

they are called disconnected. A graph is called connected if every pair of distinct vertices in the graph is connected;

otherwise, it is called disconnected.

A graph is called k-vertex-connected or k-edge-connected if no set of k-1 vertices (respectively, edges) exists that,

when removed, disconnects the graph. A k-vertex-connected graph is often called simply k-connected.

A directed graph is called weakly connected if replacing all of its directed edges with undirected edges produces

a connected (undirected) graph. It is strongly connected or strong if it contains a directed path from u to v and a

directed path from v to u for every pair of vertices u, v.

Category of all graphs

The category of all graphs is the slice category Set ↓ D where D : Set → Set is the functor taking a set s to s × s .

4.3 Properties of graphs

See also: Glossary of graph theory and Graph property

4.4. EXAMPLES

49

Two edges of a graph are called adjacent if they share a common vertex. Two arrows of a directed graph are called

consecutive if the head of the ﬁrst one is at the tail of the second one. Similarly, two vertices are called adjacent if

they share a common edge (consecutive if they are at the tail and at the head of an arrow), in which case the common

edge is said to join the two vertices. An edge and a vertex on that edge are called incident.

The graph with only one vertex and no edges is called the trivial graph. A graph with only vertices and no edges is

known as an edgeless graph. The graph with no vertices and no edges is sometimes called the null graph or empty

graph, but the terminology is not consistent and not all mathematicians allow this object.

In a weighted graph or digraph, each edge is associated with some value, variously called its cost, weight, length or

other term depending on the application; such graphs arise in many contexts, for example in optimal routing problems

such as the traveling salesman problem.

Normally, the vertices of a graph, by their nature as elements of a set, are distinguishable. This kind of graph may be

called vertex-labeled. However, for many questions it is better to treat vertices as indistinguishable; then the graph

may be called unlabeled. (Of course, the vertices may be still distinguishable by the properties of the graph itself,

e.g., by the numbers of incident edges). The same remarks apply to edges, so graphs with labeled edges are called

edge-labeled graphs. Graphs with labels attached to edges or vertices are more generally designated as labeled.

Consequently, graphs in which vertices are indistinguishable and edges are indistinguishable are called unlabeled.

(Note that in the literature the term labeled may apply to other kinds of labeling, besides that which serves only to

distinguish diﬀerent vertices or edges.)

4.4 Examples

6

4

3

5

1

2

A graph with six nodes.

• The diagram at right is a graphic representation of the following graph:

V = {1, 2, 3, 4, 5, 6}

E = {{1, 2}, {1, 5}, {2, 3}, {2, 5}, {3, 4}, {4, 5}, {4, 6}}.

• In category theory a small category can be represented by a directed multigraph in which the objects of the

50

CHAPTER 4. GRAPH (MATHEMATICS)

category represented as vertices and the morphisms as directed edges. Then, the functors between categories

induce some, but not necessarily all, of the digraph morphisms of the graph.

• In computer science, directed graphs are used to represent knowledge (e.g., Conceptual graph), ﬁnite state

machines, and many other discrete structures.

• A binary relation R on a set X deﬁnes a directed graph. An element x of X is a direct predecessor of an element

y of X iﬀ xRy.

• A directed edge can model information networks such as Twitter, with one user following another [11]

4.5 Important graphs

Basic examples are:

• In a complete graph, each pair of vertices is joined by an edge; that is, the graph contains all possible edges.

• In a bipartite graph, the vertex set can be partitioned into two sets, W and X, so that no two vertices in W are

adjacent and no two vertices in X are adjacent. Alternatively, it is a graph with a chromatic number of 2.

• In a complete bipartite graph, the vertex set is the union of two disjoint sets, W and X, so that every vertex in

W is adjacent to every vertex in X but there are no edges within W or X.

• In a linear graph or path graph of length n, the vertices can be listed in order, v0 , v1 , ..., v , so that the edges

are vᵢ₋₁vᵢ for each i = 1, 2, ..., n. If a linear graph occurs as a subgraph of another graph, it is a path in that

graph.

• In a cycle graph of length n ≥ 3, vertices can be named v1 , ..., v so that the edges are vᵢ₋₁vi for each i = 2,...,n

in addition to v v1 . Cycle graphs can be characterized as connected 2-regular graphs. If a cycle graph occurs

as a subgraph of another graph, it is a cycle or circuit in that graph.

• A planar graph is a graph whose vertices and edges can be drawn in a plane such that no two of the edges

intersect (i.e., embedded in a plane).

• A tree is a connected graph with no cycles.

• A forest is a graph with no cycles (i.e. the disjoint union of one or more trees).

More advanced kinds of graphs are:

• The Petersen graph and its generalizations

• Perfect graphs

• Cographs

• Chordal graphs

• Other graphs with large automorphism groups: vertex-transitive, arc-transitive, and distance-transitive graphs.

• Strongly regular graphs and their generalization distance-regular graphs.

4.6 Operations on graphs

Main article: Operations on graphs

There are several operations that produce new graphs from old ones, which might be classiﬁed into the following

categories:

4.7. GENERALIZATIONS

51

• Elementary operations, sometimes called “editing operations” on graphs, which create a new graph from the

original one by a simple, local change, such as addition or deletion of a vertex or an edge, merging and splitting

of vertices, etc.

• Graph rewrite operations replacing the occurrence of some pattern graph within the host graph by an instance

of the corresponding replacement graph.

• Unary operations, which create a signiﬁcantly new graph from the old one. Examples:

• Line graph

• Dual graph

• Complement graph

• Binary operations, which create new graph from two initial graphs. Examples:

• Disjoint union of graphs

• Cartesian product of graphs

• Tensor product of graphs

• Strong product of graphs

• Lexicographic product of graphs

4.7 Generalizations

In a hypergraph, an edge can join more than two vertices.

An undirected graph can be seen as a simplicial complex consisting of 1-simplices (the edges) and 0-simplices (the

vertices). As such, complexes are generalizations of graphs since they allow for higher-dimensional simplices.

Every graph gives rise to a matroid.

In model theory, a graph is just a structure. But in that case, there is no limitation on the number of edges: it can be

any cardinal number, see continuous graph.

In computational biology, power graph analysis introduces power graphs as an alternative representation of undirected

graphs.

In geographic information systems, geometric networks are closely modeled after graphs, and borrow many concepts

from graph theory to perform spatial analysis on road networks or utility grids.

4.8 See also

• Conceptual graph

• Dual graph

• Glossary of graph theory

• Graph (data structure)

• Graph database

• Graph drawing

• Graph theory

• Hypergraph

• List of graph theory topics

• List of publications in graph theory

• Network theory

52

CHAPTER 4. GRAPH (MATHEMATICS)

4.9 Notes

[1] Trudeau, Richard J. (1993). Introduction to Graph Theory (Corrected, enlarged republication. ed.). New York: Dover

Pub. p. 19. ISBN 978-0-486-67870-2. Retrieved 8 August 2012. A graph is an object consisting of two sets called its

vertex set and its edge set.

[2] See:

• J. J. Sylvester (February 7, 1878) “Chemistry and algebra,” Nature, 17 : 284. From page 284: “Every invariant and

covariant thus becomes expressible by a graph precisely identical with a Kekuléan diagram or chemicograph.”

• J. J. Sylvester (1878) “On an application of the new atomic theory to the graphical representation of the invariants

and covariants of binary quantics, — with three appendices,” American Journal of Mathematics, Pure and Applied,

1 (1) : 64-90. The term “graph” ﬁrst appears in this paper on page 65.

[3] Gross, Jonathan L.; Yellen, Jay (2004). Handbook of graph theory. CRC Press. p. 35. ISBN 978-1-58488-090-5.

[4] See, for instance, Iyanaga and Kawada, 69 J, p. 234 or Biggs, p. 4.

[5] See, for instance, Graham et al., p. 5.

[6] For example, see Balakrishnan, p. 1, Gross (2003), p. 4, and Zwillinger, p. 220.

[7] For example, see. Bollobás, p. 7 and Diestel, p. 25.

[8] Gross (1998), p. 3, Gross (2003), p. 205, Harary, p.10, and Zwillinger, p. 220.

[9] Fletcher, Peter; Hoyle, Hughes; Patty, C. Wayne (1991). Foundations of Discrete Mathematics (International student ed.

ed.). Boston: PWS-KENT Pub. Co. p. 463. ISBN 0-53492-373-9. A weighted graph is a graph in which a number

w(e), called its weight, is assigned to each edge e.

[10] Strang, Gilbert (2005), Linear Algebra and Its Applications (4th ed.), Brooks Cole, ISBN 0-03-010567-6

[11] Pankaj Gupta, Ashish Goel, Jimmy Lin, Aneesh Sharma, Dong Wang, and Reza Bosagh Zadeh WTF: The who-to-follow

system at Twitter, Proceedings of the 22nd international conference on World Wide Web

4.10 References

• Balakrishnan, V. K. (1997-02-01). Graph Theory (1st ed.). McGraw-Hill. ISBN 0-07-005489-4.

• Berge, Claude (1958). Théorie des graphes et ses applications (in French). Dunod, Paris: Collection Universitaire de Mathématiques, II. pp. viii+277. Translation: -. Dover, New York: Wiley. 2001 [1962].

• Biggs, Norman (1993). Algebraic Graph Theory (2nd ed.). Cambridge University Press. ISBN 0-521-45897-8.

• Bollobás, Béla (2002-08-12). Modern Graph Theory (1st ed.). Springer. ISBN 0-387-98488-7.

• Bang-Jensen, J.; Gutin, G. (2000). Digraphs: Theory, Algorithms and Applications. Springer.

• Diestel, Reinhard (2005). Graph Theory (3rd ed.). Berlin, New York: Springer-Verlag. ISBN 978-3-54026183-4..

• Graham, R.L., Grötschel, M., and Lovász, L, ed. (1995). Handbook of Combinatorics. MIT Press. ISBN

0-262-07169-X.

• Gross, Jonathan L.; Yellen, Jay (1998-12-30). Graph Theory and Its Applications. CRC Press. ISBN 0-84933982-0.

• Gross, Jonathan L., & Yellen, Jay, ed. (2003-12-29). Handbook of Graph Theory. CRC. ISBN 1-58488-0902.

• Harary, Frank (January 1995). Graph Theory. Addison Wesley Publishing Company. ISBN 0-201-41033-8.

• Iyanaga, Shôkichi; Kawada, Yukiyosi (1977). Encyclopedic Dictionary of Mathematics. MIT Press. ISBN

0-262-09016-3.

• Zwillinger, Daniel (2002-11-27). CRC Standard Mathematical Tables and Formulae (31st ed.). Chapman &

Hall/CRC. ISBN 1-58488-291-3.

4.11. FURTHER READING

53

4.11 Further reading

• Trudeau, Richard J. (1993). Introduction to Graph Theory (Corrected, enlarged republication. ed.). New York:

Dover Publications. ISBN 978-0-486-67870-2. Retrieved 8 August 2012.

4.12 External links

• Weisstein, Eric W., “Graph”, MathWorld.

Chapter 5

Graph theory

This article is about sets of vertices connected by edges. For graphs of mathematical functions, see Graph of a

function. For other uses, see Graph (disambiguation).

In mathematics and computer science, graph theory is the study of graphs, which are mathematical structures used

6

5

4

1

2

3

A drawing of a graph

to model pairwise relations between objects. A “graph” in this context is made up of "vertices" or “nodes” and lines

called edges that connect them. A graph may be undirected, meaning that there is no distinction between the two

vertices associated with each edge, or its edges may be directed from one vertex to another; see graph (mathematics)

for more detailed deﬁnitions and for other variations in the types of graph that are commonly considered. Graphs are

one of the prime objects of study in discrete mathematics.

Refer to the glossary of graph theory for basic deﬁnitions in graph theory.

54

5.1. DEFINITIONS

55

5.1 Deﬁnitions

Deﬁnitions in graph theory vary. The following are some of the more basic ways of deﬁning graphs and related

mathematical structures.

5.1.1

Graph

In the most common sense of the term,[1] a graph is an ordered pair G = (V, E) comprising a set V of vertices or

nodes together with a set E of edges or lines, which are 2-element subsets of V (i.e., an edge is related with two

vertices, and the relation is represented as an unordered pair of the vertices with respect to the particular edge). To

avoid ambiguity, this type of graph may be described precisely as undirected and simple.

Other senses of graph stem from diﬀerent conceptions of the edge set. In one more generalized notion,[2] V is a set

together with a relation of incidence that associates with each edge two vertices. In another generalized notion, E is

a multiset of unordered pairs of (not necessarily distinct) vertices. Many authors call this type of object a multigraph

or pseudograph.

All of these variants and others are described more fully below.

The vertices belonging to an edge are called the ends, endpoints, or end vertices of the edge. A vertex may exist in

a graph and not belong to an edge.

V and E are usually taken to be ﬁnite, and many of the well-known results are not true (or are rather diﬀerent) for

inﬁnite graphs because many of the arguments fail in the inﬁnite case. The order of a graph is |V | (the number of

vertices). A graph’s size is |E| , the number of edges. The degree or valency of a vertex is the number of edges that

connect to it, where an edge that connects to the vertex at both ends (a loop) is counted twice.

For an edge {u, v} , graph theorists usually use the somewhat shorter notation uv .

5.2 Applications

Graphs can be used to model many types of relations and processes in physical, biological,[4] social and information

systems. Many practical problems can be represented by graphs.

In computer science, graphs are used to represent networks of communication, data organization, computational

devices, the ﬂow of computation, etc. For instance, the link structure of a website can be represented by a directed

graph, in which the vertices represent web pages and directed edges represent links from one page to another. A

similar approach can be taken to problems in travel, biology, computer chip design, and many other ﬁelds. The

development of algorithms to handle graphs is therefore of major interest in computer science. The transformation

of graphs is often formalized and represented by graph rewrite systems. Complementary to graph transformation

systems focusing on rule-based in-memory manipulation of graphs are graph databases geared towards transactionsafe, persistent storing and querying of graph-structured data.

Graph-theoretic methods, in various forms, have proven particularly useful in linguistics, since natural language often

lends itself well to discrete structure. Traditionally, syntax and compositional semantics follow tree-based structures,

whose expressive power lies in the principle of compositionality, modeled in a hierarchical graph. More contemporary

approaches such as head-driven phrase structure grammar model the syntax of natural language using typed feature

structures, which are directed acyclic graphs. Within lexical semantics, especially as applied to computers, modeling

word meaning is easier when a given word is understood in terms of related words; semantic networks are therefore

important in computational linguistics. Still other methods in phonology (e.g. optimality theory, which uses lattice

graphs) and morphology (e.g. ﬁnite-state morphology, using ﬁnite-state transducers) are common in the analysis of

language as a graph. Indeed, the usefulness of this area of mathematics to linguistics has borne organizations such as

TextGraphs, as well as various 'Net' projects, such as WordNet, VerbNet, and others.

Graph theory is also used to study molecules in chemistry and physics. In condensed matter physics, the threedimensional structure of complicated simulated atomic structures can be studied quantitatively by gathering statistics

on graph-theoretic properties related to the topology of the atoms. In chemistry a graph makes a natural model for a

molecule, where vertices represent atoms and edges bonds. This approach is especially used in computer processing of

molecular structures, ranging from chemical editors to database searching. In statistical physics, graphs can represent

56

CHAPTER 5. GRAPH THEORY

hu

pt

ca

es

no

pl

fr

ro

sv

fi

ko

it

he

en

de

da

ja

bg

nl

ar

tr

uk

ru

zh

id

cs

fa

The network graph formed by Wikipedia editors (edges) contributing to diﬀerent Wikipedia language versions (nodes) during one

month in summer 2013.[3]

local connections between interacting parts of a system, as well as the dynamics of a physical process on such systems.

Graphs are also used to represent the micro-scale channels of porous media, in which the vertices represent the pores

and the edges represent the smaller channels connecting the pores.

Graph theory is also widely used in sociology as a way, for example, to measure actors’ prestige or to explore rumor

spreading, notably through the use of social network analysis software. Under the umbrella of social networks are

many diﬀerent types of graphs:[5] Acquaintanceship and friendship graphs describe whether people know each other.

Inﬂuence graphs model whether certain people can inﬂuence the behavior of others. Finally, collaboration graphs

model whether two people work together in a particular way, such as acting in a movie together.

Likewise, graph theory is useful in biology and conservation eﬀorts where a vertex can represent regions where

certain species exist (or habitats) and the edges represent migration paths, or movement between the regions. This

information is important when looking at breeding patterns or tracking the spread of disease, parasites or how changes

to the movement can aﬀect other species.

In mathematics, graphs are useful in geometry and certain parts of topology such as knot theory. Algebraic graph

theory has close links with group theory.

A graph structure can be extended by assigning a weight to each edge of the graph. Graphs with weights, or weighted

graphs, are used to represent structures in which pairwise connections have some numerical values. For example if a

graph represents a road network, the weights could represent the length of each road.

5.3. HISTORY

57

5.3 History

The Königsberg Bridge problem

The paper written by Leonhard Euler on the Seven Bridges of Königsberg and published in 1736 is regarded as the ﬁrst

paper in the history of graph theory.[6] This paper, as well as the one written by Vandermonde on the knight problem,

carried on with the analysis situs initiated by Leibniz. Euler’s formula relating the number of edges, vertices, and faces

of a convex polyhedron was studied and generalized by Cauchy[7] and L'Huillier,[8] and is at the origin of topology.

More than one century after Euler’s paper on the bridges of Königsberg and while Listing introduced topology, Cayley

was led by the study of particular analytical forms arising from diﬀerential calculus to study a particular class of graphs,

the trees.[9] This study had many implications in theoretical chemistry. The involved techniques mainly concerned the

enumeration of graphs having particular properties. Enumerative graph theory then rose from the results of Cayley

and the fundamental results published by Pólya between 1935 and 1937 and the generalization of these by De Bruijn

in 1959. Cayley linked his results on trees with the contemporary studies of chemical composition.[10] The fusion

of the ideas coming from mathematics with those coming from chemistry is at the origin of a part of the standard

terminology of graph theory.

In particular, the term “graph” was introduced by Sylvester in a paper published in 1878 in Nature, where he draws

an analogy between “quantic invariants” and “co-variants” of algebra and molecular diagrams:[11]

"[...] Every invariant and co-variant thus becomes expressible by a graph precisely identical with a

Kekuléan diagram or chemicograph. [...] I give a rule for the geometrical multiplication of graphs, i.e.

for constructing a graph to the product of in- or co-variants whose separate graphs are given. [...]" (italics

as in the original).

The ﬁrst textbook on graph theory was written by Dénes Kőnig, and published in 1936.[12] Another book by Frank

Harary, published in 1969, was “considered the world over to be the deﬁnitive textbook on the subject”,[13] and

enabled mathematicians, chemists, electrical engineers and social scientists to talk to each other. Harary donated all

of the royalties to fund the Pólya Prize.[14]

58

CHAPTER 5. GRAPH THEORY

One of the most famous and stimulating problems in graph theory is the four color problem: “Is it true that any

map drawn in the plane may have its regions colored with four colors, in such a way that any two regions having a

common border have diﬀerent colors?" This problem was ﬁrst posed by Francis Guthrie in 1852 and its ﬁrst written

record is in a letter of De Morgan addressed to Hamilton the same year. Many incorrect proofs have been proposed,

including those by Cayley, Kempe, and others. The study and the generalization of this problem by Tait, Heawood,

Ramsey and Hadwiger led to the study of the colorings of the graphs embedded on surfaces with arbitrary genus.

Tait’s reformulation generated a new class of problems, the factorization problems, particularly studied by Petersen

and Kőnig. The works of Ramsey on colorations and more specially the results obtained by Turán in 1941 was at the

origin of another branch of graph theory, extremal graph theory.

The four color problem remained unsolved for more than a century. In 1969 Heinrich Heesch published a method for

solving the problem using computers.[15] A computer-aided proof produced in 1976 by Kenneth Appel and Wolfgang

Haken makes fundamental use of the notion of “discharging” developed by Heesch.[16][17] The proof involved checking the properties of 1,936 conﬁgurations by computer, and was not fully accepted at the time due to its complexity.

A simpler proof considering only 633 conﬁgurations was given twenty years later by Robertson, Seymour, Sanders

and Thomas.[18]

The autonomous development of topology from 1860 and 1930 fertilized graph theory back through the works of

Jordan, Kuratowski and Whitney. Another important factor of common development of graph theory and topology

came from the use of the techniques of modern algebra. The ﬁrst example of such a use comes from the work of the

physicist Gustav Kirchhoﬀ, who published in 1845 his Kirchhoﬀ’s circuit laws for calculating the voltage and current

in electric circuits.

The introduction of probabilistic methods in graph theory, especially in the study of Erdős and Rényi of the asymptotic

probability of graph connectivity, gave rise to yet another branch, known as random graph theory, which has been a

fruitful source of graph-theoretic results.

5.4 Graph drawing

Main article: Graph drawing

Graphs are represented visually by drawing a dot or circle for every vertex, and drawing an arc between two vertices

if they are connected by an edge. If the graph is directed, the direction is indicated by drawing an arrow.

A graph drawing should not be confused with the graph itself (the abstract, non-visual structure) as there are several

ways to structure the graph drawing. All that matters is which vertices are connected to which others by how many

edges and not the exact layout. In practice it is often diﬃcult to decide if two drawings represent the same graph.

Depending on the problem domain some layouts may be better suited and easier to understand than others.

The pioneering work of W. T. Tutte was very inﬂuential in the subject of graph drawing. Among other achievements,

he introduced the use of linear algebraic methods to obtain graph drawings.

Graph drawing also can be said to encompass problems that deal with the crossing number and its various generalizations. The crossing number of a graph is the minimum number of intersections between edges that a drawing of

the graph in the plane must contain. For a planar graph, the crossing number is zero by deﬁnition.

Drawings on surfaces other than the plane are also studied.

5.5 Graph-theoretic data structures

Main article: Graph (abstract data type)

There are diﬀerent ways to store graphs in a computer system. The data structure used depends on both the graph

structure and the algorithm used for manipulating the graph. Theoretically one can distinguish between list and

matrix structures but in concrete applications the best structure is often a combination of both. List structures are

often preferred for sparse graphs as they have smaller memory requirements. Matrix structures on the other hand

provide faster access for some applications but can consume huge amounts of memory.

List structures include the incidence list, an array of pairs of vertices, and the adjacency list, which separately lists

5.6. PROBLEMS IN GRAPH THEORY

59

the neighbors of each vertex: Much like the incidence list, each vertex has a list of which vertices it is adjacent to.

Matrix structures include the incidence matrix, a matrix of 0’s and 1’s whose rows represent vertices and whose

columns represent edges, and the adjacency matrix, in which both the rows and columns are indexed by vertices. In

both cases a 1 indicates two adjacent objects and a 0 indicates two non-adjacent objects. The Laplacian matrix is a

modiﬁed form of the adjacency matrix that incorporates information about the degrees of the vertices, and is useful

in some calculations such as Kirchhoﬀ’s theorem on the number of spanning trees of a graph. The distance matrix,

like the adjacency matrix, has both its rows and columns indexed by vertices, but rather than containing a 0 or a 1 in

each cell it contains the length of a shortest path between two vertices.

5.6 Problems in graph theory

5.6.1

Enumeration

There is a large literature on graphical enumeration: the problem of counting graphs meeting speciﬁed conditions.

Some of this work is found in Harary and Palmer (1973).

5.6.2

Subgraphs, induced subgraphs, and minors

A common problem, called the subgraph isomorphism problem, is ﬁnding a ﬁxed graph as a subgraph in a given

graph. One reason to be interested in such a question is that many graph properties are hereditary for subgraphs,

which means that a graph has the property if and only if all subgraphs have it too. Unfortunately, ﬁnding maximal

subgraphs of a certain kind is often an NP-complete problem.

• Finding the largest complete subgraph is called the clique problem (NP-complete).

A similar problem is ﬁnding induced subgraphs in a given graph. Again, some important graph properties are hereditary with respect to induced subgraphs, which means that a graph has a property if and only if all induced subgraphs

also have it. Finding maximal induced subgraphs of a certain kind is also often NP-complete. For example,

• Finding the largest edgeless induced subgraph, or independent set, called the independent set problem (NPcomplete).

Still another such problem, the minor containment problem, is to ﬁnd a ﬁxed graph as a minor of a given graph. A

minor or subcontraction of a graph is any graph obtained by taking a subgraph and contracting some (or no) edges.

Many graph properties are hereditary for minors, which means that a graph has a property if and only if all minors

have it too. A famous example:

• A graph is planar if it contains as a minor neither the complete bipartite graph K3,3 (See the Three-cottage

problem) nor the complete graph K5 .

Another class of problems has to do with the extent to which various species and generalizations of graphs are determined by their point-deleted subgraphs, for example:

• The reconstruction conjecture.

5.6.3

Graph coloring

Many problems have to do with various ways of coloring graphs, for example:

• The four-color theorem

• The strong perfect graph theorem

• The Erdős–Faber–Lovász conjecture(unsolved)

60

CHAPTER 5. GRAPH THEORY

• The total coloring conjecture, also called Behzad's conjecture) (unsolved)

• The list coloring conjecture (unsolved)

• The Hadwiger conjecture (graph theory) (unsolved)

5.6.4

Subsumption and uniﬁcation

Constraint modeling theories concern families of directed graphs related by a partial order. In these applications,

graphs are ordered by speciﬁcity, meaning that more constrained graphs—which are more speciﬁc and thus contain

a greater amount of information—are subsumed by those that are more general. Operations between graphs include

evaluating the direction of a subsumption relationship between two graphs, if any, and computing graph uniﬁcation.

The uniﬁcation of two argument graphs is deﬁned as the most general graph (or the computation thereof) that is

consistent with (i.e. contains all of the information in) the inputs, if such a graph exists; eﬃcient uniﬁcation algorithms

are known.

For constraint frameworks which are strictly compositional, graph uniﬁcation is the suﬃcient satisﬁability and combination function. Well-known applications include automatic theorem proving and modeling the elaboration of

linguistic structure.

5.6.5

Route problems

• Hamiltonian path and cycle problems

• Minimum spanning tree

• Route inspection problem (also called the “Chinese Postman Problem”)

• Seven Bridges of Königsberg

• Shortest path problem

• Steiner tree

• Three-cottage problem

• Traveling salesman problem (NP-hard)

5.6.6

Network ﬂow

There are numerous problems arising especially from applications that have to do with various notions of ﬂows in

networks, for example:

• Max ﬂow min cut theorem

5.6.7

Visibility problems

• Museum guard problem

5.6.8

Covering problems

Covering problems in graphs are speciﬁc instances of subgraph-ﬁnding problems, and they tend to be closely related

to the clique problem or the independent set problem.

• Set cover problem

• Vertex cover problem

5.7. SEE ALSO

5.6.9

61

Decomposition problems

Decomposition, deﬁned as partitioning the edge set of a graph (with as many vertices as necessary accompanying the

edges of each part of the partition), has a wide variety of question. Often, it is required to decompose a graph into

subgraphs isomorphic to a ﬁxed graph; for instance, decomposing a complete graph into Hamiltonian cycles. Other

problems specify a family of graphs into which a given graph should be decomposed, for instance, a family of cycles,

or decomposing a complete graph Kn into n − 1 speciﬁed trees having, respectively, 1, 2, 3, ..., n − 1 edges.

Some speciﬁc decomposition problems that have been studied include:

• Arboricity, a decomposition into as few forests as possible

• Cycle double cover, a decomposition into a collection of cycles covering each edge exactly twice

• Edge coloring, a decomposition into as few matchings as possible

• Graph factorization, a decomposition of a regular graph into regular subgraphs of given degrees

5.6.10

Graph classes

Many problems involve characterizing the members of various classes of graphs. Some examples of such questions

are below:

• Enumerating the members of a class

• Characterizing a class in terms of forbidden substructures

• Ascertaining relationships among classes (e.g., does one property of graphs imply another)

• Finding eﬃcient algorithms to decide membership in a class

• Finding representations for members of a class.

5.7 See also

• Gallery of named graphs

• Glossary of graph theory

• List of graph theory topics

• Publications in graph theory

5.7.1

Related topics

• Algebraic graph theory

• Citation graph

• Conceptual graph

• Data structure

• Disjoint-set data structure

• Dual-phase evolution

• Entitative graph

• Existential graph

• Graph algebras

62

CHAPTER 5. GRAPH THEORY

• Graph automorphism

• Graph coloring

• Graph database

• Graph data structure

• Graph drawing

• Graph equation

• Graph rewriting

• Graph sandwich problem

• Graph property

• Intersection graph

• Logical graph

• Loop

• Network theory

• Null graph

• Pebble motion problems

• Percolation

• Perfect graph

• Quantum graph

• Random regular graphs

• Semantic networks

• Spectral graph theory

• Strongly regular graphs

• Symmetric graphs

• Transitive reduction

• Tree data structure

5.7.2

Algorithms

• Bellman–Ford algorithm

• Dijkstra’s algorithm

• Ford–Fulkerson algorithm

• Kruskal’s algorithm

• Nearest neighbour algorithm

• Prim’s algorithm

• Depth-ﬁrst search

• Breadth-ﬁrst search

5.7. SEE ALSO

5.7.3

Subareas

• Algebraic graph theory

• Geometric graph theory

• Extremal graph theory

• Probabilistic graph theory

• Topological graph theory

5.7.4

Related areas of mathematics

• Combinatorics

• Group theory

• Knot theory

• Ramsey theory

5.7.5

Generalizations

• Hypergraph

• Abstract simplicial complex

5.7.6

Prominent graph theorists

• Alon, Noga

• Berge, Claude

• Bollobás, Béla

• Bondy, Adrian John

• Brightwell, Graham

• Chudnovsky, Maria

• Chung, Fan

• Dirac, Gabriel Andrew

• Erdős, Paul

• Euler, Leonhard

• Faudree, Ralph

• Golumbic, Martin

• Graham, Ronald

• Harary, Frank

• Heawood, Percy John

• Kotzig, Anton

• Kőnig, Dénes

• Lovász, László

63

64

CHAPTER 5. GRAPH THEORY

• Murty, U. S. R.

• Nešetřil, Jaroslav

• Rényi, Alfréd

• Ringel, Gerhard

• Robertson, Neil

• Seymour, Paul

• Szemerédi, Endre

• Thomas, Robin

• Thomassen, Carsten

• Turán, Pál

• Tutte, W. T.

• Whitney, Hassler

5.8 Notes

[1] See, for instance, Iyanaga and Kawada, 69 J, p. 234 or Biggs, p. 4.

[2] See, for instance, Graham et al., p. 5.

[3] Hale, Scott A. (2013). “Multilinguals and Wikipedia Editing”. arXiv:1312.0976 [cs.CY].

[4] Mashaghi, A. et al. (2004). “Investigation of a protein complex network”. European Physical Journal B 41 (1): 113–121.

doi:10.1140/epjb/e2004-00301-0.

[5] Rosen, Kenneth H. Discrete mathematics and its applications (7th ed.). New York: McGraw-Hill. ISBN 978-0-07-3383095.

[6] Biggs, N.; Lloyd, E. and Wilson, R. (1986), Graph Theory, 1736-1936, Oxford University Press

[7] Cauchy, A.L. (1813), “Recherche sur les polyèdres - premier mémoire”, Journal de l'École Polytechnique, 9 (Cahier 16):

66–86.

[8] L'Huillier, S.-A.-J. (1861), “Mémoire sur la polyèdrométrie”, Annales de Mathématiques 3: 169–189.

[9] Cayley, A. (1857), “On the theory of the analytical forms called trees”, Philosophical Magazine, Series IV 13 (85): 172–

176, doi:10.1017/CBO9780511703690.046.

[10] Cayley, A. (1875), “Ueber die Analytischen Figuren, welche in der Mathematik Bäume genannt werden und ihre Anwendung auf die Theorie chemischer Verbindungen”, Berichte der deutschen Chemischen Gesellschaft 8 (2): 1056–1059,

doi:10.1002/cber.18750080252.

[11] Joseph Sylvester, John (1878). “Chemistry and Algebra”. Nature 17: 284. doi:10.1038/017284a0.

[12] Tutte, W.T. (2001), Graph Theory, Cambridge University Press, p. 30, ISBN 978-0-521-79489-3.

[13] Gardner, Martin (1992), Fractal Music, Hypercards, and more...Mathematical Recreations from Scientiﬁc American, W. H.

Freeman and Company, p. 203.

[14] Society for Industrial and Applied Mathematics (2002), “The George Polya Prize”, Looking Back, Looking Ahead: A SIAM

History (PDF), p. 26.

[15] Heinrich Heesch: Untersuchungen zum Vierfarbenproblem. Mannheim: Bibliographisches Institut 1969.

[16] Appel, K. and Haken, W. (1977), “Every planar map is four colorable. Part I. Discharging”, Illinois J. Math. 21: 429–490.

[17] Appel, K. and Haken, W. (1977), “Every planar map is four colorable. Part II. Reducibility”, Illinois J. Math. 21: 491–567.

[18] Robertson, N.; Sanders, D.; Seymour, P. and Thomas, R. (1997), “The four color theorem”, Journal of Combinatorial

Theory Series B 70: 2–44, doi:10.1006/jctb.1997.1750.

5.9. REFERENCES

65

5.9 References

• Berge, Claude (1958), Théorie des graphes et ses applications, Collection Universitaire de Mathématiques II,

Paris: Dunod. English edition, Wiley 1961; Methuen & Co, New York 1962; Russian, Moscow 1961; Spanish,

Mexico 1962; Roumanian, Bucharest 1969; Chinese, Shanghai 1963; Second printing of the 1962 ﬁrst English

edition, Dover, New York 2001.

• Biggs, N.; Lloyd, E.; Wilson, R. (1986), Graph Theory, 1736–1936, Oxford University Press.

• Bondy, J.A.; Murty, U.S.R. (2008), Graph Theory, Springer, ISBN 978-1-84628-969-9.

• Bondy, Riordan, O.M (2003), Mathematical results on scale-free random graphs in “Handbook of Graphs and

Networks” (S. Bornholdt and H.G. Schuster (eds)), Wiley VCH, Weinheim, 1st ed..

• Chartrand, Gary (1985), Introductory Graph Theory, Dover, ISBN 0-486-24775-9.

• Gibbons, Alan (1985), Algorithmic Graph Theory, Cambridge University Press.

• Reuven Cohen, Shlomo Havlin (2010), Complex Networks: Structure, Robustness and Function, Cambridge

University Press

• Golumbic, Martin (1980), Algorithmic Graph Theory and Perfect Graphs, Academic Press.

• Harary, Frank (1969), Graph Theory, Reading, MA: Addison-Wesley.

• Harary, Frank; Palmer, Edgar M. (1973), Graphical Enumeration, New York, NY: Academic Press.

• Mahadev, N.V.R.; Peled, Uri N. (1995), Threshold Graphs and Related Topics, North-Holland.

• Mark Newman (2010), Networks: An Introduction, Oxford University Press.

5.10 External links

• Graph theory with examples

• Hazewinkel, Michiel, ed. (2001), “Graph theory”, Encyclopedia of Mathematics, Springer, ISBN 978-1-55608010-4

• Graph theory tutorial

• A searchable database of small connected graphs

• Image gallery: graphs at the Wayback Machine (archived February 6, 2006)

• Concise, annotated list of graph theory resources for researchers

• rocs — a graph theory IDE

• The Social Life of Routers — non-technical paper discussing graphs of people and computers

• Graph Theory Software — tools to teach and learn graph theory

• Online books, and library resources in your library and in other libraries about graph theory

5.10.1

Online textbooks

• Phase Transitions in Combinatorial Optimization Problems, Section 3: Introduction to Graphs (2006) by Hartmann and Weigt

• Digraphs: Theory Algorithms and Applications 2007 by Jorgen Bang-Jensen and Gregory Gutin

• Graph Theory, by Reinhard Diestel

Chapter 6

Loop (graph theory)

6

3

2

4

5

1

A graph with a loop on vertex 1

66

6.1. DEGREE

67

In graph theory, a loop (also called a self-loop or a “buckle”) is an edge that connects a vertex to itself. A simple

graph contains no loops.

Depending on the context, a graph or a multigraph may be deﬁned so as to either allow or disallow the presence of

loops (often in concert with allowing or disallowing multiple edges between the same vertices):

• Where graphs are deﬁned so as to allow loops and multiple edges, a graph without loops or multiple edges is

often distinguished from other graphs by calling it a “simple graph”.

• Where graphs are deﬁned so as to disallow loops and multiple edges, a graph that does have loops or multiple edges is often distinguished from the graphs that satisfy these constraints by calling it a “multigraph” or

"pseudograph".

6.1 Degree

For an undirected graph, the degree of a vertex is equal to the number of adjacent vertices.

A special case is a loop, which adds two to the degree. This can be understood by letting each connection of the loop

edge count as its own adjacent vertex. In other words, a vertex with a loop “sees” itself as an adjacent vertex from

both ends of the edge thus adding two, not one, to the degree.

For a directed graph, a loop adds one to the in degree and one to the out degree

6.2 Notes

6.3 References

• Balakrishnan, V. K.; Graph Theory, McGraw-Hill; 1 edition (February 1, 1997). ISBN 0-07-005489-4.

• Bollobás, Béla; Modern Graph Theory, Springer; 1st edition (August 12, 2002). ISBN 0-387-98488-7.

• Diestel, Reinhard; Graph Theory, Springer; 2nd edition (February 18, 2000). ISBN 0-387-98976-5.

• Gross, Jonathon L, and Yellen, Jay; Graph Theory and Its Applications, CRC Press (December 30, 1998).

ISBN 0-8493-3982-0.

• Gross, Jonathon L, and Yellen, Jay; (eds); Handbook of Graph Theory. CRC (December 29, 2003). ISBN

1-58488-090-2.

• Zwillinger, Daniel; CRC Standard Mathematical Tables and Formulae, Chapman & Hall/CRC; 31st edition

(November 27, 2002). ISBN 1-58488-291-3.

6.4 External links

• Black, Paul E. “Self loop”. Dictionary of Algorithms and Data Structures. NIST.

6.5 See also

Loops in Graph Theory

• Cycle (graph theory)

• Graph theory

• Glossary of graph theory

68

CHAPTER 6. LOOP (GRAPH THEORY)

Loops in Topology

• Möbius ladder

• Möbius strip

• Strange loop

• Klein bottle

Chapter 7

Mathematics

This article is about the study of topics such as quantity and structure. For other uses, see Mathematics (disambiguation).

“Math” redirects here. For other uses, see Math (disambiguation).

Euclid (holding calipers), Greek mathematician, 3rd century BC, as imagined by Raphael in this detail from The School of Athens.[1]

Mathematics (from Greek μάθημα máthēma, “knowledge, study, learning”) is the study of topics such as quantity

(numbers),[2] structure,[3] space,[2] and change.[4][5][6] There is a range of views among mathematicians and philosophers as to the exact scope and deﬁnition of mathematics.[7][8]

Mathematicians seek out patterns[9][10] and use them to formulate new conjectures. Mathematicians resolve the truth

69

70

CHAPTER 7. MATHEMATICS

or falsity of conjectures by mathematical proof. When mathematical structures are good models of real phenomena,

then mathematical reasoning can provide insight or predictions about nature. Through the use of abstraction and

logic, mathematics developed from counting, calculation, measurement, and the systematic study of the shapes and

motions of physical objects. Practical mathematics has been a human activity for as far back as written records exist.

The research required to solve mathematical problems can take years or even centuries of sustained inquiry.

Rigorous arguments ﬁrst appeared in Greek mathematics, most notably in Euclid's Elements. Since the pioneering

work of Giuseppe Peano (1858–1932), David Hilbert (1862–1943), and others on axiomatic systems in the late 19th

century, it has become customary to view mathematical research as establishing truth by rigorous deduction from

appropriately chosen axioms and deﬁnitions. Mathematics developed at a relatively slow pace until the Renaissance,

when mathematical innovations interacting with new scientiﬁc discoveries led to a rapid increase in the rate of mathematical discovery that has continued to the present day.[11]

Galileo Galilei (1564–1642) said, “The universe cannot be read until we have learned the language and become

familiar with the characters in which it is written. It is written in mathematical language, and the letters are triangles, circles and other geometrical ﬁgures, without which means it is humanly impossible to comprehend a single

word. Without these, one is wandering about in a dark labyrinth.”[12] Carl Friedrich Gauss (1777–1855) referred to

mathematics as “the Queen of the Sciences”.[13] Benjamin Peirce (1809–1880) called mathematics “the science that

draws necessary conclusions”.[14] David Hilbert said of mathematics: “We are not speaking here of arbitrariness in

any sense. Mathematics is not like a game whose tasks are determined by arbitrarily stipulated rules. Rather, it is a

conceptual system possessing internal necessity that can only be so and by no means otherwise.”[15] Albert Einstein

(1879–1955) stated that “as far as the laws of mathematics refer to reality, they are not certain; and as far as they

are certain, they do not refer to reality.”[16] French mathematician Claire Voisin states “There is creative drive in

mathematics, it’s all about movement trying to express itself.” [17]

Mathematics is used throughout the world as an essential tool in many ﬁelds, including natural science, engineering,

medicine, ﬁnance and the social sciences. Applied mathematics, the branch of mathematics concerned with application of mathematical knowledge to other ﬁelds, inspires and makes use of new mathematical discoveries, which has

led to the development of entirely new mathematical disciplines, such as statistics and game theory. Mathematicians

also engage in pure mathematics, or mathematics for its own sake, without having any application in mind. There is

no clear line separating pure and applied mathematics, and practical applications for what began as pure mathematics

are often discovered.[18]

7.1 History

7.1.1

Evolution

Main article: History of mathematics

The evolution of mathematics can be seen as an ever-increasing series of abstractions. The ﬁrst abstraction, which

is shared by many animals,[19] was probably that of numbers: the realization that a collection of two apples and a

collection of two oranges (for example) have something in common, namely quantity of their members.

As evidenced by tallies found on bone, in addition to recognizing how to count physical objects, prehistoric peoples

may have also recognized how to count abstract quantities, like time – days, seasons, years.[20]

More complex mathematics did not appear until around 3000 BC, when the Babylonians and Egyptians began using

arithmetic, algebra and geometry for taxation and other ﬁnancial calculations, for building and construction, and for

astronomy.[21] The earliest uses of mathematics were in trading, land measurement, painting and weaving patterns

and the recording of time.

In Babylonian mathematics elementary arithmetic (addition, subtraction, multiplication and division) ﬁrst appears in

the archaeological record. Numeracy pre-dated writing and numeral systems have been many and diverse, with the

ﬁrst known written numerals created by Egyptians in Middle Kingdom texts such as the Rhind Mathematical Papyrus.

Between 600 and 300 BC the Ancient Greeks began a systematic study of mathematics in its own right with Greek

mathematics.[22]

Mathematics has since been greatly extended, and there has been a fruitful interaction between mathematics and

science, to the beneﬁt of both. Mathematical discoveries continue to be made today. According to Mikhail B.

Sevryuk, in the January 2006 issue of the Bulletin of the American Mathematical Society, “The number of papers and

7.1. HISTORY

71

Greek mathematician Pythagoras (c. 570 – c. 495 BC), commonly credited with discovering the Pythagorean theorem

books included in the Mathematical Reviews database since 1940 (the ﬁrst year of operation of MR) is now more

than 1.9 million, and more than 75 thousand items are added to the database each year. The overwhelming majority

of works in this ocean contain new mathematical theorems and their proofs.”[23]

72

CHAPTER 7. MATHEMATICS

0

1

2

3

4

5

6

7

8

9

10

11 12 13 14

15 16 17 18 19

Mayan numerals

7.1.2

Etymology

The word mathematics comes from the Greek μάθημα (máthēma), which, in the ancient Greek language, means “that

which is learnt”,[24] “what one gets to know”, hence also “study” and “science”, and in modern Greek just “lesson”. The

word máthēma is derived from μανθάνω (manthano), while the modern Greek equivalent is μαθαίνω (mathaino),

both of which mean “to learn”. In Greece, the word for “mathematics” came to have the narrower and more technical

meaning “mathematical study” even in Classical times.[25] Its adjective is μαθηματικός (mathēmatikós), meaning

“related to learning” or “studious”, which likewise further came to mean “mathematical”. In particular, μαθηματικὴ

τέχνη (mathēmatikḗ tékhnē), Latin: ars mathematica, meant “the mathematical art”.

In Latin, and in English until around 1700, the term mathematics more commonly meant “astrology” (or sometimes

“astronomy”) rather than “mathematics"; the meaning gradually changed to its present one from about 1500 to 1800.

This has resulted in several mistranslations: a particularly notorious one is Saint Augustine's warning that Christians should beware of mathematici meaning astrologers, which is sometimes mistranslated as a condemnation of

7.2. DEFINITIONS OF MATHEMATICS

73

mathematicians.[26]

The apparent plural form in English, like the French plural form les mathématiques (and the less commonly used

singular derivative la mathématique), goes back to the Latin neuter plural mathematica (Cicero), based on the Greek

plural τα μαθηματικά (ta mathēmatiká), used by Aristotle (384–322 BC), and meaning roughly “all things mathematical"; although it is plausible that English borrowed only the adjective mathematic(al) and formed the noun

mathematics anew, after the pattern of physics and metaphysics, which were inherited from the Greek.[27] In English,

the noun mathematics takes singular verb forms. It is often shortened to maths or, in English-speaking North America,

math.[28]

7.2 Deﬁnitions of mathematics

Main article: Deﬁnitions of mathematics

Aristotle deﬁned mathematics as “the science of quantity”, and this deﬁnition prevailed until the 18th century.[29]

Starting in the 19th century, when the study of mathematics increased in rigor and began to address abstract topics

such as group theory and projective geometry, which have no clear-cut relation to quantity and measurement, mathematicians and philosophers began to propose a variety of new deﬁnitions.[30] Some of these deﬁnitions emphasize the

deductive character of much of mathematics, some emphasize its abstractness, some emphasize certain topics within

mathematics. Today, no consensus on the deﬁnition of mathematics prevails, even among professionals.[7] There

is not even consensus on whether mathematics is an art or a science.[8] A great many professional mathematicians

take no interest in a deﬁnition of mathematics, or consider it undeﬁnable.[7] Some just say, “Mathematics is what

mathematicians do.”[7]

Three leading types of deﬁnition of mathematics are called logicist, intuitionist, and formalist, each reﬂecting a

diﬀerent philosophical school of thought.[31] All have severe problems, none has widespread acceptance, and no

reconciliation seems possible.[31]

An early deﬁnition of mathematics in terms of logic was Benjamin Peirce's “the science that draws necessary conclusions” (1870).[32] In the Principia Mathematica, Bertrand Russell and Alfred North Whitehead advanced the philosophical program known as logicism, and attempted to prove that all mathematical concepts, statements, and principles can be deﬁned and proven entirely in terms of symbolic logic. A logicist deﬁnition of mathematics is Russell’s

“All Mathematics is Symbolic Logic” (1903).[33]

Intuitionist deﬁnitions, developing from the philosophy of mathematician L.E.J. Brouwer, identify mathematics with

certain mental phenomena. An example of an intuitionist deﬁnition is “Mathematics is the mental activity which

consists in carrying out constructs one after the other.”[31] A peculiarity of intuitionism is that it rejects some mathematical ideas considered valid according to other deﬁnitions. In particular, while other philosophies of mathematics

allow objects that can be proven to exist even though they cannot be constructed, intuitionism allows only mathematical objects that one can actually construct.

Formalist deﬁnitions identify mathematics with its symbols and the rules for operating on them. Haskell Curry deﬁned

mathematics simply as “the science of formal systems”.[34] A formal system is a set of symbols, or tokens, and some

rules telling how the tokens may be combined into formulas. In formal systems, the word axiom has a special meaning,

diﬀerent from the ordinary meaning of “a self-evident truth”. In formal systems, an axiom is a combination of tokens

that is included in a given formal system without needing to be derived using the rules of the system.

7.2.1

Mathematics as science

Gauss referred to mathematics as “the Queen of the Sciences”.[13] In the original Latin Regina Scientiarum, as well

as in German Königin der Wissenschaften, the word corresponding to science means a “ﬁeld of knowledge”, and

this was the original meaning of “science” in English, also; mathematics is in this sense a ﬁeld of knowledge. The

specialization restricting the meaning of “science” to natural science follows the rise of Baconian science, which

contrasted “natural science” to scholasticism, the Aristotelean method of inquiring from ﬁrst principles. The role

of empirical experimentation and observation is negligible in mathematics, compared to natural sciences such as

psychology, biology, or physics. Albert Einstein stated that “as far as the laws of mathematics refer to reality, they

are not certain; and as far as they are certain, they do not refer to reality.”[16] More recently, Marcus du Sautoy has

called mathematics “the Queen of Science ... the main driving force behind scientiﬁc discovery”.[35]

Many philosophers believe that mathematics is not experimentally falsiﬁable, and thus not a science according to the

74

CHAPTER 7. MATHEMATICS

Leonardo Fibonacci, the Italian mathematician who established the Hindu–Arabic numeral system to the Western World

deﬁnition of Karl Popper.[36] However, in the 1930s Gödel’s incompleteness theorems convinced many mathematicians that mathematics cannot be reduced to logic alone, and Karl Popper concluded that “most mathematical theories

are, like those of physics and biology, hypothetico-deductive: pure mathematics therefore turns out to be much closer

to the natural sciences whose hypotheses are conjectures, than it seemed even recently.”[37] Other thinkers, notably

Imre Lakatos, have applied a version of falsiﬁcationism to mathematics itself.

An alternative view is that certain scientiﬁc ﬁelds (such as theoretical physics) are mathematics with axioms that are

7.2. DEFINITIONS OF MATHEMATICS

75

Carl Friedrich Gauss, known as the prince of mathematicians

intended to correspond to reality. The theoretical physicist J.M. Ziman proposed that science is public knowledge,

and thus includes mathematics.[38] Mathematics shares much in common with many ﬁelds in the physical sciences,

notably the exploration of the logical consequences of assumptions. Intuition and experimentation also play a role in

the formulation of conjectures in both mathematics and the (other) sciences. Experimental mathematics continues to

grow in importance within mathematics, and computation and simulation are playing an increasing role in both the

sciences and mathematics.

The opinions of mathematicians on this matter are varied. Many mathematicians feel that to call their area a science

is to downplay the importance of its aesthetic side, and its history in the traditional seven liberal arts; others feel that

76

CHAPTER 7. MATHEMATICS

to ignore its connection to the sciences is to turn a blind eye to the fact that the interface between mathematics and

its applications in science and engineering has driven much development in mathematics. One way this diﬀerence of

viewpoint plays out is in the philosophical debate as to whether mathematics is created (as in art) or discovered (as

in science). It is common to see universities divided into sections that include a division of Science and Mathematics,

indicating that the ﬁelds are seen as being allied but that they do not coincide. In practice, mathematicians are typically

grouped with scientists at the gross level but separated at ﬁner levels. This is one of many issues considered in the

philosophy of mathematics.

7.3 Inspiration, pure and applied mathematics, and aesthetics

Main article: Mathematical beauty

Isaac Newton (left) and Gottfried Wilhelm Leibniz (right), developers of inﬁnitesimal calculus

Mathematics arises from many diﬀerent kinds of problems. At ﬁrst these were found in commerce, land measurement,

architecture and later astronomy; today, all sciences suggest problems studied by mathematicians, and many problems

arise within mathematics itself. For example, the physicist Richard Feynman invented the path integral formulation

of quantum mechanics using a combination of mathematical reasoning and physical insight, and today’s string theory,

a still-developing scientiﬁc theory which attempts to unify the four fundamental forces of nature, continues to inspire

new mathematics.[39]

Some mathematics is relevant only in the area that inspired it, and is applied to solve further problems in that area.

But often mathematics inspired by one area proves useful in many areas, and joins the general stock of mathematical

concepts. A distinction is often made between pure mathematics and applied mathematics. However pure mathematics topics often turn out to have applications, e.g. number theory in cryptography. This remarkable fact, that even the

“purest” mathematics often turns out to have practical applications, is what Eugene Wigner has called "the unreason-

7.4. NOTATION, LANGUAGE, AND RIGOR

77

able eﬀectiveness of mathematics".[40] As in most areas of study, the explosion of knowledge in the scientiﬁc age has

led to specialization: there are now hundreds of specialized areas in mathematics and the latest Mathematics Subject

Classiﬁcation runs to 46 pages.[41] Several areas of applied mathematics have merged with related traditions outside

of mathematics and become disciplines in their own right, including statistics, operations research, and computer

science.

For those who are mathematically inclined, there is often a deﬁnite aesthetic aspect to much of mathematics. Many

mathematicians talk about the elegance of mathematics, its intrinsic aesthetics and inner beauty. Simplicity and

generality are valued. There is beauty in a simple and elegant proof, such as Euclid's proof that there are inﬁnitely

many prime numbers, and in an elegant numerical method that speeds calculation, such as the fast Fourier transform.

G.H. Hardy in A Mathematician’s Apology expressed the belief that these aesthetic considerations are, in themselves,

suﬃcient to justify the study of pure mathematics. He identiﬁed criteria such as signiﬁcance, unexpectedness, inevitability, and economy as factors that contribute to a mathematical aesthetic.[42] Mathematicians often strive to ﬁnd

proofs that are particularly elegant, proofs from “The Book” of God according to Paul Erdős.[43][44] The popularity

of recreational mathematics is another sign of the pleasure many ﬁnd in solving mathematical questions.

7.4 Notation, language, and rigor

Main article: Mathematical notation

Most of the mathematical notation in use today was not invented until the 16th century.[45] Before that, mathematics was written out in words, a painstaking process that limited mathematical discovery.[46] Euler (1707–1783)

was responsible for many of the notations in use today. Modern notation makes mathematics much easier for the

professional, but beginners often ﬁnd it daunting. It is extremely compressed: a few symbols contain a great deal

of information. Like musical notation, modern mathematical notation has a strict syntax (which to a limited extent

varies from author to author and from discipline to discipline) and encodes information that would be diﬃcult to

write in any other way.

Mathematical language can be diﬃcult to understand for beginners. Words such as or and only have more precise

meanings than in everyday speech. Moreover, words such as open and ﬁeld have been given specialized mathematical

meanings. Technical terms such as homeomorphism and integrable have precise meanings in mathematics. Additionally, shorthand phrases such as iﬀ for "if and only if" belong to mathematical jargon. There is a reason for special

notation and technical vocabulary: mathematics requires more precision than everyday speech. Mathematicians refer

to this precision of language and logic as “rigor”.

Mathematical proof is fundamentally a matter of rigor. Mathematicians want their theorems to follow from axioms

by means of systematic reasoning. This is to avoid mistaken "theorems", based on fallible intuitions, of which many

instances have occurred in the history of the subject.[47] The level of rigor expected in mathematics has varied over

time: the Greeks expected detailed arguments, but at the time of Isaac Newton the methods employed were less

rigorous. Problems inherent in the deﬁnitions used by Newton would lead to a resurgence of careful analysis and

formal proof in the 19th century. Misunderstanding the rigor is a cause for some of the common misconceptions

of mathematics. Today, mathematicians continue to argue among themselves about computer-assisted proofs. Since

large computations are hard to verify, such proofs may not be suﬃciently rigorous.[48]

Axioms in traditional thought were “self-evident truths”, but that conception is problematic.[49] At a formal level,

an axiom is just a string of symbols, which has an intrinsic meaning only in the context of all derivable formulas

of an axiomatic system. It was the goal of Hilbert’s program to put all of mathematics on a ﬁrm axiomatic basis,

but according to Gödel’s incompleteness theorem every (suﬃciently powerful) axiomatic system has undecidable

formulas; and so a ﬁnal axiomatization of mathematics is impossible. Nonetheless mathematics is often imagined to

be (as far as its formal content) nothing but set theory in some axiomatization, in the sense that every mathematical

statement or proof could be cast into formulas within set theory.[50]

7.5 Fields of mathematics

See also: Areas of mathematics and Glossary of areas of mathematics

Mathematics can, broadly speaking, be subdivided into the study of quantity, structure, space, and change (i.e.

arithmetic, algebra, geometry, and analysis). In addition to these main concerns, there are also subdivisions dedicated

to exploring links from the heart of mathematics to other ﬁelds: to logic, to set theory (foundations), to the empirical

mathematics of the various sciences (applied mathematics), and more recently to the rigorous study of uncertainty.

78

CHAPTER 7. MATHEMATICS

Leonhard Euler, who created and popularized much of the mathematical notation used today

7.5.1

Foundations and philosophy

In order to clarify the foundations of mathematics, the ﬁelds of mathematical logic and set theory were developed.

Mathematical logic includes the mathematical study of logic and the applications of formal logic to other areas of

mathematics; set theory is the branch of mathematics that studies sets or collections of objects. Category theory,

which deals in an abstract way with mathematical structures and relationships between them, is still in development.

The phrase “crisis of foundations” describes the search for a rigorous foundation for mathematics that took place from

approximately 1900 to 1930.[51] Some disagreement about the foundations of mathematics continues to the present

day. The crisis of foundations was stimulated by a number of controversies at the time, including the controversy

over Cantor’s set theory and the Brouwer–Hilbert controversy.

7.5. FIELDS OF MATHEMATICS

79

An abacus, a simple calculating tool used since ancient times

Mathematical logic is concerned with setting mathematics within a rigorous axiomatic framework, and studying the

implications of such a framework. As such, it is home to Gödel’s incompleteness theorems which (informally) imply

that any eﬀective formal system that contains basic arithmetic, if sound (meaning that all theorems that can be proven

are true), is necessarily incomplete (meaning that there are true theorems which cannot be proved in that system).

Whatever ﬁnite collection of number-theoretical axioms is taken as a foundation, Gödel showed how to construct a

formal statement that is a true number-theoretical fact, but which does not follow from those axioms. Therefore, no

formal system is a complete axiomatization of full number theory. Modern logic is divided into recursion theory,

model theory, and proof theory, and is closely linked to theoretical computer science, as well as to category theory.

Theoretical computer science includes computability theory, computational complexity theory, and information theory. Computability theory examines the limitations of various theoretical models of the computer, including the most

well-known model – the Turing machine. Complexity theory is the study of tractability by computer; some problems,

although theoretically solvable by computer, are so expensive in terms of time or space that solving them is likely to

remain practically unfeasible, even with the rapid advancement of computer hardware. A famous problem is the "P =

NP?" problem, one of the Millennium Prize Problems.[52] Finally, information theory is concerned with the amount

of data that can be stored on a given medium, and hence deals with concepts such as compression and entropy.

7.5.2

Pure mathematics

Quantity

The study of quantity starts with numbers, ﬁrst the familiar natural numbers and integers (“whole numbers”) and

arithmetical operations on them, which are characterized in arithmetic. The deeper properties of integers are studied

in number theory, from which come such popular results as Fermat’s Last Theorem. The twin prime conjecture and

Goldbach’s conjecture are two unsolved problems in number theory.

As the number system is further developed, the integers are recognized as a subset of the rational numbers ("fractions").

These, in turn, are contained within the real numbers, which are used to represent continuous quantities. Real numbers are generalized to complex numbers. These are the ﬁrst steps of a hierarchy of numbers that goes on to include

quaternions and octonions. Consideration of the natural numbers also leads to the transﬁnite numbers, which formalize the concept of "inﬁnity". Another area of study is size, which leads to the cardinal numbers and then to another

conception of inﬁnity: the aleph numbers, which allow meaningful comparison of the size of inﬁnitely large sets.

80

CHAPTER 7. MATHEMATICS

Structure

Many mathematical objects, such as sets of numbers and functions, exhibit internal structure as a consequence of

operations or relations that are deﬁned on the set. Mathematics then studies properties of those sets that can be

expressed in terms of that structure; for instance number theory studies properties of the set of integers that can be

expressed in terms of arithmetic operations. Moreover, it frequently happens that diﬀerent such structured sets (or

structures) exhibit similar properties, which makes it possible, by a further step of abstraction, to state axioms for a

class of structures, and then study at once the whole class of structures satisfying these axioms. Thus one can study

groups, rings, ﬁelds and other abstract systems; together such studies (for structures deﬁned by algebraic operations)

constitute the domain of abstract algebra.

By its great generality, abstract algebra can often be applied to seemingly unrelated problems; for instance a number of ancient problems concerning compass and straightedge constructions were ﬁnally solved using Galois theory,

which involves ﬁeld theory and group theory. Another example of an algebraic theory is linear algebra, which is

the general study of vector spaces, whose elements called vectors have both quantity and direction, and can be used

to model (relations between) points in space. This is one example of the phenomenon that the originally unrelated

areas of geometry and algebra have very strong interactions in modern mathematics. Combinatorics studies ways of

enumerating the number of objects that ﬁt a given structure.

Space

The study of space originates with geometry – in particular, Euclidean geometry. Trigonometry is the branch of

mathematics that deals with relationships between the sides and the angles of triangles and with the trigonometric

functions; it combines space and numbers, and encompasses the well-known Pythagorean theorem. The modern

study of space generalizes these ideas to include higher-dimensional geometry, non-Euclidean geometries (which

play a central role in general relativity) and topology. Quantity and space both play a role in analytic geometry,

diﬀerential geometry, and algebraic geometry. Convex and discrete geometry were developed to solve problems in

number theory and functional analysis but now are pursued with an eye on applications in optimization and computer

science. Within diﬀerential geometry are the concepts of ﬁber bundles and calculus on manifolds, in particular,

vector and tensor calculus. Within algebraic geometry is the description of geometric objects as solution sets of

polynomial equations, combining the concepts of quantity and space, and also the study of topological groups, which

combine structure and space. Lie groups are used to study space, structure, and change. Topology in all its many

ramiﬁcations may have been the greatest growth area in 20th-century mathematics; it includes point-set topology,

set-theoretic topology, algebraic topology and diﬀerential topology. In particular, instances of modern day topology

are metrizability theory, axiomatic set theory, homotopy theory, and Morse theory. Topology also includes the now

solved Poincaré conjecture, and the still unsolved areas of the Hodge conjecture. Other results in geometry and

topology, including the four color theorem and Kepler conjecture, have been proved only with the help of computers.

Change

Understanding and describing change is a common theme in the natural sciences, and calculus was developed as a

powerful tool to investigate it. Functions arise here, as a central concept describing a changing quantity. The rigorous

study of real numbers and functions of a real variable is known as real analysis, with complex analysis the equivalent

ﬁeld for the complex numbers. Functional analysis focuses attention on (typically inﬁnite-dimensional) spaces of

functions. One of many applications of functional analysis is quantum mechanics. Many problems lead naturally

to relationships between a quantity and its rate of change, and these are studied as diﬀerential equations. Many

phenomena in nature can be described by dynamical systems; chaos theory makes precise the ways in which many of

these systems exhibit unpredictable yet still deterministic behavior.

7.6. MATHEMATICAL AWARDS

7.5.3

81

Applied mathematics

Applied mathematics concerns itself with mathematical methods that are typically used in science, engineering, business, and industry. Thus, “applied mathematics” is a mathematical science with specialized knowledge. The term

applied mathematics also describes the professional specialty in which mathematicians work on practical problems;

as a profession focused on practical problems, applied mathematics focuses on the “formulation, study, and use of

mathematical models” in science, engineering, and other areas of mathematical practice.

In the past, practical applications have motivated the development of mathematical theories, which then became the

subject of study in pure mathematics, where mathematics is developed primarily for its own sake. Thus, the activity

of applied mathematics is vitally connected with research in pure mathematics.

Statistics and other decision sciences

Applied mathematics has signiﬁcant overlap with the discipline of statistics, whose theory is formulated mathematically, especially with probability theory. Statisticians (working as part of a research project) “create data that makes

sense” with random sampling and with randomized experiments;[53] the design of a statistical sample or experiment

speciﬁes the analysis of the data (before the data be available). When reconsidering data from experiments and

samples or when analyzing data from observational studies, statisticians “make sense of the data” using the art of

modelling and the theory of inference – with model selection and estimation; the estimated models and consequential

predictions should be tested on new data.[54]

Statistical theory studies decision problems such as minimizing the risk (expected loss) of a statistical action, such as

using a procedure in, for example, parameter estimation, hypothesis testing, and selecting the best. In these traditional

areas of mathematical statistics, a statistical-decision problem is formulated by minimizing an objective function, like

expected loss or cost, under speciﬁc constraints: For example, designing a survey often involves minimizing the

cost of estimating a population mean with a given level of conﬁdence.[55] Because of its use of optimization, the

mathematical theory of statistics shares concerns with other decision sciences, such as operations research, control

theory, and mathematical economics.[56]

Computational mathematics

Computational mathematics proposes and studies methods for solving mathematical problems that are typically too

large for human numerical capacity. Numerical analysis studies methods for problems in analysis using functional

analysis and approximation theory; numerical analysis includes the study of approximation and discretization broadly

with special concern for rounding errors. Numerical analysis and, more broadly, scientiﬁc computing also study nonanalytic topics of mathematical science, especially algorithmic matrix and graph theory. Other areas of computational

mathematics include computer algebra and symbolic computation.

7.6 Mathematical awards

Arguably the most prestigious award in mathematics is the Fields Medal,[57][58] established in 1936 and now awarded

every four years. The Fields Medal is often considered a mathematical equivalent to the Nobel Prize.

The Wolf Prize in Mathematics, instituted in 1978, recognizes lifetime achievement, and another major international

award, the Abel Prize, was introduced in 2003. The Chern Medal was introduced in 2010 to recognize lifetime

achievement. These accolades are awarded in recognition of a particular body of work, which may be innovational,

or provide a solution to an outstanding problem in an established ﬁeld.

A famous list of 23 open problems, called "Hilbert’s problems", was compiled in 1900 by German mathematician

David Hilbert. This list achieved great celebrity among mathematicians, and at least nine of the problems have now

been solved. A new list of seven important problems, titled the "Millennium Prize Problems", was published in 2000.

A solution to each of these problems carries a $1 million reward, and only one (the Riemann hypothesis) is duplicated

in Hilbert’s problems.

82

CHAPTER 7. MATHEMATICS

7.7 See also

Main article: Lists of mathematics topics

• Mathematics and art

• Mathematics education

• Relationship between mathematics and physics

• STEM ﬁelds

7.8 Notes

[1] No likeness or description of Euclid’s physical appearance made during his lifetime survived antiquity. Therefore, Euclid’s

depiction in works of art depends on the artist’s imagination (see Euclid).

[2] “mathematics, n.". Oxford English Dictionary. Oxford University Press. 2012. Retrieved June 16, 2012. The science

of space, number, quantity, and arrangement, whose methods involve logical reasoning and usually the use of symbolic

notation, and which includes geometry, arithmetic, algebra, and analysis.

[3] Kneebone, G.T. (1963). Mathematical Logic and the Foundations of Mathematics: An Introductory Survey. Dover. pp. 4.

ISBN 0-486-41712-3. Mathematics ... is simply the study of abstract structures, or formal patterns of connectedness.

[4] LaTorre, Donald R., John W. Kenelly, Iris B. Reed, Laurel R. Carpenter, and Cynthia R Harris (2011). Calculus Concepts:

An Informal Approach to the Mathematics of Change. Cengage Learning. pp. 2. ISBN 1-4390-4957-2. Calculus is the

study of change—how things change, and how quickly they change.

[5] Ramana (2007). Applied Mathematics. Tata McGraw–Hill Education. p. 2.10. ISBN 0-07-066753-5. The mathematical

study of change, motion, growth or decay is calculus.

[6] Ziegler, Günter M. (2011). “What Is Mathematics?". An Invitation to Mathematics: From Competitions to Research.

Springer. pp. 7. ISBN 3-642-19532-6.

[7] Mura, Roberta (Dec 1993). “Images of Mathematics Held by University Teachers of Mathematical Sciences”. Educational

Studies in Mathematics 25 (4): 375–385.

[8] Tobies, Renate and Helmut Neunzert (2012). Iris Runge: A Life at the Crossroads of Mathematics, Science, and Industry.

Springer. pp. 9. ISBN 3-0348-0229-3. It is ﬁrst necessary to ask what is meant by mathematics in general. Illustrious

scholars have debated this matter until they were blue in the face, and yet no consensus has been reached about whether

mathematics is a natural science, a branch of the humanities, or an art form.

[9] Steen, L.A. (April 29, 1988). The Science of Patterns Science, 240: 611–616. And summarized at Association for Supervision and Curriculum Development, www.ascd.org.

[10] Devlin, Keith, Mathematics: The Science of Patterns: The Search for Order in Life, Mind and the Universe (Scientiﬁc

American Paperback Library) 1996, ISBN 978-0-7167-5047-5

[11] Eves

[12] Marcus du Sautoy, A Brief History of Mathematics: 1. Newton and Leibniz, BBC Radio 4, September 27, 2010.

[13] Waltershausen

[14] Peirce, p. 97.

[15] Hilbert, D. (1919–20), Natur und Mathematisches Erkennen: Vorlesungen, gehalten 1919–1920 in Göttingen. Nach der

Ausarbeitung von Paul Bernays (Edited and with an English introduction by David E. Rowe), Basel, Birkhäuser (1992).

[16] Einstein, p. 28. The quote is Einstein’s answer to the question: “how can it be that mathematics, being after all a product

of human thought which is independent of experience, is so admirably appropriate to the objects of reality?" He, too, is

concerned with The Unreasonable Eﬀectiveness of Mathematics in the Natural Sciences.

[17] “Claire Voisin, Artist of the Abstract”. .cnrs.fr. Retrieved October 13, 2013.

7.8. NOTES

83

[18] Peterson

[19] Dehaene, Stanislas; Dehaene-Lambertz, Ghislaine; Cohen, Laurent (Aug 1998). “Abstract representations of numbers in

the animal and human brain”. Trends in Neuroscience 21 (8): 355–361. doi:10.1016/S0166-2236(98)01263-6. PMID

9720604.

[20] See, for example, Raymond L. Wilder, Evolution of Mathematical Concepts; an Elementary Study, passim

[21] Kline 1990, Chapter 1.

[22] "A History of Greek Mathematics: From Thales to Euclid". Thomas Little Heath (1981). ISBN 0-486-24073-8

[23] Sevryuk

[24] “mathematic”. Online Etymology Dictionary.

[25] Both senses can be found in Plato. μαθηματική. Liddell, Henry George; Scott, Robert; A Greek–English Lexicon at the

Perseus Project

[26] Cipra, Barry (1982). “St. Augustine v. The Mathematicians”. osu.edu. Ohio State University Mathematics department.

Retrieved July 14, 2014.

[27] The Oxford Dictionary of English Etymology, Oxford English Dictionary, sub “mathematics”, “mathematic”, “mathematics”

[28] “maths, n." and “math, n.3". Oxford English Dictionary, on-line version (2012).

[29] James Franklin, “Aristotelian Realism” in Philosophy of Mathematics”, ed. A.D. Irvine, p. 104. Elsevier (2009).

[30] Cajori, Florian (1893). A History of Mathematics. American Mathematical Society (1991 reprint). pp. 285–6. ISBN

0-8218-2102-4.

[31] Snapper, Ernst (September 1979). “The Three Crises in Mathematics: Logicism, Intuitionism, and Formalism”. Mathematics Magazine 52 (4): 207–16. doi:10.2307/2689412. JSTOR 2689412.

[32] Peirce, Benjamin (1882). Linear Associative Algebra. p. 1.

[33] Bertrand Russell, The Principles of Mathematics, p. 5. University Press, Cambridge (1903)

[34] Curry, Haskell (1951). Outlines of a Formalist Philosophy of Mathematics. Elsevier. pp. 56. ISBN 0-444-53368-0.

[35] Marcus du Sautoy, A Brief History of Mathematics: 10. Nicolas Bourbaki, BBC Radio 4, October 1, 2010.

[36] Shasha, Dennis Elliot; Lazere, Cathy A. (1998). Out of Their Minds: The Lives and Discoveries of 15 Great Computer

Scientists. Springer. p. 228.

[37] Popper 1995, p. 56

[38] Ziman

[39] Johnson, Gerald W.; Lapidus, Michel L. (2002). The Feynman Integral and Feynman’s Operational Calculus. Oxford

University Press. ISBN 0-8218-2413-9.

[40] Wigner, Eugene (1960). “The Unreasonable Eﬀectiveness of Mathematics in the Natural Sciences”. Communications on

Pure and Applied Mathematics 13 (1): 1–14. doi:10.1002/cpa.3160130102.

[41] “Mathematics Subject Classiﬁcation 2010” (PDF). Retrieved November 9, 2010.

[42] Hardy, G.H. (1940). A Mathematician’s Apology. Cambridge University Press. ISBN 0-521-42706-1.

[43] Gold, Bonnie; Simons, Rogers A. (2008). Proof and Other Dilemmas: Mathematics and Philosophy. MAA.

[44] Aigner, Martin; Ziegler, Günter M. (2001). Proofs from The Book. Springer. ISBN 3-540-40460-0.

[45] “Earliest Uses of Various Mathematical Symbols”. Retrieved September 14, 2014.

[46] Kline, p. 140, on Diophantus; p. 261, on Vieta.

[47] See false proof for simple examples of what can go wrong in a formal proof.

[48] Ivars Peterson, The Mathematical Tourist, Freeman, 1988, ISBN 0-7167-1953-3. p. 4 “A few complain that the computer

program can't be veriﬁed properly”, (in reference to the Haken–Apple proof of the Four Color Theorem).

84

CHAPTER 7. MATHEMATICS

[49] " The method of “postulating” what we want has many advantages; they are the same as the advantages of theft over honest

toil.” Bertrand Russell (1919), Introduction to Mathematical Philosophy, New York and London, p 71.

[50] Patrick Suppes, Axiomatic Set Theory, Dover, 1972, ISBN 0-486-61630-4. p. 1, “Among the many branches of modern

mathematics set theory occupies a unique place: with a few rare exceptions the entities which are studied and analyzed in

mathematics may be regarded as certain particular sets or classes of objects.”

[51] Luke Howard Hodgkin & Luke Hodgkin, A History of Mathematics, Oxford University Press, 2005.

[52] Clay Mathematics Institute, P=NP, claymath.org

[53] Rao, C.R. (1997) Statistics and Truth: Putting Chance to Work, World Scientiﬁc. ISBN 981-02-3111-3

[54] Like other mathematical sciences such as physics and computer science, statistics is an autonomous discipline rather than

a branch of applied mathematics. Like research physicists and computer scientists, research statisticians are mathematical

scientists. Many statisticians have a degree in mathematics, and some statisticians are also mathematicians.

[55] Rao, C.R. (1981). “Foreword”. In Arthanari, T.S.; Dodge, Yadolah. Mathematical programming in statistics. Wiley Series

in Probability and Mathematical Statistics. New York: Wiley. pp. vii–viii. ISBN 0-471-08073-X. MR 607328.

[56] Whittle (1994, pp. 10–11 and 14–18): Whittle, Peter (1994). “Almost home”. In Kelly, F.P. Probability, statistics and

optimisation: A Tribute to Peter Whittle (previously “A realised path: The Cambridge Statistical Laboratory upto 1993

(revised 2002)" ed.). Chichester: John Wiley. pp. 1–28. ISBN 0-471-94829-2.

[57] "The Fields Medal is now indisputably the best known and most inﬂuential award in mathematics." Monastyrsky

[58] Riehm

7.9 References

• Courant, Richard and H. Robbins, What Is Mathematics? : An Elementary Approach to Ideas and Methods,

Oxford University Press, USA; 2 edition (July 18, 1996). ISBN 0-19-510519-2.

• Einstein, Albert (1923). Sidelights on Relativity: I. Ether and relativity. II. Geometry and experience (translated

by G.B. Jeﬀery, D.Sc., and W. Perrett, Ph.D). E.P. Dutton & Co., New York.

• du Sautoy, Marcus, A Brief History of Mathematics, BBC Radio 4 (2010).

• Eves, Howard, An Introduction to the History of Mathematics, Sixth Edition, Saunders, 1990, ISBN 0-03029558-0.

• Kline, Morris, Mathematical Thought from Ancient to Modern Times, Oxford University Press, USA; Paperback

edition (March 1, 1990). ISBN 0-19-506135-7.

• Monastyrsky, Michael (2001). “Some Trends in Modern Mathematics and the Fields Medal” (PDF). Canadian

Mathematical Society. Retrieved July 28, 2006.

• Oxford English Dictionary, second edition, ed. John Simpson and Edmund Weiner, Clarendon Press, 1989,

ISBN 0-19-861186-2.

• The Oxford Dictionary of English Etymology, 1983 reprint. ISBN 0-19-861112-9.

• Pappas, Theoni, The Joy Of Mathematics, Wide World Publishing; Revised edition (June 1989). ISBN 0933174-65-9.

• Peirce, Benjamin (1881). Peirce, Charles Sanders, ed. “Linear associative algebra”. American Journal of

Mathematics (Corrected, expanded, and annotated revision with an 1875 paper by B. Peirce and annotations by his son, C.S. Peirce, of the 1872 lithograph ed.) (Johns Hopkins University) 4 (1–4): 97–229.

doi:10.2307/2369153. JSTOR 2369153. Corrected, expanded, and annotated revision with an 1875 paper

by B. Peirce and annotations by his son, C. S. Peirce, of the 1872 lithograph ed. Google Eprint and as an

extract, D. Van Nostrand, 1882, Google Eprint..

• Peterson, Ivars, Mathematical Tourist, New and Updated Snapshots of Modern Mathematics, Owl Books, 2001,

ISBN 0-8050-7159-8.

7.10. FURTHER READING

85

• Popper, Karl R. (1995). “On knowledge”. In Search of a Better World: Lectures and Essays from Thirty Years.

Routledge. ISBN 0-415-13548-6.

• Riehm, Carl (August 2002). “The Early History of the Fields Medal” (PDF). Notices of the AMS (AMS) 49

(7): 778–782.

• Sevryuk, Mikhail B. (January 2006). “Book Reviews” (PDF). Bulletin of the American Mathematical Society

43 (1): 101–109. doi:10.1090/S0273-0979-05-01069-4. Retrieved June 24, 2006.

• Waltershausen, Wolfgang Sartorius von (1965) [ﬁrst published 1856]. Gauss zum Gedächtniss. Sändig Reprint

Verlag H. R. Wohlwend. ASIN B0000BN5SQ. ISBN 3-253-01702-8. ASIN 3253017028.

7.10 Further reading

• Benson, Donald C., The Moment of Proof: Mathematical Epiphanies, Oxford University Press, USA; New Ed

edition (December 14, 2000). ISBN 0-19-513919-4.

• Boyer, Carl B., A History of Mathematics, Wiley; 2nd edition, revised by Uta C. Merzbach, (March 6, 1991).

ISBN 0-471-54397-7.—A concise history of mathematics from the Concept of Number to contemporary

Mathematics.

• Davis, Philip J. and Hersh, Reuben, The Mathematical Experience. Mariner Books; Reprint edition (January

14, 1999). ISBN 0-395-92968-7.

• Gullberg, Jan, Mathematics – From the Birth of Numbers. W. W. Norton & Company; 1st edition (October

1997). ISBN 0-393-04002-X.

• Hazewinkel, Michiel (ed.), Encyclopaedia of Mathematics. Kluwer Academic Publishers 2000. – A translated

and expanded version of a Soviet mathematics encyclopedia, in ten (expensive) volumes, the most complete

and authoritative work available. Also in paperback and on CD-ROM, and online.

• Jourdain, Philip E. B., The Nature of Mathematics, in The World of Mathematics, James R. Newman, editor,

Dover Publications, 2003, ISBN 0-486-43268-8.

• Maier, Annaliese, At the Threshold of Exact Science: Selected Writings of Annaliese Maier on Late Medieval

Natural Philosophy, edited by Steven Sargent, Philadelphia: University of Pennsylvania Press, 1982.

7.11 External links

• Mathematics at Encyclopædia Britannica

• Mathematics on In Our Time at the BBC. (listen now)

• Free Mathematics books Free Mathematics books collection.

• Encyclopaedia of Mathematics online encyclopaedia from Springer, Graduate-level reference work with over

8,000 entries, illuminating nearly 50,000 notions in mathematics.

• HyperMath site at Georgia State University

• FreeScience Library The mathematics section of FreeScience library

• Rusin, Dave: The Mathematical Atlas. A guided tour through the various branches of modern mathematics.

(Can also be found at NIU.edu.)

• Polyanin, Andrei: EqWorld: The World of Mathematical Equations. An online resource focusing on algebraic,

ordinary diﬀerential, partial diﬀerential (mathematical physics), integral, and other mathematical equations.

• Cain, George: Online Mathematics Textbooks available free online.

• Tricki, Wiki-style site that is intended to develop into a large store of useful mathematical problem-solving

techniques.

86

CHAPTER 7. MATHEMATICS

• Mathematical Structures, list information about classes of mathematical structures.

• Mathematician Biographies. The MacTutor History of Mathematics archive Extensive history and quotes from

all famous mathematicians.

• Metamath. A site and a language, that formalize mathematics from its foundations.

• Nrich, a prize-winning site for students from age ﬁve from Cambridge University

• Open Problem Garden, a wiki of open problems in mathematics

• Planet Math. An online mathematics encyclopedia under construction, focusing on modern mathematics. Uses

the Attribution-ShareAlike license, allowing article exchange with Wikipedia. Uses TeX markup.

• Some mathematics applets, at MIT

• Weisstein, Eric et al.: MathWorld: World of Mathematics. An online encyclopedia of mathematics.

• Patrick Jones’ Video Tutorials on Mathematics

• Citizendium: Theory (mathematics).

• du Sautoy, Marcus, A Brief History of Mathematics, BBC Radio 4 (2010).

• MathOverﬂow A Q&A site for research-level mathematics

• Math – Khan Academy

• National Museum of Mathematics, located in New York City

Chapter 8

Matrix (mathematics)

For other uses, see Matrix.

“Matrix theory” redirects here. For the physics topic, see Matrix string theory.

In mathematics, a matrix (plural matrices) is a rectangular array[1] —of numbers, symbols, or expressions, arranged

a2,1

a2,2

a2,3

a3,1

a3,2

a3,3

.

.

.

.

.

.

.

.

.

.

.

.

a1,3

.

.

.

a1,2

.

.

.

i

c

h

a

n

g

e

s

a1,1

.

m

rows

j changes

n columns

..

ai,j

Each element of a matrix is often denoted by a variable with two subscripts. For instance, a2,1 represents the element at the second

row and ﬁrst column of a matrix A.

in rows and columns[2][3] —that is treated in certain prescribed ways. One such way is to state the order of the matrix.

For example, the order of the matrix below is a 2x3 matrix because there are two rows and three columns. The

individual items in a matrix are called its elements or entries.[4]

87

88

[

1 9

20 5

CHAPTER 8. MATRIX (MATHEMATICS)

]

−13

.

−6

Provided that they are the same size (have the same number of rows and the same number of columns), two matrices

can be added or subtracted element by element. The rule for matrix multiplication, however, is that two matrices

can be multiplied only when the number of columns in the ﬁrst equals the number of rows in the second. A major

application of matrices is to represent linear transformations, that is, generalizations of linear functions such as f(x) =

4x. For example, the rotation of vectors in three dimensional space is a linear transformation which can be represented

by a rotation matrix R: if v is a column vector (a matrix with only one column) describing the position of a point in

space, the product Rv is a column vector describing the position of that point after a rotation. The product of two

transformation matrices is a matrix that represents the composition of two linear transformations. Another application

of matrices is in the solution of systems of linear equations. If the matrix is square, it is possible to deduce some of its

properties by computing its determinant. For example, a square matrix has an inverse if and only if its determinant

is not zero. Insight into the geometry of a linear transformation is obtainable (along with other information) from the

matrix’s eigenvalues and eigenvectors.

Applications of matrices are found in most scientiﬁc ﬁelds. In every branch of physics, including classical mechanics,

optics, electromagnetism, quantum mechanics, and quantum electrodynamics, they are used to study physical phenomena, such as the motion of rigid bodies. In computer graphics, they are used to project a 3-dimensional image

onto a 2-dimensional screen. In probability theory and statistics, stochastic matrices are used to describe sets of

probabilities; for instance, they are used within the PageRank algorithm that ranks the pages in a Google search.[5]

Matrix calculus generalizes classical analytical notions such as derivatives and exponentials to higher dimensions.

A major branch of numerical analysis is devoted to the development of eﬃcient algorithms for matrix computations,

a subject that is centuries old and is today an expanding area of research. Matrix decomposition methods simplify

computations, both theoretically and practically. Algorithms that are tailored to particular matrix structures, such as

sparse matrices and near-diagonal matrices, expedite computations in ﬁnite element method and other computations.

Inﬁnite matrices occur in planetary theory and in atomic theory. A simple example of an inﬁnite matrix is the matrix

representing the derivative operator, which acts on the Taylor series of a function.

8.1 Deﬁnition

A matrix is a rectangular array of numbers or other mathematical objects, for which operations such as addition and

multiplication are deﬁned.[6] Most commonly, a matrix over a ﬁeld F is a rectangular array of scalars from F.[7][8]

Most of this article focuses on real and complex matrices, i.e., matrices whose elements are real numbers or complex

numbers, respectively. More general types of entries are discussed below. For instance, this is a real matrix:

−1.3 0.6

5.5 .

A = 20.4

9.7 −6.2

The numbers, symbols or expressions in the matrix are called its entries or its elements. The horizontal and vertical

lines of entries in a matrix are called rows and columns, respectively.

8.1.1

Size

The size of a matrix is deﬁned by the number of rows and columns that it contains. A matrix with m rows and n

columns is called an m × n matrix or m-by-n matrix, while m and n are called its dimensions. For example, the matrix

A above is a 3 × 2 matrix.

Matrices which have a single row are called row vectors, and those which have a single column are called column

vectors. A matrix which has the same number of rows and columns is called a square matrix. A matrix with an

inﬁnite number of rows or columns (or both) is called an inﬁnite matrix. In some contexts, such as computer algebra

programs, it is useful to consider a matrix with no rows or no columns, called an empty matrix.

8.2. NOTATION

89

8.2 Notation

Matrices are commonly written in box brackets or an alternative notation uses large parentheses instead of box brackets:

a11

a21

A= .

..

a12

a22

..

.

···

···

..

.

am1

am2

···

a1n

a11 a12

a21 a22

a2n

.. =

..

..

.

.

.

amn

am1 am2

···

···

..

.

a1n

a2n

..

.

···

amn

∈ Rm×n .

The speciﬁcs of symbolic matrix notation varies widely, with some prevailing trends. Matrices are usually symbolized

using upper-case letters (such as A in the examples above), while the corresponding lower-case letters, with two

subscript indices (e.g., a11 , or a₁,₁), represent the entries. In addition to using upper-case letters to symbolize matrices,

many authors use a special typographical style, commonly boldface upright (non-italic), to further distinguish matrices

from other mathematical objects. An alternative notation involves the use of a double-underline with the variable

name, with or without boldface style, (e.g., A ).

The entry in the i-th row and j-th column of a matrix A is sometimes referred to as the i,j, (i,j), or (i,j)th entry of

the matrix, and most commonly denoted as ai,j, or aij. Alternative notations for that entry are A[i,j] or Ai,j. For

example, the (1,3) entry of the following matrix A is 5 (also denoted a13 , a₁,₃, A[1,3] or A1,3):

4 −7 5

0

11 8

A = −2 0

19

1 −3 12

Sometimes, the entries of a matrix can be deﬁned by a formula such as ai,j = f(i, j). For example, each of the entries

of the following matrix A is determined by aij = i − j.

0 −1 −2

A = 1 0 −1

2 1

0

−3

−2

−1

In this case, the matrix itself is sometimes deﬁned by that formula, within square brackets or double parenthesis. For

example, the matrix above is deﬁned as A = [i-j], or A = ((i-j)). If matrix size is m × n, the above-mentioned formula

f(i, j) is valid for any i = 1, ..., m and any j = 1, ..., n. This can be either speciﬁed separately, or using m × n as a

subscript. For instance, the matrix A above is 3 × 4 and can be deﬁned as A = [i − j] (i = 1, 2, 3; j = 1, ..., 4), or A =

[i − j]3×4.

Some programming languages utilize doubly subscripted arrays (or arrays of arrays) to represent an m-×-n matrix.

Some programming languages start the numbering of array indexes at zero, in which case the entries of an m-by-n

matrix are indexed by 0 ≤ i ≤ m − 1 and 0 ≤ j ≤ n − 1.[9] This article follows the more common convention in

mathematical writing where enumeration starts from 1.

The set of all m-by-n matrices is denoted 𝕄(m, n).

8.3 Basic operations

There are a number of basic operations that can be applied to modify matrices, called matrix addition, scalar multiplication, transposition, matrix multiplication, row operations, and submatrix.[11]

8.3.1

Addition, scalar multiplication and transposition

Main articles: Matrix addition, Scalar multiplication and Transpose

90

CHAPTER 8. MATRIX (MATHEMATICS)

Familiar properties of numbers extend to these operations of matrices: for example, addition is commutative, i.e.,

the matrix sum does not depend on the order of the summands: A + B = B + A.[12] The transpose is compatible with

addition and scalar multiplication, as expressed by (cA)T = c(AT ) and (A + B)T = AT + BT . Finally, (AT )T = A.

8.3.2

Matrix multiplication

Main article: Matrix multiplication

Multiplication of two matrices is deﬁned if and only if the number of columns of the left matrix is the same as the

B

b1,1 b1,2 b1,3

b2,1 b2,2 b2,3

a1,1 a1,2

A

a2,1 a2,2

a3,1 a3,2

a4,1 a4,2

Schematic depiction of the matrix product AB of two matrices A and B.

number of rows of the right matrix. If A is an m-by-n matrix and B is an n-by-p matrix, then their matrix product AB

is the m-by-p matrix whose entries are given by dot product of the corresponding row of A and the corresponding

column of B:

[AB]i,j = Ai,1 B1,j + Ai,2 B2,j + · · · + Ai,n Bn,j =

∑n

r=1

Ai,r Br,j ,

where 1 ≤ i ≤ m and 1 ≤ j ≤ p.[13] For example, the underlined entry 2340 in the product is calculated as (2 × 1000)

+ (3 × 100) + (4 × 10) = 2340:

[

] 0 1000

[

2 3 4

3

1 100 =

1 0 0

0

0 10

]

2340

.

1000

8.3. BASIC OPERATIONS

91

Matrix multiplication satisﬁes the rules (AB)C = A(BC) (associativity), and (A+B)C = AC+BC as well as C(A+B)

= CA+CB (left and right distributivity), whenever the size of the matrices is such that the various products are

deﬁned.[14] The product AB may be deﬁned without BA being deﬁned, namely if A and B are m-by-n and n-by-k

matrices, respectively, and m ≠ k. Even if both products are deﬁned, they need not be equal, i.e., generally

AB ≠ BA,

i.e., matrix multiplication is not commutative, in marked contrast to (rational, real, or complex) numbers whose

product is independent of the order of the factors. An example of two matrices not commuting with each other is:

[

1 2

3 4

][

] [

]

0 1

0 1

=

,

0 0

0 3

whereas

[

0 1

0 0

][

] [

]

1 2

3 4

=

.

3 4

0 0

Besides the ordinary matrix multiplication just described, there exist other less frequently used operations on matrices

that can be considered forms of multiplication, such as the Hadamard product and the Kronecker product.[15] They

arise in solving matrix equations such as the Sylvester equation.

8.3.3

Row operations

Main article: Row operations

There are three types of row operations:

1. row addition, that is adding a row to another.

2. row multiplication, that is multiplying all entries of a row by a non-zero constant;

3. row switching, that is interchanging two rows of a matrix;

These operations are used in a number of ways, including solving linear equations and ﬁnding matrix inverses.

8.3.4

Submatrix

A submatrix of a matrix is obtained by deleting any collection of rows and/or columns.[16][17][18] For example, from

the following 3-by-4 matrix, we can construct a 2-by-3 submatrix by removing row 3 and column 2:

1 2

A = 5 6

9 10

[

3 4

1

7 8 →

5

11 12

3

7

]

4

.

8

The minors and cofactors of a matrix are found by computing the determinant of certain submatrices.[18][19]

A principal submatrix is a square submatrix obtained by removing certain rows and columns. The deﬁnition varies

from author to author. According to some authors, a principal submatrix is a submatrix in which the set of row indices

that remain is the same as the set of column indices that remain.[20][21] Other authors deﬁne a principal submatrix to

be one in which the ﬁrst k rows and columns, for some number k, are the ones that remain;[22] this type of submatrix

has also been called a leading principal submatrix.[23]

92

CHAPTER 8. MATRIX (MATHEMATICS)

8.4 Linear equations

Main articles: Linear equation and System of linear equations

Matrices can be used to compactly write and work with multiple linear equations, i.e., systems of linear equations.

For example, if A is an m-by-n matrix, x designates a column vector (i.e., n×1-matrix) of n variables x1 , x2 , ..., xn,

and b is an m×1-column vector, then the matrix equation

Ax = b

is equivalent to the system of linear equations

A₁,₁x1 + A₁,₂x2 + ... + A₁,nxn = b1

...

Am,₁x1 + Am,₂x2 + ... + Am,nxn = bm .[24]

8.5 Linear transformations

Main articles: Linear transformation and Transformation matrix

Matrices and matrix multiplication reveal their essential features when related to linear transformations, also known

as linear maps. A real m-by-n matrix A gives rise to a linear transformation Rn → Rm mapping each vector x in Rn

to the (matrix) product Ax, which is a vector in Rm . Conversely, each linear transformation f: Rn → Rm arises from

a unique m-by-n matrix A: explicitly, the (i, j)-entry of A is the ith coordinate of f(ej), where ej = (0,...,0,1,0,...,0) is

the unit vector with 1 in the j th position and 0 elsewhere. The matrix A is said to represent the linear map f, and A

is called the transformation matrix of f.

For example, the 2×2 matrix

[

]

a c

A=

b d

can be viewed as the transform of the unit square into a parallelogram with vertices at (0, 0), (a, b), (a + c, b + d),

and

pictured at the right is obtained by multiplying A with each of the column vectors

[ ] (c,

[ ]d).[ The

] parallelogram

[ ]

0 1

1

0

,

,

and

in turn. These vectors deﬁne the vertices of the unit square.

0 0

1

1

The following table shows a number of 2-by-2 matrices with the associated linear maps of R2 . The blue original is

mapped to the green grid and shapes. The origin (0,0) is marked with a black point.

Under the 1-to-1 correspondence between matrices and linear maps, matrix multiplication corresponds to composition

of maps:[25] if a k-by-m matrix B represents another linear map g : Rm → Rk , then the composition g ∘ f is represented

by BA since

(g ∘ f)(x) = g(f(x)) = g(Ax) = B(Ax) = (BA)x.

The last equality follows from the above-mentioned associativity of matrix multiplication.

The rank of a matrix A is the maximum number of linearly independent row vectors of the matrix, which is the same

as the maximum number of linearly independent column vectors.[26] Equivalently it is the dimension of the image of

the linear map represented by A.[27] The rank-nullity theorem states that the dimension of the kernel of a matrix plus

the rank equals the number of columns of the matrix.[28]

8.6 Square matrices

Main article: Square matrix

8.6. SQUARE MATRICES

93

(a+c,b+d)

(c,d)

ad−bc

(a,b)

(0,0)

The vectors represented by a 2-by-2 matrix correspond to the sides of a unit square transformed into a parallelogram.

A square matrix is a matrix with the same number of rows and columns. An n-by-n matrix is known as a square

matrix of order n. Any two square matrices of the same order can be added and multiplied. The entries aii form the

main diagonal of a square matrix. They lie on the imaginary line which runs from the top left corner to the bottom

right corner of the matrix.

8.6.1

Main types

94

CHAPTER 8. MATRIX (MATHEMATICS)

Diagonal and triangular matrices

If all entries of A below the main diagonal are zero, A is called an upper triangular matrix. Similarly if all entries of

A above the main diagonal are zero, A is called a lower triangular matrix. If all entries outside the main diagonal are

zero, A is called a diagonal matrix.

Identity matrix

The identity matrix In of size n is the n-by-n matrix in which all the elements on the main diagonal are equal to 1 and

all other elements are equal to 0, e.g.

[

[ ]

1

I1 = 1 , I2 =

0

1 0

]

0 1

0

, · · · , In = . .

1

.. ..

0 0

···

···

..

.

0

0

..

.

···

1

It is a square matrix of order n, and also a special kind of diagonal matrix. It is called an identity matrix because

multiplication with it leaves a matrix unchanged:

AIn = ImA = A for any m-by-n matrix A.

Symmetric or skew-symmetric matrix

A square matrix A that is equal to its transpose, i.e., A = AT , is a symmetric matrix. If instead, A was equal to the

negative of its transpose, i.e., A = −AT , then A is a skew-symmetric matrix. In complex matrices, symmetry is often

replaced by the concept of Hermitian matrices, which satisfy A∗ = A, where the star or asterisk denotes the conjugate

transpose of the matrix, i.e., the transpose of the complex conjugate of A.

By the spectral theorem, real symmetric matrices and complex Hermitian matrices have an eigenbasis; i.e., every

vector is expressible as a linear combination of eigenvectors. In both cases, all eigenvalues are real.[29] This theorem

can be generalized to inﬁnite-dimensional situations related to matrices with inﬁnitely many rows and columns, see

below.

Invertible matrix and its inverse

A square matrix A is called invertible or non-singular if there exists a matrix B such that

AB = BA = In.[30][31]

If B exists, it is unique and is called the inverse matrix of A, denoted A−1 .

Deﬁnite matrix

A symmetric n×n-matrix is called positive-deﬁnite (respectively negative-deﬁnite; indeﬁnite), if for all nonzero vectors

x ∈ Rn the associated quadratic form given by

Q(x) = xT Ax

takes only positive values (respectively only negative values; both some negative and some positive values).[32] If

the quadratic form takes only non-negative (respectively only non-positive) values, the symmetric matrix is called

positive-semideﬁnite (respectively negative-semideﬁnite); hence the matrix is indeﬁnite precisely when it is neither

positive-semideﬁnite nor negative-semideﬁnite.

A symmetric matrix is positive-deﬁnite if and only if all its eigenvalues are positive, i.e., the matrix is positivesemideﬁnite and it is invertible.[33] The table at the right shows two possibilities for 2-by-2 matrices.

Allowing as input two diﬀerent vectors instead yields the bilinear form associated to A:

8.6. SQUARE MATRICES

95

BA (x, y) = xT Ay.[34]

Orthogonal matrix

An orthogonal matrix is a square matrix with real entries whose columns and rows are orthogonal unit vectors (i.e.,

orthonormal vectors). Equivalently, a matrix A is orthogonal if its transpose is equal to its inverse:

AT = A−1 ,

which entails

AT A = AAT = I,

where I is the identity matrix.

An orthogonal matrix A is necessarily invertible (with inverse A−1 = AT ), unitary (A−1 = A*), and normal (A*A =

AA*). The determinant of any orthogonal matrix is either +1 or −1. A special orthogonal matrix is an orthogonal

matrix with determinant +1. As a linear transformation, every orthogonal matrix with determinant +1 is a pure

rotation, while every orthogonal matrix with determinant −1 is either a pure reﬂection, or a composition of reﬂection

and rotation.

The complex analogue of an orthogonal matrix is a unitary matrix.

8.6.2

Main operations

Trace

The trace, tr(A) of a square matrix A is the sum of its diagonal entries. While matrix multiplication is not commutative

as mentioned above, the trace of the product of two matrices is independent of the order of the factors:

tr(AB) = tr(BA).

This is immediate from the deﬁnition of matrix multiplication:

tr(AB)=

∑m ∑n

i=1

j=1

Aij Bji =tr(BA).

Also, the trace of a matrix is equal to that of its transpose, i.e.,

tr(A) = tr(AT ).

Determinant

Main article: Determinant

The determinant det(A) or |A| of a square matrix A is a number encoding certain properties of the matrix. A matrix

is invertible if and only if its determinant is nonzero. Its absolute value equals the area (in R2 ) or volume (in R3 ) of

the image of the unit square (or cube), while its sign corresponds to the orientation of the corresponding linear map:

the determinant is positive if and only if the orientation is preserved.

The determinant of 2-by-2 matrices is given by

[

]

a b

det

= ad − bc.

c d

The determinant of 3-by-3 matrices involves 6 terms (rule of Sarrus). The more lengthy Leibniz formula generalises

these two formulae to all dimensions.[35]

The determinant of a product of square matrices equals the product of their determinants:

96

CHAPTER 8. MATRIX (MATHEMATICS)

( 01 −11)

x2

f(x1 )

x1

f(x2 )

A linear transformation on R2 given by the indicated matrix. The determinant of this matrix is −1, as the area of the green

parallelogram at the right is 1, but the map reverses the orientation, since it turns the counterclockwise orientation of the vectors

to a clockwise one.

det(AB) = det(A) · det(B).[36]

Adding a multiple of any row to another row, or a multiple of any column to another column, does not change

the determinant. Interchanging two rows or two columns aﬀects the determinant by multiplying it by −1.[37] Using

these operations, any matrix can be transformed to a lower (or upper) triangular matrix, and for such matrices the

determinant equals the product of the entries on the main diagonal; this provides a method to calculate the determinant

of any matrix. Finally, the Laplace expansion expresses the determinant in terms of minors, i.e., determinants of

smaller matrices.[38] This expansion can be used for a recursive deﬁnition of determinants (taking as starting case

the determinant of a 1-by-1 matrix, which is its unique entry, or even the determinant of a 0-by-0 matrix, which is

1), that can be seen to be equivalent to the Leibniz formula. Determinants can be used to solve linear systems using

Cramer’s rule, where the division of the determinants of two related square matrices equates to the value of each of

the system’s variables.[39]

Eigenvalues and eigenvectors

Main article: Eigenvalues and eigenvectors

A number λ and a non-zero vector v satisfying

Av = λv

are called an eigenvalue and an eigenvector of A, respectively.[nb 1][40] The number λ is an eigenvalue of an n×n-matrix

A if and only if A−λIn is not invertible, which is equivalent to

det(A − λI) = 0.

[41]

The polynomial pA in an indeterminate X given by evaluation the determinant det(XIn−A) is called the characteristic

polynomial of A. It is a monic polynomial of degree n. Therefore the polynomial equation pA(λ) = 0 has at most

n diﬀerent solutions, i.e., eigenvalues of the matrix.[42] They may be complex even if the entries of A are real.

According to the Cayley–Hamilton theorem, pA(A) = 0, that is, the result of substituting the matrix itself into its own

characteristic polynomial yields the zero matrix.

8.7 Computational aspects

Matrix calculations can be often performed with diﬀerent techniques. Many problems can be solved by both direct

algorithms or iterative approaches. For example, the eigenvectors of a square matrix can be obtained by ﬁnding a

8.8. DECOMPOSITION

97

sequence of vectors xn converging to an eigenvector when n tends to inﬁnity.[43]

To be able to choose the more appropriate algorithm for each speciﬁc problem, it is important to determine both

the eﬀectiveness and precision of all the available algorithms. The domain studying these matters is called numerical

linear algebra.[44] As with other numerical situations, two main aspects are the complexity of algorithms and their

numerical stability.

Determining the complexity of an algorithm means ﬁnding upper bounds or estimates of how many elementary operations such as additions and multiplications of scalars are necessary to perform some algorithm, e.g., multiplication

of matrices. For example, calculating the matrix product of two n-by-n matrix using the deﬁnition given above needs

n3 multiplications, since for any of the n2 entries of the product, n multiplications are necessary. The Strassen algorithm outperforms this “naive” algorithm; it needs only n2.807 multiplications.[45] A reﬁned approach also incorporates

speciﬁc features of the computing devices.

In many practical situations additional information about the matrices involved is known. An important case are

sparse matrices, i.e., matrices most of whose entries are zero. There are speciﬁcally adapted algorithms for, say,

solving linear systems Ax = b for sparse matrices A, such as the conjugate gradient method.[46]

An algorithm is, roughly speaking, numerically stable, if little deviations in the input values do not lead to big deviations in the result. For example, calculating the inverse of a matrix via Laplace’s formula (Adj (A) denotes the

adjugate matrix of A)

A−1 = Adj(A) / det(A)

may lead to signiﬁcant rounding errors if the determinant of the matrix is very small. The norm of a matrix can be

used to capture the conditioning of linear algebraic problems, such as computing a matrix’s inverse.[47]

Although most computer languages are not designed with commands or libraries for matrices, as early as the 1970s,

some engineering desktop computers such as the HP 9830 had ROM cartridges to add BASIC commands for matrices.

Some computer languages such as APL were designed to manipulate matrices, and various mathematical programs

can be used to aid computing with matrices.[48]

8.8 Decomposition

Main articles: Matrix decomposition, Matrix diagonalization, Gaussian elimination and Montante’s method

There are several methods to render matrices into a more easily accessible form. They are generally referred to as

matrix decomposition or matrix factorization techniques. The interest of all these techniques is that they preserve

certain properties of the matrices in question, such as determinant, rank or inverse, so that these quantities can be

calculated after applying the transformation, or that certain matrix operations are algorithmically easier to carry out

for some types of matrices.

The LU decomposition factors matrices as a product of lower (L) and an upper triangular matrices (U).[49] Once

this decomposition is calculated, linear systems can be solved more eﬃciently, by a simple technique called forward

and back substitution. Likewise, inverses of triangular matrices are algorithmically easier to calculate. The Gaussian elimination is a similar algorithm; it transforms any matrix to row echelon form.[50] Both methods proceed by

multiplying the matrix by suitable elementary matrices, which correspond to permuting rows or columns and adding

multiples of one row to another row. Singular value decomposition expresses any matrix A as a product UDV∗ , where

U and V are unitary matrices and D is a diagonal matrix.

The eigendecomposition or diagonalization expresses A as a product VDV−1 , where D is a diagonal matrix and V

is a suitable invertible matrix.[51] If A can be written in this form, it is called diagonalizable. More generally, and

applicable to all matrices, the Jordan decomposition transforms a matrix into Jordan normal form, that is to say

matrices whose only nonzero entries are the eigenvalues λ1 to λ of A, placed on the main diagonal and possibly

entries equal to one directly above the main diagonal, as shown at the right.[52] Given the eigendecomposition, the nth

power of A (i.e., n-fold iterated matrix multiplication) can be calculated via

An = (VDV−1 )n = VDV−1 VDV−1 ...VDV−1 = VDn V−1

and the power of a diagonal matrix can be calculated by taking the corresponding powers of the diagonal entries, which

is much easier than doing the exponentiation for A instead. This can be used to compute the matrix exponential eA , a

98

CHAPTER 8. MATRIX (MATHEMATICS)

An example of a matrix in Jordan normal form. The grey blocks are called Jordan blocks.

need frequently arising in solving linear diﬀerential equations, matrix logarithms and square roots of matrices.[53] To

avoid numerically ill-conditioned situations, further algorithms such as the Schur decomposition can be employed.[54]

8.9 Abstract algebraic aspects and generalizations

Matrices can be generalized in diﬀerent ways. Abstract algebra uses matrices with entries in more general ﬁelds

or even rings, while linear algebra codiﬁes properties of matrices in the notion of linear maps. It is possible to

consider matrices with inﬁnitely many columns and rows. Another extension are tensors, which can be seen as

higher-dimensional arrays of numbers, as opposed to vectors, which can often be realised as sequences of numbers,

while matrices are rectangular or two-dimensional arrays of numbers.[55] Matrices, subject to certain requirements

tend to form groups known as matrix groups.

8.9.1

Matrices with more general entries

This article focuses on matrices whose entries are real or complex numbers. However, matrices can be considered

with much more general types of entries than real or complex numbers. As a ﬁrst step of generalization, any ﬁeld, i.e.,

a set where addition, subtraction, multiplication and division operations are deﬁned and well-behaved, may be used

instead of R or C, for example rational numbers or ﬁnite ﬁelds. For example, coding theory makes use of matrices

over ﬁnite ﬁelds. Wherever eigenvalues are considered, as these are roots of a polynomial they may exist only in a

8.9. ABSTRACT ALGEBRAIC ASPECTS AND GENERALIZATIONS

99

larger ﬁeld than that of the entries of the matrix; for instance they may be complex in case of a matrix with real

entries. The possibility to reinterpret the entries of a matrix as elements of a larger ﬁeld (e.g., to view a real matrix

as a complex matrix whose entries happen to be all real) then allows considering each square matrix to possess a full

set of eigenvalues. Alternatively one can consider only matrices with entries in an algebraically closed ﬁeld, such as

C, from the outset.

More generally, abstract algebra makes great use of matrices with entries in a ring R.[56] Rings are a more general

notion than ﬁelds in that a division operation need not exist. The very same addition and multiplication operations of

matrices extend to this setting, too. The set M(n, R) of all square n-by-n matrices over R is a ring called matrix ring,

isomorphic to the endomorphism ring of the left R-module Rn .[57] If the ring R is commutative, i.e., its multiplication

is commutative, then M(n, R) is a unitary noncommutative (unless n = 1) associative algebra over R. The determinant

of square matrices over a commutative ring R can still be deﬁned using the Leibniz formula; such a matrix is invertible

if and only if its determinant is invertible in R, generalising the situation over a ﬁeld F, where every nonzero element

is invertible.[58] Matrices over superrings are called supermatrices.[59]

Matrices do not always have all their entries in the same ring – or even in any ring at all. One special but common

case is block matrices, which may be considered as matrices whose entries themselves are matrices. The entries need

not be quadratic matrices, and thus need not be members of any ordinary ring; but their sizes must fulﬁl certain

compatibility conditions.

8.9.2

Relationship to linear maps

Linear maps Rn → Rm are equivalent to m-by-n matrices, as described above. More generally, any linear map f: V

→ W between ﬁnite-dimensional vector spaces can be described by a matrix A = (aij), after choosing bases v1 , ...,

vn of V, and w1 , ..., wm of W (so n is the dimension of V and m is the dimension of W), which is such that

f (vj ) =

m

∑

ai,j wi

for j = 1, . . . , n.

i=1

In other words, column j of A expresses the image of vj in terms of the basis vectors wi of W; thus this relation uniquely

determines the entries of the matrix A. Note that the matrix depends on the choice of the bases: diﬀerent choices of

bases give rise to diﬀerent, but equivalent matrices.[60] Many of the above concrete notions can be reinterpreted in

this light, for example, the transpose matrix AT describes the transpose of the linear map given by A, with respect to

the dual bases.[61]

These properties can be restated in a more natural way: the category of all matrices with entries in a ﬁeld k with

multiplication as composition is equivalent to the category of ﬁnite dimensional vector spaces and linear maps over

this ﬁeld.

More generally, the set of m×n matrices can be used to represent the R-linear maps between the free modules Rm

and Rn for an arbitrary ring R with unity. When n = m composition of these maps is possible, and this gives rise to

the matrix ring of n×n matrices representing the endomorphism ring of Rn .

8.9.3

Matrix groups

Main article: Matrix group

A group is a mathematical structure consisting of a set of objects together with a binary operation, i.e., an operation

combining any two objects to a third, subject to certain requirements.[62] A group in which the objects are matrices

and the group operation is matrix multiplication is called a matrix group.[nb 2][63] Since in a group every element has

to be invertible, the most general matrix groups are the groups of all invertible matrices of a given size, called the

general linear groups.

Any property of matrices that is preserved under matrix products and inverses can be used to deﬁne further matrix

groups. For example, matrices with a given size and with a determinant of 1 form a subgroup of (i.e., a smaller

group contained in) their general linear group, called a special linear group.[64] Orthogonal matrices, determined by

the condition

MT M = I,

100

CHAPTER 8. MATRIX (MATHEMATICS)

form the orthogonal group.[65] Every orthogonal matrix has determinant 1 or −1. Orthogonal matrices with determinant 1 form a subgroup called special orthogonal group.

Every ﬁnite group is isomorphic to a matrix group, as one can see by considering the regular representation of the

symmetric group.[66] General groups can be studied using matrix groups, which are comparatively well-understood,

by means of representation theory.[67]

8.9.4

Inﬁnite matrices

It is also possible to consider matrices with inﬁnitely many rows and/or columns[68] even if, being inﬁnite objects, one

cannot write down such matrices explicitly. All that matters is that for every element in the set indexing rows, and

every element in the set indexing columns, there is a well-deﬁned entry (these index sets need not even be subsets of

the natural numbers). The basic operations of addition, subtraction, scalar multiplication and transposition can still

be deﬁned without problem; however matrix multiplication may involve inﬁnite summations to deﬁne the resulting

entries, and these are not deﬁned in general.

⊕

If R is any ring with unity, then the ring of endomorphisms of M = i∈I R as a right R module is isomorphic to

the ring of column ﬁnite matrices CFMI (R) whose entries are indexed by I × I , and whose columns each contain

only ﬁnitely many nonzero entries. The endomorphisms of M considered as a left R module result in an analogous

object, the row ﬁnite matrices RFMI (R) whose rows each only have ﬁnitely many nonzero entries.

If inﬁnite matrices are used to describe linear maps, then only those matrices can be used all of whose columns have

but a ﬁnite number of nonzero entries, for the following reason. For a matrix A to describe a linear map f: V→W,

bases for both spaces must have been chosen; recall that by deﬁnition this means that every vector in the space can be

written uniquely as a (ﬁnite) linear combination of basis vectors, so that written as a (column) vector v of coeﬃcients,

only ﬁnitely many entries vi are nonzero. Now the columns of A describe the images by f of individual basis vectors

of V in the basis of W, which is only meaningful if these columns have only ﬁnitely many nonzero entries. There

is no restriction on the rows of A however: in the product A·v there are only ﬁnitely many nonzero coeﬃcients of

v involved, so every one of its entries, even if it is given as an inﬁnite sum of products, involves only ﬁnitely many

nonzero terms and is therefore well deﬁned. Moreover this amounts to forming a linear combination of the columns

of A that eﬀectively involves only ﬁnitely many of them, whence the result has only ﬁnitely many nonzero entries,

because each of those columns do. One also sees that products of two matrices of the given type is well deﬁned

(provided as usual that the column-index and row-index sets match), is again of the same type, and corresponds to

the composition of linear maps.

If R is a normed ring, then the condition of row or column ﬁniteness can be relaxed. With the norm in place, absolutely

convergent series can be used instead of ﬁnite sums. For example, the matrices whose column sums are absolutely

convergent sequences form a ring. Analogously of course, the matrices whose row sums are absolutely convergent

series also form a ring.

In that vein, inﬁnite matrices can also be used to describe operators on Hilbert spaces, where convergence and

continuity questions arise, which again results in certain constraints that have to be imposed. However, the explicit

point of view of matrices tends to obfuscate the matter,[nb 3] and the abstract and more powerful tools of functional

analysis can be used instead.

8.9.5

Empty matrices

An empty matrix is a matrix in which the number of rows or columns (or both) is zero.[69][70] Empty matrices help

dealing with maps involving the zero vector space. For example, if A is a 3-by-0 matrix and B is a 0-by-3 matrix,

then AB is the 3-by-3 zero matrix corresponding to the null map from a 3-dimensional space V to itself, while BA

is a 0-by-0 matrix. There is no common notation for empty matrices, but most computer algebra systems allow

creating and computing with them. The determinant of the 0-by-0 matrix is 1 as follows from regarding the empty

product occurring in the Leibniz formula for the determinant as 1. This value is also consistent with the fact that the

identity map from any ﬁnite dimensional space to itself has determinant 1, a fact that is often used as a part of the

characterization of determinants.

8.10. APPLICATIONS

101

8.10 Applications

There are numerous applications of matrices, both in mathematics and other sciences. Some of them merely take

advantage of the compact representation of a set of numbers in a matrix. For example, in game theory and economics,

the payoﬀ matrix encodes the payoﬀ for two players, depending on which out of a given (ﬁnite) set of alternatives the

players choose.[71] Text mining and automated thesaurus compilation makes use of document-term matrices such as

tf-idf to track frequencies of certain words in several documents.[72]

Complex numbers can be represented by particular real 2-by-2 matrices via

[

a + ib ↔

]

a −b

,

b a

under which addition and multiplication of complex numbers and matrices correspond to each other. For example,

2-by-2 rotation matrices represent the multiplication with some complex number of absolute value 1, as above. A

similar interpretation is possible for quaternions[73] and Cliﬀord algebras in general.

Early encryption techniques such as the Hill cipher also used matrices. However, due to the linear nature of matrices,

these codes are comparatively easy to break.[74] Computer graphics uses matrices both to represent objects and to

calculate transformations of objects using aﬃne rotation matrices to accomplish tasks such as projecting a threedimensional object onto a two-dimensional screen, corresponding to a theoretical camera observation.[75] Matrices

over a polynomial ring are important in the study of control theory.

Chemistry makes use of matrices in various ways, particularly since the use of quantum theory to discuss molecular

bonding and spectroscopy. Examples are the overlap matrix and the Fock matrix used in solving the Roothaan

equations to obtain the molecular orbitals of the Hartree–Fock method.

8.10.1

Graph theory

The adjacency matrix of a ﬁnite graph is a basic notion of graph theory.[76] It records which vertices of the graph

are connected by an edge. Matrices containing just two diﬀerent values (1 and 0 meaning for example “yes” and

“no”, respectively) are called logical matrices. The distance (or cost) matrix contains information about distances of

the edges.[77] These concepts can be applied to websites connected by hyperlinks or cities connected by roads etc.,

in which case (unless the connection network is extremely dense) the matrices tend to be sparse, i.e., contain few

nonzero entries. Therefore, speciﬁcally tailored matrix algorithms can be used in network theory.

8.10.2

Analysis and geometry

The Hessian matrix of a diﬀerentiable function ƒ: Rn → R consists of the second derivatives of ƒ with respect to the

several coordinate directions, i.e.[78]

[

]

∂2f

H(f ) =

.

∂xi ∂xj

It encodes information about the local growth behaviour of the function: given a critical point x = (x1 , ..., xn), i.e., a

point where the ﬁrst partial derivatives ∂f /∂xi of ƒ vanish, the function has a local minimum if the Hessian matrix is

positive deﬁnite. Quadratic programming can be used to ﬁnd global minima or maxima of quadratic functions closely

related to the ones attached to matrices (see above).[79]

Another matrix frequently used in geometrical situations is the Jacobi matrix of a diﬀerentiable map f: Rn → Rm . If

f 1 , ..., fm denote the components of f, then the Jacobi matrix is deﬁned as [80]

[

∂fi

J(f ) =

∂xj

]

.

1≤i≤m,1≤j≤n

If n > m, and if the rank of the Jacobi matrix attains its maximal value m, f is locally invertible at that point, by the

implicit function theorem.[81]

102

CHAPTER 8. MATRIX (MATHEMATICS)

2

3

1

1

An undirected graph with adjacency matrix 1

0

1

0

1

0

1 .

0

Partial diﬀerential equations can be classiﬁed by considering the matrix of coeﬃcients of the highest-order diﬀerential

operators of the equation. For elliptic partial diﬀerential equations this matrix is positive deﬁnite, which has decisive

inﬂuence on the set of possible solutions of the equation in question.[82]

The ﬁnite element method is an important numerical method to solve partial diﬀerential equations, widely applied in

simulating complex physical systems. It attempts to approximate the solution to some equation by piecewise linear

functions, where the pieces are chosen with respect to a suﬃciently ﬁne grid, which in turn can be recast as a matrix

equation.[83]

8.10.3

Probability theory and statistics

Stochastic matrices are square matrices whose rows are probability vectors, i.e., whose entries are non-negative and

sum up to one. Stochastic matrices are used to deﬁne Markov chains with ﬁnitely many states.[84] A row of the

stochastic matrix gives the probability distribution for the next position of some particle currently in the state that

corresponds to the row. Properties of the Markov chain like absorbing states, i.e., states that any particle attains

eventually, can be read oﬀ the eigenvectors of the transition matrices.[85]

Statistics also makes use of matrices in many diﬀerent forms.[86] Descriptive statistics is concerned with describing

data sets, which can often be represented as data matrices, which may then be subjected to dimensionality reduction

techniques. The covariance matrix encodes the mutual variance of several random variables.[87] Another technique

8.10. APPLICATIONS

At the saddle point (x = 0, y = 0) (red) of the function f(x,−y) = x2 − y2 , the Hessian matrix

103

[

2

0

]

0

is indeﬁnite.

−2

using matrices are linear least squares, a method that approximates a ﬁnite set of pairs (x1 , y1 ), (x2 , y2 ), ..., (xN, yN),

by a linear function

yi ≈ axi + b, i = 1, ..., N

which can be formulated in terms of matrices, related to the singular value decomposition of matrices.[88]

Random matrices are matrices whose entries are random numbers, subject to suitable probability distributions, such

as matrix normal distribution. Beyond probability theory, they are applied in domains ranging from number theory

to physics.[89][90]

8.10.4

Symmetries and transformations in physics

Further information: Symmetry in physics

Linear transformations and the associated symmetries play a key role in modern physics. For example, elementary

particles in quantum ﬁeld theory are classiﬁed as representations of the Lorentz group of special relativity and, more

speciﬁcally, by their behavior under the spin group. Concrete representations involving the Pauli matrices and more

general gamma matrices are an integral part of the physical description of fermions, which behave as spinors.[91] For

the three lightest quarks, there is a group-theoretical representation involving the special unitary group SU(3); for

their calculations, physicists use a convenient matrix representation known as the Gell-Mann matrices, which are also

used for the SU(3) gauge group that forms the basis of the modern description of strong nuclear interactions, quantum

chromodynamics. The Cabibbo–Kobayashi–Maskawa matrix, in turn, expresses the fact that the basic quark states

that are important for weak interactions are not the same as, but linearly related to the basic quark states that deﬁne

particles with speciﬁc and distinct masses.[92]

104

CHAPTER 8. MATRIX (MATHEMATICS)

Two diﬀerent Markov chains. The chart depicts the number of

[ particles

] (of a total

[ of 1000)

] in state “2”. Both limiting values can be

.7 0

.7 .2

determined from the transition matrices, which are given by

(red) and

(black).

.3 1

.3 .8

8.10.5

Linear combinations of quantum states

The ﬁrst model of quantum mechanics (Heisenberg, 1925) represented the theory’s operators by inﬁnite-dimensional

matrices acting on quantum states.[93] This is also referred to as matrix mechanics. One particular example is the

density matrix that characterizes the “mixed” state of a quantum system as a linear combination of elementary, “pure”

eigenstates.[94]

Another matrix serves as a key tool for describing the scattering experiments that form the cornerstone of experimental particle physics: Collision reactions such as occur in particle accelerators, where non-interacting particles head

towards each other and collide in a small interaction zone, with a new set of non-interacting particles as the result,

can be described as the scalar product of outgoing particle states and a linear combination of ingoing particle states.

The linear combination is given by a matrix known as the S-matrix, which encodes all information about the possible

interactions between particles.[95]

8.10.6

Normal modes

A general application of matrices in physics is to the description of linearly coupled harmonic systems. The equations

of motion of such systems can be described in matrix form, with a mass matrix multiplying a generalized velocity

to give the kinetic term, and a force matrix multiplying a displacement vector to characterize the interactions. The

best way to obtain solutions is to determine the system’s eigenvectors, its normal modes, by diagonalizing the matrix

equation. Techniques like this are crucial when it comes to the internal dynamics of molecules: the internal vibrations of systems consisting of mutually bound component atoms.[96] They are also needed for describing mechanical

vibrations, and oscillations in electrical circuits.[97]

8.11. HISTORY

8.10.7

105

Geometrical optics

Geometrical optics provides further matrix applications. In this approximative theory, the wave nature of light is

neglected. The result is a model in which light rays are indeed geometrical rays. If the deﬂection of light rays by

optical elements is small, the action of a lens or reﬂective element on a given light ray can be expressed as multiplication

of a two-component vector with a two-by-two matrix called ray transfer matrix: the vector’s components are the light

ray’s slope and its distance from the optical axis, while the matrix encodes the properties of the optical element.

Actually, there are two kinds of matrices, viz. a refraction matrix describing the refraction at a lens surface, and a

translation matrix, describing the translation of the plane of reference to the next refracting surface, where another

refraction matrix applies. The optical system, consisting of a combination of lenses and/or reﬂective elements, is

simply described by the matrix resulting from the product of the components’ matrices.[98]

8.10.8

Electronics

Traditional mesh analysis in electronics leads to a system of linear equations that can be described with a matrix.

The behaviour of many electronic components can be described using matrices. Let A be a 2-dimensional vector

with the component’s input voltage v1 and input current i1 as its elements, and let B be a 2-dimensional vector

with the component’s output voltage v2 and output current i2 as its elements. Then the behaviour of the electronic

component can be described by B = H · A, where H is a 2 x 2 matrix containing one impedance element (h12 ),

one admittance element (h21 ) and two dimensionless elements (h11 and h22 ). Calculating a circuit now reduces to

multiplying matrices.

8.11 History

Matrices have a long history of application in solving linear equations but they were known as arrays until the 1800s.

The Chinese text The Nine Chapters on the Mathematical Art written in 10th–2nd century BCE is the ﬁrst example

of the use of array methods to solve simultaneous equations,[99] including the concept of determinants. In 1545

Italian mathematician Girolamo Cardano brought the method to Europe when he published Ars Magna.[100] The

Japanese mathematician Seki used the same array methods to solve simultaneous equations in 1683.[101] The Dutch

Mathematician Jan de Witt represented transformations using arrays in his 1659 book Elements of Curves (1659).[102]

Between 1700 and 1710 Gottfried Wilhelm Leibniz publicized the use of arrays for recording information or solutions

and experimented with over 50 diﬀerent systems of arrays.[100] Cramer presented his rule in 1750.

The term “matrix” (Latin for “womb”, derived from mater—mother[103] ) was coined by James Joseph Sylvester in

1850,[104] who understood a matrix as an object giving rise to a number of determinants today called minors, that is

to say, determinants of smaller matrices that derive from the original one by removing columns and rows. In an 1851

paper, Sylvester explains:

I have in previous papers deﬁned a “Matrix” as a rectangular array of terms, out of which diﬀerent

systems of determinants may be engendered as from the womb of a common parent.[105]

Arthur Cayley published a treatise on geometric transformations using matrices that were not rotated versions of the

coeﬃcients being investigated as had previously been done. Instead he deﬁned operations such as addition, subtraction, multiplication, and division as transformations of those matrices and showed the associative and distributive

properties held true. Cayley investigated and demonstrated the non-commutative property of matrix multiplication

as well as the commutative property of matrix addition.[100] Early matrix theory had limited the use of arrays almost

exclusively to determinants and Arthur Cayley’s abstract matrix operations were revolutionary. He was instrumental

in proposing a matrix concept independent of equation systems. In 1858 Cayley published his Memoir on the theory

of matrices[106][107] in which he proposed and demonstrated the Cayley-Hamilton theorem.[100]

An English mathematician named Cullis was the ﬁrst to use modern bracket notation for matrices in 1913 and he

simultaneously demonstrated the ﬁrst signiﬁcant use the notation A = [ai,j] to represent a matrix where ai,j refers to

the ith row and the jth column.[100]

The study of determinants sprang from several sources.[108] Number-theoretical problems led Gauss to relate coefﬁcients of quadratic forms, i.e., expressions such as x2 + xy − 2y2 , and linear maps in three dimensions to matrices. Eisenstein further developed these notions, including the remark that, in modern parlance, matrix products are

106

CHAPTER 8. MATRIX (MATHEMATICS)

non-commutative. Cauchy was the ﬁrst to prove general statements about determinants, using as deﬁnition of the

determinant of a matrix A = [ai,j] the following: replace the powers aj k by ajk in the polynomial

a1 a2 · · · an

∏

(aj − ai )

i<j

where Π denotes the product of the indicated terms. He also showed, in 1829, that the eigenvalues of symmetric matrices are real.[109] Jacobi studied “functional determinants”—later called Jacobi determinants by Sylvester—which

can be used to describe geometric transformations at a local (or inﬁnitesimal) level, see above; Kronecker’s Vorlesungen über die Theorie der Determinanten[110] and Weierstrass’ Zur Determinantentheorie,[111] both published in 1903,

ﬁrst treated determinants axiomatically, as opposed to previous more concrete approaches such as the mentioned

formula of Cauchy. At that point, determinants were ﬁrmly established.

Many theorems were ﬁrst established for small matrices only, for example the Cayley–Hamilton theorem was proved

for 2×2 matrices by Cayley in the aforementioned memoir, and by Hamilton for 4×4 matrices. Frobenius, working

on bilinear forms, generalized the theorem to all dimensions (1898). Also at the end of the 19th century the Gauss–

Jordan elimination (generalizing a special case now known as Gauss elimination) was established by Jordan. In the

early 20th century, matrices attained a central role in linear algebra.[112] partially due to their use in classiﬁcation of

the hypercomplex number systems of the previous century.

The inception of matrix mechanics by Heisenberg, Born and Jordan led to studying matrices with inﬁnitely many

rows and columns.[113] Later, von Neumann carried out the mathematical formulation of quantum mechanics, by

further developing functional analytic notions such as linear operators on Hilbert spaces, which, very roughly speaking,

correspond to Euclidean space, but with an inﬁnity of independent directions.

8.11.1

Other historical usages of the word “matrix” in mathematics

The word has been used in unusual ways by at least two authors of historical importance.

Bertrand Russell and Alfred North Whitehead in their Principia Mathematica (1910–1913) use the word “matrix” in

the context of their Axiom of reducibility. They proposed this axiom as a means to reduce any function to one of

lower type, successively, so that at the “bottom” (0 order) the function is identical to its extension:

“Let us give the name of matrix to any function, of however many variables, which does not involve any

apparent variables. Then any possible function other than a matrix is derived from a matrix by means

of generalization, i.e., by considering the proposition which asserts that the function in question is true

with all possible values or with some value of one of the arguments, the other argument or arguments

remaining undetermined”.[114]

For example a function Φ(x, y) of two variables x and y can be reduced to a collection of functions of a single variable,

e.g., y, by “considering” the function for all possible values of “individuals” ai substituted in place of variable x. And

then the resulting collection of functions of the single variable y, i.e., ∀aᵢ: Φ(ai, y), can be reduced to a “matrix” of

values by “considering” the function for all possible values of “individuals” bi substituted in place of variable y:

∀b ∀aᵢ: Φ(ai, b ).

Alfred Tarski in his 1946 Introduction to Logic used the word “matrix” synonymously with the notion of truth table

as used in mathematical logic.[115]

8.12 See also

• Algebraic multiplicity

• Geometric multiplicity

• Gram-Schmidt process

• List of matrices

8.13. NOTES

107

• Matrix calculus

• Periodic matrix set

• Tensor

8.13 Notes

[1] equivalently, table

[2] Anton (1987, p. 23)

[3] Beauregard & Fraleigh (1973, p. 56)

[4] Young, Cynthia. Precalculus. Laurie Rosatone. p. 727. Check date values in: |accessdate= (help);

[5] K. Bryan and T. Leise. The $25,000,000,000 eigenvector: The linear algebra behind Google. SIAM Review, 48(3):569–

581, 2006.

[6] Lang 2002

[7] Fraleigh (1976, p. 209)

[8] Nering (1970, p. 37)

[9] Oualline 2003, Ch. 5

[10] “How to organize, add and multiply matrices - Bill Shillito”. TED ED. Retrieved April 6, 2013.

[11] Brown 1991, Deﬁnition I.2.1 (addition), Deﬁnition I.2.4 (scalar multiplication), and Deﬁnition I.2.33 (transpose)

[12] Brown 1991, Theorem I.2.6

[13] Brown 1991, Deﬁnition I.2.20

[14] Brown 1991, Theorem I.2.24

[15] Horn & Johnson 1985, Ch. 4 and 5

[16] Bronson (1970, p. 16)

[17] Kreyszig (1972, p. 220)

[18] Protter & Morrey (1970, p. 869)

[19] Kreyszig (1972, pp. 241,244)

[20] Schneider, Hans; Barker, George Phillip (2012), Matrices and Linear Algebra, Dover Books on Mathematics, Courier

Dover Corporation, p. 251, ISBN 9780486139302.

[21] Perlis, Sam (1991), Theory of Matrices, Dover books on advanced mathematics, Courier Dover Corporation, p. 103, ISBN

9780486668109.

[22] Anton, Howard (414), Elementary Linear Algebra (10th ed.), John Wiley & Sons, ISBN 9780470458211 .

[23] Horn, Roger A.; Johnson, Charles R. (2012), Matrix Analysis (2nd ed.), Cambridge University Press, p. 17, ISBN

9780521839402.

[24] Brown 1991, I.2.21 and 22

[25] Greub 1975, Section III.2

[26] Brown 1991, Deﬁnition II.3.3

[27] Greub 1975, Section III.1

[28] Brown 1991, Theorem II.3.22

[29] Horn & Johnson 1985, Theorem 2.5.6

[30] Brown 1991, Deﬁnition I.2.28

108

[31] Brown 1991, Deﬁnition I.5.13

[32] Horn & Johnson 1985, Chapter 7

[33] Horn & Johnson 1985, Theorem 7.2.1

[34] Horn & Johnson 1985, Example 4.0.6, p. 169

[35] Brown 1991, Deﬁnition III.2.1

[36] Brown 1991, Theorem III.2.12

[37] Brown 1991, Corollary III.2.16

[38] Mirsky 1990, Theorem 1.4.1

[39] Brown 1991, Theorem III.3.18

[40] Brown 1991, Deﬁnition III.4.1

[41] Brown 1991, Deﬁnition III.4.9

[42] Brown 1991, Corollary III.4.10

[43] Householder 1975, Ch. 7

[44] Bau III & Trefethen 1997

[45] Golub & Van Loan 1996, Algorithm 1.3.1

[46] Golub & Van Loan 1996, Chapters 9 and 10, esp. section 10.2

[47] Golub & Van Loan 1996, Chapter 2.3

[48] For example, Mathematica, see Wolfram 2003, Ch. 3.7

[49] Press, Flannery & Teukolsky 1992

[50] Stoer & Bulirsch 2002, Section 4.1

[51] Horn & Johnson 1985, Theorem 2.5.4

[52] Horn & Johnson 1985, Ch. 3.1, 3.2

[53] Arnold & Cooke 1992, Sections 14.5, 7, 8

[54] Bronson 1989, Ch. 15

[55] Coburn 1955, Ch. V

[56] Lang 2002, Chapter XIII

[57] Lang 2002, XVII.1, p. 643

[58] Lang 2002, Proposition XIII.4.16

[59] Reichl 2004, Section L.2

[60] Greub 1975, Section III.3

[61] Greub 1975, Section III.3.13

[62] See any standard reference in group.

[63] Baker 2003, Def. 1.30

[64] Baker 2003, Theorem 1.2

[65] Artin 1991, Chapter 4.5

[66] Rowen 2008, Example 19.2, p. 198

[67] See any reference in representation theory or group representation.

[68] See the item “Matrix” in Itõ, ed. 1987

CHAPTER 8. MATRIX (MATHEMATICS)

8.13. NOTES

109

[69] “Empty Matrix: A matrix is empty if either its row or column dimension is zero”, Glossary, O-Matrix v6 User Guide

[70] “A matrix having at least one dimension equal to zero is called an empty matrix”, MATLAB Data Structures

[71] Fudenberg & Tirole 1983, Section 1.1.1

[72] Manning 1999, Section 15.3.4

[73] Ward 1997, Ch. 2.8

[74] Stinson 2005, Ch. 1.1.5 and 1.2.4

[75] Association for Computing Machinery 1979, Ch. 7

[76] Godsil & Royle 2004, Ch. 8.1

[77] Punnen 2002

[78] Lang 1987a, Ch. XVI.6

[79] Nocedal 2006, Ch. 16

[80] Lang 1987a, Ch. XVI.1

[81] Lang 1987a, Ch. XVI.5. For a more advanced, and more general statement see Lang 1969, Ch. VI.2

[82] Gilbarg & Trudinger 2001

[83] Šolin 2005, Ch. 2.5. See also stiﬀness method.

[84] Latouche & Ramaswami 1999

[85] Mehata & Srinivasan 1978, Ch. 2.8

[86] Healy, Michael (1986), Matrices for Statistics, Oxford University Press, ISBN 978-0-19-850702-4

[87] Krzanowski 1988, Ch. 2.2., p. 60

[88] Krzanowski 1988, Ch. 4.1

[89] Conrey 2007

[90] Zabrodin, Brezin & Kazakov et al. 2006

[91] Itzykson & Zuber 1980, Ch. 2

[92] see Burgess & Moore 2007, section 1.6.3. (SU(3)), section 2.4.3.2. (Kobayashi–Maskawa matrix)

[93] Schiﬀ 1968, Ch. 6

[94] Bohm 2001, sections II.4 and II.8

[95] Weinberg 1995, Ch. 3

[96] Wherrett 1987, part II

[97] Riley, Hobson & Bence 1997, 7.17

[98] Guenther 1990, Ch. 5

[99] Shen, Crossley & Lun 1999 cited by Bretscher 2005, p. 1

[100] Discrete Mathematics 4th Ed. Dossey, Otto, Spense, Vanden Eynden, Published by Addison Wesley, October 10, 2001

ISBN 978-0321079121 | p.564-565

[101] Needham, Joseph; Wang Ling (1959). Science and Civilisation in China III. Cambridge: Cambridge University Press. p.

117. ISBN 9780521058018.

[102] Discrete Mathematics 4th Ed. Dossey, Otto, Spense, Vanden Eynden, Published by Addison Wesley, October 10, 2001

ISBN 978-0321079121 | p.564

[103] Merriam–Webster dictionary, Merriam–Webster, retrieved April 20, 2009

110

CHAPTER 8. MATRIX (MATHEMATICS)

[104] Although many sources state that J. J. Sylvester coined the mathematical term “matrix” in 1848, Sylvester published nothing

in 1848. (For proof that Sylvester published nothing in 1848, see: J. J. Sylvester with H. F. Baker, ed., The Collected

Mathematical Papers of James Joseph Sylvester (Cambridge, England: Cambridge University Press, 1904), vol. 1.) His

earliest use of the term “matrix” occurs in 1850 in: J. J. Sylvester (1850) “Additions to the articles in the September

number of this journal, “On a new class of theorems,” and on Pascal’s theorem,” The London, Edinburgh and Dublin

Philosophical Magazine and Journal of Science, 37 : 363-370. From page 369: “For this purpose we must commence, not

with a square, but with an oblong arrangement of terms consisting, suppose, of m lines and n columns. This will not in

itself represent a determinant, but is, as it were, a Matrix out of which we may form various systems of determinants … "

[105] The Collected Mathematical Papers of James Joseph Sylvester: 1837–1853, Paper 37, p. 247

[106] Phil.Trans. 1858, vol.148, pp.17-37 Math. Papers II 475-496

[107] Dieudonné, ed. 1978, Vol. 1, Ch. III, p. 96

[108] Knobloch 1994

[109] Hawkins 1975

[110] Kronecker 1897

[111] Weierstrass 1915, pp. 271–286

[112] Bôcher 2004

[113] Mehra & Rechenberg 1987

[114] Whitehead, Alfred North; and Russell, Bertrand (1913) Principia Mathematica to *56, Cambridge at the University Press,

Cambridge UK (republished 1962) cf page 162ﬀ.

[115] Tarski, Alfred; (1946) Introduction to Logic and the Methodology of Deductive Sciences, Dover Publications, Inc, New York

NY, ISBN 0-486-28462-X.

[1] Eigen means “own” in German and in Dutch.

[2] Additionally, the group is required to be closed in the general linear group.

[3] “Not much of matrix theory carries over to inﬁnite-dimensional spaces, and what does is not so useful, but it sometimes

helps.” Halmos 1982, p. 23, Chapter 5

8.14 References

• Anton, Howard (1987), Elementary Linear Algebra (5th ed.), New York: Wiley, ISBN 0-471-84819-0

• Arnold, Vladimir I.; Cooke, Roger (1992), Ordinary diﬀerential equations, Berlin, DE; New York, NY:

Springer-Verlag, ISBN 978-3-540-54813-3

• Artin, Michael (1991), Algebra, Prentice Hall, ISBN 978-0-89871-510-1

• Association for Computing Machinery (1979), Computer Graphics, Tata McGraw–Hill, ISBN 978-0-07-0593763

• Baker, Andrew J. (2003), Matrix Groups: An Introduction to Lie Group Theory, Berlin, DE; New York, NY:

Springer-Verlag, ISBN 978-1-85233-470-3

• Bau III, David; Trefethen, Lloyd N. (1997), Numerical linear algebra, Philadelphia, PA: Society for Industrial

and Applied Mathematics, ISBN 978-0-89871-361-9

• Beauregard, Raymond A.; Fraleigh, John B. (1973), A First Course In Linear Algebra: with Optional Introduction to Groups, Rings, and Fields, Boston: Houghton Miﬄin Co., ISBN 0-395-14017-X

• Bretscher, Otto (2005), Linear Algebra with Applications (3rd ed.), Prentice Hall

• Bronson, Richard (1970), Matrix Methods: An Introduction, New York: Academic Press, LCCN 70097490

• Bronson, Richard (1989), Schaum’s outline of theory and problems of matrix operations, New York: McGraw–

Hill, ISBN 978-0-07-007978-6

8.14. REFERENCES

111

• Brown, William C. (1991), Matrices and vector spaces, New York, NY: Marcel Dekker, ISBN 978-0-82478419-5

• Coburn, Nathaniel (1955), Vector and tensor analysis, New York, NY: Macmillan, OCLC 1029828

• Conrey, J. Brian (2007), Ranks of elliptic curves and random matrix theory, Cambridge University Press, ISBN

978-0-521-69964-8

• Fraleigh, John B. (1976), A First Course In Abstract Algebra (2nd ed.), Reading: Addison-Wesley, ISBN 0201-01984-1

• Fudenberg, Drew; Tirole, Jean (1983), Game Theory, MIT Press

• Gilbarg, David; Trudinger, Neil S. (2001), Elliptic partial diﬀerential equations of second order (2nd ed.),

Berlin, DE; New York, NY: Springer-Verlag, ISBN 978-3-540-41160-4

• Godsil, Chris; Royle, Gordon (2004), Algebraic Graph Theory, Graduate Texts in Mathematics 207, Berlin,

DE; New York, NY: Springer-Verlag, ISBN 978-0-387-95220-8

• Golub, Gene H.; Van Loan, Charles F. (1996), Matrix Computations (3rd ed.), Johns Hopkins, ISBN 978-08018-5414-9

• Greub, Werner Hildbert (1975), Linear algebra, Graduate Texts in Mathematics, Berlin, DE; New York, NY:

Springer-Verlag, ISBN 978-0-387-90110-7

• Halmos, Paul Richard (1982), A Hilbert space problem book, Graduate Texts in Mathematics 19 (2nd ed.),

Berlin, DE; New York, NY: Springer-Verlag, ISBN 978-0-387-90685-0, MR 675952

• Horn, Roger A.; Johnson, Charles R. (1985), Matrix Analysis, Cambridge University Press, ISBN 978-0-52138632-6

• Householder, Alston S. (1975), The theory of matrices in numerical analysis, New York, NY: Dover Publications, MR 0378371

• Kreyszig, Erwin (1972), Advanced Engineering Mathematics (3rd ed.), New York: Wiley, ISBN 0-471-507288.

• Krzanowski, Wojtek J. (1988), Principles of multivariate analysis, Oxford Statistical Science Series 3, The

Clarendon Press Oxford University Press, ISBN 978-0-19-852211-9, MR 969370

• Itõ, Kiyosi, ed. (1987), Encyclopedic dictionary of mathematics. Vol. I-IV (2nd ed.), MIT Press, ISBN 978-0262-09026-1, MR 901762

• Lang, Serge (1969), Analysis II, Addison-Wesley

• Lang, Serge (1987a), Calculus of several variables (3rd ed.), Berlin, DE; New York, NY: Springer-Verlag,

ISBN 978-0-387-96405-8

• Lang, Serge (1987b), Linear algebra, Berlin, DE; New York, NY: Springer-Verlag, ISBN 978-0-387-96412-6

• Lang, Serge (2002), Algebra, Graduate Texts in Mathematics 211 (Revised third ed.), New York: SpringerVerlag, ISBN 978-0-387-95385-4, MR 1878556

• Latouche, Guy; Ramaswami, Vaidyanathan (1999), Introduction to matrix analytic methods in stochastic modeling (1st ed.), Philadelphia, PA: Society for Industrial and Applied Mathematics, ISBN 978-0-89871-425-8

• Manning, Christopher D.; Schütze, Hinrich (1999), Foundations of statistical natural language processing, MIT

Press, ISBN 978-0-262-13360-9

• Mehata, K. M.; Srinivasan, S. K. (1978), Stochastic processes, New York, NY: McGraw–Hill, ISBN 978-0-07096612-3

• Mirsky, Leonid (1990), An Introduction to Linear Algebra, Courier Dover Publications, ISBN 978-0-48666434-7

• Nering, Evar D. (1970), Linear Algebra and Matrix Theory (2nd ed.), New York: Wiley, LCCN 76-91646

112

CHAPTER 8. MATRIX (MATHEMATICS)

• Nocedal, Jorge; Wright, Stephen J. (2006), Numerical Optimization (2nd ed.), Berlin, DE; New York, NY:

Springer-Verlag, p. 449, ISBN 978-0-387-30303-1

• Oualline, Steve (2003), Practical C++ programming, O'Reilly, ISBN 978-0-596-00419-4

• Press, William H.; Flannery, Brian P.; Teukolsky, Saul A.; Vetterling, William T. (1992), “LU Decomposition and Its Applications”, Numerical Recipes in FORTRAN: The Art of Scientiﬁc Computing (PDF) (2nd ed.),

Cambridge University Press, pp. 34–42

• Protter, Murray H.; Morrey, Jr., Charles B. (1970), College Calculus with Analytic Geometry (2nd ed.), Reading:

Addison-Wesley, LCCN 76087042

• Punnen, Abraham P.; Gutin, Gregory (2002), The traveling salesman problem and its variations, Boston, MA:

Kluwer Academic Publishers, ISBN 978-1-4020-0664-7

• Reichl, Linda E. (2004), The transition to chaos: conservative classical systems and quantum manifestations,

Berlin, DE; New York, NY: Springer-Verlag, ISBN 978-0-387-98788-0

• Rowen, Louis Halle (2008), Graduate Algebra: noncommutative view, Providence, RI: American Mathematical

Society, ISBN 978-0-8218-4153-2

• Šolin, Pavel (2005), Partial Diﬀerential Equations and the Finite Element Method, Wiley-Interscience, ISBN

978-0-471-76409-0

• Stinson, Douglas R. (2005), Cryptography, Discrete Mathematics and its Applications, Chapman & Hall/CRC,

ISBN 978-1-58488-508-5

• Stoer, Josef; Bulirsch, Roland (2002), Introduction to Numerical Analysis (3rd ed.), Berlin, DE; New York,

NY: Springer-Verlag, ISBN 978-0-387-95452-3

• Ward, J. P. (1997), Quaternions and Cayley numbers, Mathematics and its Applications 403, Dordrecht, NL:

Kluwer Academic Publishers Group, ISBN 978-0-7923-4513-8, MR 1458894

• Wolfram, Stephen (2003), The Mathematica Book (5th ed.), Champaign, IL: Wolfram Media, ISBN 978-157955-022-6

8.14.1

Physics references

• Bohm, Arno (2001), Quantum Mechanics: Foundations and Applications, Springer, ISBN 0-387-95330-2

• Burgess, Cliﬀ; Moore, Guy (2007), The Standard Model. A Primer, Cambridge University Press, ISBN 0-52186036-9

• Guenther, Robert D. (1990), Modern Optics, John Wiley, ISBN 0-471-60538-7

• Itzykson, Claude; Zuber, Jean-Bernard (1980), Quantum Field Theory, McGraw–Hill, ISBN 0-07-032071-3

• Riley, Kenneth F.; Hobson, Michael P.; Bence, Stephen J. (1997), Mathematical methods for physics and

engineering, Cambridge University Press, ISBN 0-521-55506-X

• Schiﬀ, Leonard I. (1968), Quantum Mechanics (3rd ed.), McGraw–Hill

• Weinberg, Steven (1995), The Quantum Theory of Fields. Volume I: Foundations, Cambridge University Press,

ISBN 0-521-55001-7

• Wherrett, Brian S. (1987), Group Theory for Atoms, Molecules and Solids, Prentice–Hall International, ISBN

0-13-365461-3

• Zabrodin, Anton; Brezin, Édouard; Kazakov, Vladimir; Serban, Didina; Wiegmann, Paul (2006), Applications

of Random Matrices in Physics (NATO Science Series II: Mathematics, Physics and Chemistry), Berlin, DE; New

York, NY: Springer-Verlag, ISBN 978-1-4020-4530-1

8.15. EXTERNAL LINKS

8.14.2

113

Historical references

• A. Cayley A memoir on the theory of matrices. Phil. Trans. 148 1858 17-37; Math. Papers II 475-496

• Bôcher, Maxime (2004), Introduction to higher algebra, New York, NY: Dover Publications, ISBN 978-0-48649570-5, reprint of the 1907 original edition

• Cayley, Arthur (1889), The collected mathematical papers of Arthur Cayley, I (1841–1853), Cambridge University Press, pp. 123–126

• Dieudonné, Jean, ed. (1978), Abrégé d'histoire des mathématiques 1700-1900, Paris, FR: Hermann

• Hawkins, Thomas (1975), “Cauchy and the spectral theory of matrices”, Historia Mathematica 2: 1–29,

doi:10.1016/0315-0860(75)90032-4, ISSN 0315-0860, MR 0469635

• Knobloch, Eberhard (1994), “From Gauss to Weierstrass: determinant theory and its historical evaluations”,

The intersection of history and mathematics, Science Networks Historical Studies 15, Basel, Boston, Berlin:

Birkhäuser, pp. 51–66, MR 1308079

• Kronecker, Leopold (1897), Hensel, Kurt, ed., Leopold Kronecker’s Werke, Teubner

• Mehra, Jagdish; Rechenberg, Helmut (1987), The Historical Development of Quantum Theory (1st ed.), Berlin,

DE; New York, NY: Springer-Verlag, ISBN 978-0-387-96284-9

• Shen, Kangshen; Crossley, John N.; Lun, Anthony Wah-Cheung (1999), Nine Chapters of the Mathematical

Art, Companion and Commentary (2nd ed.), Oxford University Press, ISBN 978-0-19-853936-0

• Weierstrass, Karl (1915), Collected works 3

8.15 External links

Encyclopedic articles

• Hazewinkel, Michiel, ed. (2001), “Matrix”, Encyclopedia of Mathematics, Springer, ISBN 978-1-55608-010-4

History

• MacTutor: Matrices and determinants

• Matrices and Linear Algebra on the Earliest Uses Pages

• Earliest Uses of Symbols for Matrices and Vectors

Online books

• Kaw, Autar K., Introduction to Matrix Algebra, ISBN 978-0-615-25126-4

• The Matrix Cookbook (PDF), retrieved 24 March 2014

• Brookes, Mike (2005), The Matrix Reference Manual, London: Imperial College, retrieved 10 Dec 2008

Online matrix calculators

• SimplyMath (Matrix Calculator)

• Matrix Calculator (DotNumerics)

• Xiao, Gang, Matrix calculator, retrieved 10 Dec 2008

• Online matrix calculator, retrieved 10 Dec 2008

• Online matrix calculator (ZK framework), retrieved 26 Nov 2009

114

CHAPTER 8. MATRIX (MATHEMATICS)

• Oehlert, Gary W.; Bingham, Christopher, MacAnova, University of Minnesota, School of Statistics, retrieved

10 Dec 2008, a freeware package for matrix algebra and statistics

• Online matrix calculator, retrieved 14 Dec 2009

• Operation with matrices in R (determinant, track, inverse, adjoint, transpose)

Chapter 9

Vertex (graph theory)

For other uses, see Vertex (disambiguation).

In mathematics, and more speciﬁcally in graph theory, a vertex (plural vertices) or node is the fundamental unit

6

5

4

1

2

3

A graph with 6 vertices and 7 edges where the vertex number 6 on the far-left is a leaf vertex or a pendant vertex

of which graphs are formed: an undirected graph consists of a set of vertices and a set of edges (unordered pairs

of vertices), while a directed graph consists of a set of vertices and a set of arcs (ordered pairs of vertices). In a

diagram of a graph, a vertex is usually represented by a circle with a label, and an edge is represented by a line or

arrow extending from one vertex to another.

From the point of view of graph theory, vertices are treated as featureless and indivisible objects, although they may

have additional structure depending on the application from which the graph arises; for instance, a semantic network

is a graph in which the vertices represent concepts or classes of objects.

The two vertices forming an edge are said to be the endpoints of this edge, and the edge is said to be incident to the

vertices. A vertex w is said to be adjacent to another vertex v if the graph contains an edge (v,w). The neighborhood

of a vertex v is an induced subgraph of the graph, formed by all vertices adjacent to v.

115

116

CHAPTER 9. VERTEX (GRAPH THEORY)

9.1 Types of vertices

The degree of a vertex in a graph is the number of edges incident to it. An isolated vertex is a vertex with degree

zero; that is, a vertex that is not an endpoint of any edge. A leaf vertex (also pendant vertex) is a vertex with degree

one. In a directed graph, one can distinguish the outdegree (number of outgoing edges) from the indegree (number

of incoming edges); a source vertex is a vertex with indegree zero, while a sink vertex is a vertex with outdegree

zero.

A cut vertex is a vertex the removal of which would disconnect the remaining graph; a vertex separator is a collection

of vertices the removal of which would disconnect the remaining graph into small pieces. A k-vertex-connected graph

is a graph in which removing fewer than k vertices always leaves the remaining graph connected. An independent

set is a set of vertices no two of which are adjacent, and a vertex cover is a set of vertices that includes at least

one endpoint of each edge in the graph. The vertex space of a graph is a vector space having a set of basis vectors

corresponding with the graph’s vertices.

A graph is vertex-transitive if it has symmetries that map any vertex to any other vertex. In the context of graph

enumeration and graph isomorphism it is important to distinguish between labeled vertices and unlabeled vertices.

A labeled vertex is a vertex that is associated with extra information that enables it to be distinguished from other

labeled vertices; two graphs can be considered isomorphic only if the correspondence between their vertices pairs up

vertices with equal labels. An unlabeled vertex is one that can be substituted for any other vertex based only on its

adjacencies in the graph and not based on any additional information.

Vertices in graphs are analogous to, but not the same as, vertices of polyhedra: the skeleton of a polyhedron forms a

graph, the vertices of which are the vertices of the polyhedron, but polyhedron vertices have additional structure (their

geometric location) that is not assumed to be present in graph theory. The vertex ﬁgure of a vertex in a polyhedron

is analogous to the neighborhood of a vertex in a graph.

9.2 See also

• Node (computer science)

• Graph theory

• Glossary of graph theory

9.3 References

• Gallo, Giorgio; Pallotino, Stefano (1988). “Shortest path algorithms”. Annals of Operations Research 13 (1):

1–79. doi:10.1007/BF02288320.

• Berge, Claude, Théorie des graphes et ses applications. Collection Universitaire de Mathématiques, II Dunod,

Paris 1958, viii+277 pp. (English edition, Wiley 1961; Methuen & Co, New York 1962; Russian, Moscow

1961; Spanish, Mexico 1962; Roumanian, Bucharest 1969; Chinese, Shanghai 1963; Second printing of the

1962 ﬁrst English edition. Dover, New York 2001)

• Chartrand, Gary (1985). Introductory graph theory. New York: Dover. ISBN 0-486-24775-9.

• Biggs, Norman; Lloyd, E. H.; Wilson, Robin J. (1986). Graph theory, 1736-1936. Oxford [Oxfordshire]:

Clarendon Press. ISBN 0-19-853916-9.

• Harary, Frank (1969). Graph theory. Reading, Mass.: Addison-Wesley Publishing. ISBN 0-201-41033-8.

• Harary, Frank; Palmer, Edgar M. (1973). Graphical enumeration. New York, Academic Press. ISBN 0-12324245-2.

9.4 External links

• Weisstein, Eric W., “Graph Vertex”, MathWorld.

9.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES

117

9.5 Text and image sources, contributors, and licenses

9.5.1

Text

• Computer science Source: https://en.wikipedia.org/wiki/Computer_science?oldid=668898640 Contributors: AxelBoldt, Derek Ross,

LC~enwiki, Lee Daniel Crocker, Tuxisuau, Brion VIBBER, Mav, Robert Merkel, Espen, The Anome, Tarquin, Taw, Jzcool, DanKeshet,

Andre Engels, Khendon, LA2, Jkominek, Aldie, Fubar Obfusco, SolKarma, SimonP, Peterlin~enwiki, Hannes Hirzel, Ole Aamot,

Camembert, B4hand, Hephaestos, Olivier, Stevertigo, Ghyll~enwiki, DrewT2, JohnOwens, Ted~enwiki, Michael Hardy, Erik Zachte,

Gretyl, Kwertii, JakeVortex, Dante Alighieri, Fuzzie, Rp, Bensmith, Mic, Ixfd64, Phoe6, Sannse, TakuyaMurata, Delirium, Loisel,

7265, Minesweeper, Pcb21, Kvikeg, MartinSpamer, Ahoerstemeier, Haakon, Stan Shebs, Docu, J-Wiki, Kazuo Moriwaka, Angela,

Jdforrester, Salsa Shark, Glenn, Cyan, LouI, Poor Yorick, Nikai, Azazello, Kwekubo, Jiang, Cryoboy, Rob Hooft, Jonik, Mxn, BRG,

Denny, Dgreen34, Schneelocke, Nikola Smolenski, Revolver, Popsracer, Charles Matthews, Guaka, Timwi, Dcoetzee, Sbwoodside, Dysprosia, Jitse Niesen, Jay, Daniel Quinlan, Michaeln, Greenrd, Quux, HappyDog, Tpbradbury, Maximus Rex, Cleduc, Morwen, Buridan, Ed g2s, Persoid, Mikez80, Wakka, Wernher, Bevo, Spikey, Traroth, Shizhao, Farmerchris, Dbabbitt, Raul654, Jim Mahoney,

Marc Girod~enwiki, Guppy, Carbuncle, ThereIsNoSteve, RadicalBender, Robbot, Sdedeo, Fredrik, Hobbes~enwiki, Soilguy2, R3m0t,

RedWolf, Troworld, Altenmann, Naddy, Lowellian, Chris Roy, Mirv, MathMartin, Merovingian, Hellotoast, Rfc1394, Academic Challenger, Texture, Bethenco, Diderot, Hadal, Nerval, Borislav, MOiRe, Pps, Bshankaran, Anthony, Lupo, HaeB, TexasDex, Guy Peters,

Xanzzibar, Iain.mcclatchie, Pengo, Tobias Bergemann, Applegoddess, Ancheta Wis, Decumanus, Honta, Gbali, Giftlite, Thv, Fennec,

Kenny sh, Netoholic, Abigail-II, Levin, Lupin, Zigger, Everyking, Henry Flower, Guanaco, Eequor, Matt Crypto, Just Another Dan,

Arvind Singh, Wmahan, Neilc, Quackor, Andycjp, Dullhunk, Bact, Kjetil r, Mineminemine, Antandrus, BozMo, Thray, Billposer,

APH, Josephgrossberg, Kntg, Bumm13, Sovereigna, Eiserlohpp, Leire Sánchez, Robin klein, Fvilim~enwiki, Andreas Kaufmann, Zondor, Grunt, EagleOne, Bluemask, Corti, Perl guy, Jwdietrich2, MichaelMcGuﬃn, Smimram, Erc, Discospinster, Leibniz, Notinasnaid,

SocratesJedi, Michael Zimmermann, Mani1, BBB~enwiki, Bender235, ESkog, Android79, Kbh3rd, S.K., Mattingly23, Project2501a,

Relix~enwiki, Barfooz, Linn~enwiki, Barcelova, Briséis~enwiki, RoyBoy, Bookofjude, Matteh, Aaronbrick, Coolcaesar, Bobo192, Smalljim, Shenme, Matt Britt, Maurreen, NattyBumppo, Sam Korn, Haham hanuka, Pearle, Mpeisenbr, Nsaa, Mdd, Passw0rd, Poweroid, Alansohn, Liao, Pinar, Samuel.Jones, Tek022, Jason Davies, Hellhound, TheVenerableBede, Walkerma, InShaneee, Hu, Katefan0, DoesPHalt,

Caesura, Polyphilo, Shinjiman, Wtmitchell, Velella, Shepshep, Cburnett, CloudNine, Mikeo, MIT Trekkie, HenryLi, Bookandcoﬀee,

Oleg Alexandrov, SimonW, Ott, Alex.g, Novacatz, Soultaco, Marasmusine, Woohookitty, Debuggar, Uncle G, Robert K S, Ruud Koot,

JeremyA, Orz, MONGO, Nakos2208~enwiki, Shmitra, Al E., TreveX, Ralﬁpedia, Sega381, Z80x86, Graham87, Qwertyus, Chun-hian,

SixWingedSeraph, OMouse, Reisio, Rjwilmsi, Mayumashu, MarSch, Materdaddy, Nneonneo, Ddawson, Jhballard, Bubba73, Brighterorange, The wub, Mkehrt, Kwharris, Sango123, Oo64eva, Leithp, Sheldrake, FayssalF, Johnnyw, Old Moonraker, Mathbot, Crazycomputers, Vsion, Makkuro, TheDJ, Intgr, SpectrumDT, BMF81, Jersey Devil, Bgwhite, Gwernol, Flcelloguy, Jayme, Eray~enwiki, The

Rambling Man, Wavelength, Spacepotato, Angus Lepper, Phantomsteve, RussBot, Jeﬀhoy, Hyad, Piet Delport, Epolk, SpuriousQ, Thoreaulylazy, Stephenb, Gaius Cornelius, Bovineone, Wimt, Anomalocaris, CarlHewitt, Vanished user kjdioejh329io3rksdkj, Mipadi, Grafen,

Jaxl, Ino5hiro, Bobbo, Hakkinen, Anetode, Yym, Jstrater, Jpbowen, JulesH, E rulez, Petr.adamek, Mgnbar, Tigershrike, Light current,

MCB, Sterling, Shimei, The Fish, Claygate, GraemeL, Joshua bigamo, Bachmann1234, Donhalcon, Katieh5584, Kungfuadam, Junglecat, Zvika, DVD R W, Finell, Hide&Reason, Thijswijs, SmackBot, Wilycoder, Sparkz08, Rtc, Slashme, Zanter, Olorin28, K-UNIT,

McGeddon, Brick Thrower, Mmeri, CapitalSasha, Jpvinall, Powo, Gilliam, Ohnoitsjamie, Skizzik, RickiRich, Tv316, Somewherepurple,

Bluebot, Nympheta, Crashuniverse, Jprg1966, Technotaoist, Miquonranger03, Fluri, LaggedOnUser, Spellchecker, Dzonatas, Krallja,

A. B., Dﬂetter, Rrelf, Fireduck, Can't sleep, clown will eat me, Readams, Andri12, Vanished User 0001, Edivorce, Allan McInnes,

Robma, Cybercobra, Jonovision, “alyosha”, MisterCharlie, Dreadstar, Richard001, Tompsci, Iridescence, Brycedrm, JohnC1987, Ultraexactzz, Sigma 7, Zito ta xania, Fyver528, Nazgul533, Lambiam, ArglebargleIV, SilverStar, Harryboyles, Kuru, Treyt021, IAENG,

AlphaTwo, Msc44, Evanx, IronGargoyle, Edenphd, Physis, Ekrub-ntyh, Ckatz, 16@r, JHunterJ, Slakr, Emerybob, Avs5221, Dicklyon, Tee Owe, Allamericanbear, Eridani, Dhp1080, RichardF, Xionbox, Beefyt, Hu12, Lucid, Levineps, DouglasCalvert, Siebrand,

OnBeyondZebrax, Iridescent, Onestone, Markan~enwiki, Xsmith, Joseph Solis in Australia, Pegasus1138, Aeternus, Igoldste, Crippled Sloth, Courcelles, FairuseBot, Tawkerbot2, Jwalls, CRGreathouse, Ahy1, CBM, Banedon, NaBUru38, NickW557, Requestion,

Leujohn, Myasuda, Simeon, Gregbard, Mac010382, Bobthesmiley, Porco-esphino, Gogo Dodo, Blaisorblade, Christian75, Chrislk02,

Garik, Kozuch, Daven200520, Omicronpersei8, EnglishEfternamn, Epbr123, ClosedEyesSeeing, Hunan131, Headbomb, Newton2, Louis

Waweru, Ideogram, Thiaguwin, Mikeeg555, Druiloor, Klausness, Dawnseeker2000, Escarbot, AntiVandalBot, BokicaK, Luna Santin,

Seaphoto, Olexandr Kravchuk, Poshzombie, Superzohar, Mihas, Kdano, Carewolf, Hermel, JAnDbot, Niaz, Husond, Jimothytrotter,

Nthep, Mark Shaw, Rstevens27, Aviroop Ghosh, Fourchannel, Dream Focus, Bookinvestor, Raanoo, 4jobs, Bongwarrior, VoABot II,

Nyq, Necklace, JamesBWatson, Appraiser, Jlenthe, Cadsuane Melaidhrin, Rivertorch, Nikevich, Indon, Nucleophilic, ArchStanton69,

Allstarecho, Bmeguru, JaGa, Kgﬂeischmann, Esanchez7587, D.h, Calltech, Pavel Jelínek, Gwern, Hdt83, MartinBot, Mouhanad alramli,

Anaxial, CommonsDelinker, Pacdude9, Erkan Yilmaz, J.delanoy, Pedrito, Trusilver, Metamusing, Sandeepgupta, Ps ttf, Maurice Carbonaro, Rodhilton, Mike.lifeguard, Christian Storm, Tparameter, The Transhumanist (AWB), NewEnglandYankee, Hennessey, Patrick,

Brian Pearson, Sanscrit1234, Jevansen, Bonadea, Dzenanz, User77764, Regenspaziergang, Neil Dodgson, Cromoser, Idioma-bot, Sheliak, Wikieditor06, Vranak, 28bytes, Hersfold, Fossum~enwiki, Balaji7, MagicBanana, Barneca, Philip Trueman, TXiKiBoT, Coder Dan,

Austin Henderson, The Original Wildbear, Technopat, Sparkzy, Tomsega, Tms9980, Ocolon, T-Solo202, Ferengi, Metalmaniac69, Jackfork, Psyche825, Noformation, Everything counts, The Divine Fluﬀalizer, ARUNKUMAR P.R, Hankhuck, Andy Dingley, Julcia, Yk

Yk Yk, Wolfrock, Piecemealcranky, Careercornerstone, Lake Greifen, Oldwes, Nighthawk19, Insanity Incarnate, Sebastjanmm, Pjoef,

Palaeovia, E. H.-A. Gerbracht, Demize, NHRHS2010, Matthe20, D. Recorder, S.Örvarr.S, SieBot, EllenPetersen, Dawn Bard, Poisoncarter, Bruchowski, Ham Pastrami, Jerryobject, Happysailor, Flyer22, Radon210, JCLately, JetLover, JSpung, Aruton, Oxymoron83,

Anjin-san, Vpovilaitis, Lightmouse, Poindexter Propellerhead, Ceas webmaster, StaticGull, Mori Riyo~enwiki, Maxime.Debosschere,

Denisarona, Savie Kumara, Kayvan45622, Martarius, Sfan00 IMG, ClueBot, MBD123, Bwfrank, Foxj, The Thing That Should Not Be,

Chocoforfriends, Keeper76, HairyFotr, Diana cionoiu, Meisterkoch, Ndenison, Keraunoscopia, R000t, WDavis1911, Der Golem, Uncle

Milty, Agogino, SuperHamster, Niceguyedc, Zow, Amomam, Darkstar56, Jmcangas, Masterpiece2000, Excirial, Pumpmeup, Bedwanimas214, Diderot’s dreams, Jusdafax, Waiwai933, Farisori, John Nevard, Jakraay, Hezarfenn, Muhandes, Buscalade, Alejandrocaro35,

Sun Creator, Turnipface, Brianbjparker, Hans Adler, Morel, H.Marxen, ChrisHamburg, Thehelpfulone, GlasGhost, La Pianista, Thingg,

Hunhot, PCHS-NJROTC, Apparition11, DumZiBoT, AzraelUK, XLinkBot, Spitﬁre, Pichpich, Mohammadshamma, Rror, Pasha11,

Pimpedshortie99, Dhall1245, Little Mountain 5, Srikant.sharma, Dimoes, MCR789, Skarebo, WikHead, Galzigler, Airplaneman, Branrile09, Ackmenm, Max the tenken, Maimai009, Addbot, Some jerk on the Internet, DOI bot, Farzan mc, Betterusername, Elsendero,

CanadianLinuxUser, MrOllie, Download, LaaknorBot, Favonian, West.andrew.g, 5 albert square, Unknown483, Gusisgay, Cupat07, Systemetsys, Tide rolls, Bﬁgura’s puppy, Verbal, Teles, Jarble, Luckas-bot, Yobot, OrgasGirl, Fraggle81, MarioS, Cyanoa Crylate, SergeyJ,

118

CHAPTER 9. VERTEX (GRAPH THEORY)

Jnivekk, KamikazeBot, Khalfani khaldun, Sajibcse, Backslash Forwardslash, AnomieBOT, DemocraticLuntz, Jim1138, IRP, Galoubet, Royote, JackieBot, 9258fahsﬂkh917fas, Piano non troppo, Danielt998, Law, Flewis, Lilgip01, Giants27, Materialscientist, Rtyq2,

Salem F, Danno uk, Citation bot, Neurolysis, Roxxyroxursox, Quebec99, Xqbot, WikiNSK, Hubbard rox 2008, DSisyphBot, Grim23,

Raj Wijesinghe, Blix1ms0ns, Tyrol5, Miym, Deadbeatatdawn, Лев Дубовой, Shirik, Mathonius, Erstats, Amaury, Doulos Christos,

Dontknoa, Shadowjams, Methcub, CSgroup7, Luminique, Remshad, Velblod, CES1596, ESpublic013, FrescoBot, Skylark2008, Vitomontreal, Tobby72, Mark Renier, ToxicOranges, Recognizance, Vacuunaut, MTizz1, Machine Elf 1735, Louperibot, OgreBot, Citation bot 1, Dilaksan, MacMed, Pinethicket, Kiefer.Wolfowitz, BRUTE, Achraf52, Ezrdr, SpaceFlight89, Talbg, Meaghan, RandomStringOfCharacters, Jauhienij, Weylinp, Keri, Trappist the monk, SchreyP, Si23mk4n32i, Alexmilt, Lotje, Keith Cascio, Thefakeeditor, Ladies gifts, Weedwhacker128, Mttcmbs, Lysander89, Yondonjamts, DARTH SIDIOUS 2, Rednas1234, Saywhatman, Иъ Лю

Ха, Sarang, John.legal, Star-Syrup, Gnabi82, EmausBot, John of Reading, Acather96, WikitanvirBot, Pfuchs722, Surlyduﬀ50, Ibbn,

Tinytn, Xiaogaozi, Pratapy, Solarra, Tommy2010, Lightdarkend, Wikipelli, K6ka, Djembayz, Lucas Thoms, Sciprecision, AaronLLF,

Namastheg, BigMattyO, Cogiati, Spykeretro, Fæ, Josve05a, Bijuro, Steave77, H3llBot, Dennis714, Bveedu, Prashant Dey, Jay-Sebastos,

Vanished user ﬁjtji34toksdcknqrjn54yoimascj, Donner60, Junip~enwiki, Orange Suede Sofa, Rangoon11, Tijfo098, Danushka99999,

Srshetty94, TYelliot, 28bot, BigMatty93, Scotty16-2007, GreenEdu, Petrb, Hughleat, Signalizing, ClueBot NG, LogX, This lousy Tshirt, Satellizer, Sdht, Jcrwaford5, Fauzan, Hon-3s-T, Astew10, Dfarrell07, Bergbra, Rinaku, Cntras, Cnkids, O.Koslowski, Mcasswidmeyer, Widr, Tonywchen, Ashish Gaikwad, Ajjuddn, Lawsonstu, Saketmitra, Jk2q3jrklse, Helpful Pixie Bot, HMSSolent, Jkimdgu,

Wald, Wbm1058, Jiule0, Trunks ishida, Lowercase sigmabot, BG19bot, Furkhaocean, ISTB351, MusikAnimal, J991, Neutral current, FutureTrillionaire, Sickdartzepic, Cadiomals, Mayuri.sandhanshiv, CalaD33, Kairi p, Mihai.stefanache, Salesvery1, Bryson1410,

Zhenyanwang1, Sreedharram, Carso empires, Isacdaavid, Abilngeorge, Klilidiplomus, Pavankbk1113, Anbu121, XIN3N, LloydOlivier,

BattyBot, Computer tower, Mburkhol, Alkafriﬁras, ComputerScienceForum, Valueindian, Fagitcasey, E prosser, Varagrawal, The Illusive Man, GoShow, Chitraproject, JYBot, Tow, Mogism, Mani306, BlackHawkToughbook, Lugia2453, ַאְבָרָהם, Jamesx12345, Elcashini,

Zziccardi, Itoula, Snehlata1102, Ekips39, Faizan, Epicgenius, Babara150504, Crap12321, Littlejimmylel, Maggots187, Perfecshun,

Netiru, Agenbola1, Red-eyed demon, RG3Redskins, Eyesnore, PhantomTech, Tiberius6996, Satassi, Tentinator, Dad29, JpurvisUM,

Nbak, Kanoog, NJIT HUM dad29, Backendgaming, DavidLeighEllis, Diptytiw, Hollylilholly, Sibekoe, Spyglasses,

, Ginsuloft,

Quenhitran, MrLinkinPark333, Dannyruthe, Manul, TCMemoire, Rons corner, Jaaron95, Ritik2345678, Philroc, Sbrankov05, Magicalbeakz1, JaconaFrere, Indiasian mbe maﬁa, 7Sidz, Eaglepuﬀs, Kgeza71, CompSci, Bobobobobobobdup, Monkbot, Wigid, Vieque, Ahsannaweed101, James.hochadel, 1908rs, BobVermont, Swet.anzel mee, NishantRM, Stuartbrade, Chacha2001, Typherix, Crfranklin, Susith

Chandira Gts, Antithesisx, Oﬀy284, Robie024, Nigerhoe, Psychedgrad, ChamithN, Crystallizedcarbon, Prachi2812, Rider ranger47,

Yilinglou, Iman.haghdost, Hansguyholt, Yaourrrt, Pishcal, ErickaAgent, Astrachano, Yuil.Tr, K scheik, Swagkid1010, ABWarrick,

Niceguy69, KasparBot, Jamieddd, PACIFICASIAWiki, Brahma Pacey, Zakzak1112 and Anonymous: 1380

• Discrete mathematics Source: https://en.wikipedia.org/wiki/Discrete_mathematics?oldid=668426222 Contributors: AxelBoldt, General Wesc, Toby Bartels, Miguel~enwiki, Camembert, Bdesham, Michael Hardy, Nixdorf, Wapcaplet, TakuyaMurata, Nanshu, Snoyes,

Jdforrester, Poor Yorick, Rotem Dan, Andres, Mxn, Dgreen34, Charles Matthews, Dysprosia, McKay, Jni, Phil Boswell, Robbot, Josh

Cherry, R3m0t, Gandalf61, Tobias Bergemann, Marc Venot, Giftlite, DocWatson42, Nick8325, Tom harrison, Ds13, Guanaco, David

Battle, Knutux, APH, TiMike, Peter Kwok, Discospinster, Rich Farmbrough, Agnistus, Vsmith, Mani1, Goochelaar, ESkog, ZeroOne,

Zaslav, Kosmotheoria, Edwinstearns, RoyBoy, Obradovic Goran, Jumbuck, Msh210, Shoeﬂy, Igorpak, Oleg Alexandrov, Nahabedere,

MZMcBride, Bubba73, FlaBot, Jittat~enwiki, Psantora, Chobot, Kummi, YurikBot, Hairy Dude, Michael Slone, Hede2000, Bhny,

Stephenb, Chaos, Grafen, Trovatore, ZacBowling, Zwobot, Klutzy, Bbaumer, SimonMorgan, Majkl82, Sardanaphalus, JJL, SmackBot,

GoOdCoNtEnT, Kurykh, Silly rabbit, Taxipom, JonHarder, Jon Awbrey, N Shar, Drunken Pirate, Vriullop, JoshuaZ, MTSbot~enwiki,

Aeons, Dlohcierekim, JForget, CRGreathouse, Albert.white, Basawala, WeggeBot, Werratal, NoUser, Gregbard, Mike Christie, Christian75, Chrislk02, Epbr123, Hazmat2, Marek69, AntiVandalBot, JustOneJake, Seaphoto, JAnDbot, MER-C, The Transhumanist, Thenub314,

Avjoska, Jakob.scholbach, SwiftBot, BBar, David Eppstein, Vigyani, Tgeairn, Coppertwig, The Transhumanist (AWB), Ac3bf1, JohnBlackburne, LokiClock, VasilievVV, TXiKiBoT, Crohnie, Anna Lincoln, MarkMarek, Wikiisawesome, Lerdthenerd, Dmcq, Ohiostandard, EmxBot, Radagast3, Ivan Štambuk, Gerakibot, Atlandau, Xelgen, Dcspc, Jorgen W, ClueBot, Justin W Smith, The Thing That

Should Not Be, ChandlerMapBot, Alexbot, Muhandes, Cenarium, Bos Gaurus, PCHS-NJROTC, Marc van Leeuwen, Addbot, NjardarBot, MrVanBot, AndersBot, West.andrew.g, Teles, Zorrobot, Luckas-bot, Yobot, KamikazeBot, AnomieBOT, 1exec1, Galoubet,

JackieBot, Rtyq2, Citation bot, Wrelwser43, ArthurBot, Xqbot, Hydrox24, Tyrol5, GrouchoBot, Deadbeatatdawn, Shirik, Point-set

topologist, Charvest, Shadowjams, Adrignola, FrescoBot, Mark Renier, Citation bot 1, Kiefer.Wolfowitz, Carlc75, Mercy11, Trappist

the monk, Hurricoaster, Gf uip, EmausBot, Racerx11, Wikipelli, Darkﬁght, Bethnim, Akutagawa10, Werieth, Bollyjeﬀ, TomasMartin,

Lorem Ip, EdoBot, Petrb, ClueBot NG, Raiden10, Wcherowi, Bped1985, Rocker202, Jorgenev, Helpful Pixie Bot, Discreto, Cgnorthcutt,

Paolo Lipparini, SoniyaR, Ved.rigved, Yogesh2611, Brad7777, EricEnfermero, Cleanton, ChrisGualtieri, TheJJJunk, Frosty, Dhriti pati

sarkar 1641981, BenCluﬀ, GrettoBob, Jianhui67, Lagoset, Monkbot, SoSivr and Anonymous: 194

• Glossary of graph theory Source: https://en.wikipedia.org/wiki/Glossary_of_graph_theory?oldid=666492791 Contributors: Damian

Yerrick, XJaM, Nonenmac, Tomo, Edward, Patrick, Michael Hardy, Wshun, Booyabazooka, Dcljr, TakuyaMurata, GTBacchus, Eric119,

Charles Matthews, Dcoetzee, Dysprosia, Doradus, Reina riemann, Markhurd, Maximus Rex, Hyacinth, Populus, Altenmann, MathMartin,

Bkell, Giftlite, Dbenbenn, Brona, Sundar, GGordonWorleyIII, HorsePunchKid, Peter Kwok, D6, Rich Farmbrough, ArnoldReinhold, Paul

August, Bender235, Zaslav, Kjoonlee, Elwikipedista~enwiki, El C, Yitzhak, TheSolomon, A1kmm, 3mta3, Jérôme, Ricky81682, Rdvdijk, Oleg Alexandrov, Joriki, Linas, MattGiuca, Ruud Koot, Jwanders, Xiong, Lasunncty, SixWingedSeraph, Grammarbot, Tizio, Salix

alba, Mathbot, Margosbot~enwiki, Sunayana, Pojo, Quuxplusone, Vonkje, N8wilson, Chobot, Algebraist, YurikBot, Me and, Altoid,

Grubber, Archelon, Gaius Cornelius, Rick Norwood, Ott2, Closedmouth, SmackBot, Stux, Achab, Brick Thrower, Mgreenbe, Mcld,

[email protected], Lansey, Thechao, JLeander, DVanDyck, Quaeler, RekishiEJ, CmdrObot, Csaracho, Citrus538, Jokes Free4Me,

Cydebot, Starcrab, Quintopia, Ferris37, Scarpy, Headbomb, Salgueiro~enwiki, Spanningtree, David Eppstein, JoergenB, Kope, CopyToWiktionaryBot, R'n'B, Leyo, Mikhail Dvorkin, The Transliterator, Ratfox, MentorMentorum, Skaraoke, SanitySolipsism, Anonymous

Dissident, PaulTanenbaum, Ivan Štambuk, Whorush, Eggwadi, Thehotelambush, Doc honcho, Anchor Link Bot, Rsdetsch, Denisarona,

Justin W Smith, Unbuttered Parsnip, Happynomad, Alexey Muranov, Addbot, Aarond144, Jﬁtzell, Nate Wessel, Yobot, Jalal0, Ian Kelling,

Citation bot, Buenasdiaz, Twri, Kinewma, Miym, Prunesqualer, Mzamora2, JZacharyG, Pmq20, Shadowjams, Hobsonlane, DixonDBot, Reaper Eternal, EmausBot, John of Reading, Wikipelli, Bethnim, Mastergreg82, ClueBot NG, EmanueleMinotto, Warumwarum,

DavidRideout, BG19bot, Andrey.gric, Szebenisz, ChrisGualtieri, Deltahedron, Jw489kent, Jmerm, Morgoth106, SofjaKovalevskaja and

Anonymous: 131

• Graph (mathematics) Source: https://en.wikipedia.org/wiki/Graph_(mathematics)?oldid=668424269 Contributors: The Anome, Manning Bartlett, XJaM, Tomo, Stevertigo, Patrick, Michael Hardy, W~enwiki, Zocky, Wshun, Booyabazooka, Karada, Ahoerstemeier, Den

fjättrade ankan~enwiki, Jiang, Dcoetzee, Dysprosia, Doradus, Zero0000, McKay, BenRG, Robbot, LuckyWizard, Mountain, Altenmann,

9.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES

119

Mayooranathan, Gandalf61, MathMartin, Timrollpickering, Bkell, Tobias Bergemann, Tosha, Giftlite, Dbenbenn, Harp, Tom harrison,

Chinasaur, Jason Quinn, Matt Crypto, Neilc, Erhudy, Knutux, Yath, Joeblakesley, Tomruen, Peter Kwok, Aknorals, Chmod007, Abdull, Corti, PhotoBox, Discospinster, Rich Farmbrough, Andros 1337, Paul August, Zaslav, Gauge, Tompw, Crisóﬁlax, Yitzhak, Kine,

Bobo192, Jpiw~enwiki, Mdd, Jumbuck, Zachlipton, Sswn, Liao, Rgclegg, Paleorthid, Super-Magician, Mahanga, Joriki, Mindmatrix,

Wesley Moy, Oliphaunt, Brentdax, Jwanders, Tbc2, Cbdorsett, Ch'marr, Davidfstr, Xiong, Marudubshinki, Tslocum, Magister Mathematicae, Ilya, SixWingedSeraph, Sjö, Rjwilmsi, Salix alba, Bhadani, FlaBot, Nowhither, Mathbot, Gurch, MikeBorkowski~enwiki,

Chronist~enwiki, Silversmith, Chobot, Peterl, Siddhant, Borgx, Karlscherer3, Hairy Dude, Gene.arboit, Michael Slone, Gaius Cornelius,

Shanel, Gwaihir, Dtrebbien, Dureo, Doetoe, Wknight94, Netrapt, RobertBorgersen, Cjfsyntropy, Burnin1134, SmackBot, Nihonjoe,

Stux, McGeddon, BiT, Algont, Ohnoitsjamie, Chris the speller, Bluebot, TimBentley, Theone256, Cornﬂake pirate, Zven, Anabus, Can't

sleep, clown will eat me, Tamfang, Cybercobra, Jon Awbrey, Kuru, Nat2, Tomhubbard, Dicklyon, Cbuckley, Quaeler, BranStark, Wandrer2, George100, Ylloh, Vaughan Pratt, Repied, CRGreathouse, Citrus538, Jokes Free4Me, Requestion, Myasuda, Danrah, Robertsteadman, Eric Lengyel, Headbomb, Urdutext, AntiVandalBot, Hannes Eder, JAnDbot, MER-C, Dreamster, Struthious Bandersnatch, JNW,

Catgut, David Eppstein, JoergenB, MartinBot, Rettetast, R'n'B, J.delanoy, Hans Dunkelberg, Yecril, Pafcu, Ijdejter, Deor, ABF, Maghnus, TXiKiBoT, Sdrucker, Someguy1221, PaulTanenbaum, Lambyte, Ilia Kr., Jpeeling, Falcon8765, RaseaC, Insanity Incarnate, Zenek.k,

Radagast3, Debamf, Debeolaurus, SieBot, Minder2k, Dawn Bard, Cwkmail, Jon har, SophomoricPedant, Oxymoron83, Henry Delforn

(old), Ddxc, Svick, Phegyi81, Anchor Link Bot, ClueBot, Vacio, Nsk92, JuPitEer, Huynl, JP.Martin-Flatin, Xavexgoem, UKoch, Mitmaro, Editor70, Watchduck, Hans Adler, Suchap, Wikidsp, Muro Bot, 3ICE, Aitias, Versus22, Djk3, Kruusamägi, SoxBot III, XLinkBot,

Marc van Leeuwen, Libcub, WikiDao, Tangi-tamma, Addbot, Gutin, Athenray, Willking1979, Royerloic, West.andrew.g, Tyw7, Zorrobot, LuK3, Luckas-bot, Yobot, TaBOT-zerem, THEN WHO WAS PHONE?, E mraedarab, Tempodivalse, Пика Пика, Ulric1313,

RandomAct, Materialscientist, Twri, Dockﬁsh, Anand jeyahar, Miym, Prunesqualer, Andyman100, VictorPorton, JonDePlume, Shadowjams, A.amitkumar, Kracekumar, Edgars2007, Citation bot 1, DrilBot, Amintora, Pinethicket, Calmer Waters, RobinK, Barras, Tgv8925,

DARTH SIDIOUS 2, Powerthirst123, DRAGON BOOSTER, Mymyhoward16, Kerrick Staley, Ajraddatz, Wgunther, Bethnim, Akutagawa10, White Trillium, Josve05a, D.Lazard, L Kensington, Maschen, Inka 888, Chewings72, ClueBot NG, Wcherowi, MelbourneStar,

Kingmash, O.Koslowski, Joel B. Lewis, Andrewsky00, Timﬂutre, Helpful Pixie Bot, HMSSolent, Grolmusz, Mrjohncummings, Stevetihi,

Канеюку, Void-995, MRG90, Vanischenu, Tman159, Ekren, Lugia2453, Jeﬀ Erickson, CentroBabbage, Nina Cerutti, Chip Wildon

Forster, Yloreander, Manul, JaconaFrere, Monkbot, Hou710, Anon124 and Anonymous: 351

• Graph theory Source: https://en.wikipedia.org/wiki/Graph_theory?oldid=667682086 Contributors: AxelBoldt, Kpjas, LC~enwiki, Robert

Merkel, Zundark, Taw, Jeronimo, BlckKnght, Dze27, Oskar Flordal, Andre Engels, Karl E. V. Palmen, Shd~enwiki, XJaM, JeLuF,

Arvindn, Gianfranco, Matusz, PierreAbbat, Miguel~enwiki, Boleslav Bobcik, FvdP, Camembert, Hirzel, Tomo, Patrick, Chas zzz brown,

Michael Hardy, Wshun, Booyabazooka, Glinos, Meekohi, Jakob Voss, TakuyaMurata, GTBacchus, Grog~enwiki, Pcb21, Dgrant, CesarB,

Looxix~enwiki, Ellywa, Ams80, Ronz, Nanshu, Gyan, Nichtich~enwiki, Mark Foskey, Александър, Poor Yorick, Caramdir~enwiki,

Mxn, Charles Matthews, Berteun, Almi, Hbruhn, Dysprosia, Daniel Quinlan, Gutza, Doradus, Zoicon5, Roachmeister, Populus, Zero0000,

Doctorbozzball, McKay, Shizhao, Optim, Robbot, Brent Gulanowski, Fredrik, Altenmann, Dittaeva, Gandalf61, MathMartin, Sverdrup,

Puckly, KellyCoinGuy, Thesilverbail, Bkell, Paul Murray, Fuelbottle, ElBenevolente, Aknxy, Dina, Tobias Bergemann, Giftlite, Dbenbenn, Thv, The Cave Troll, Elf, Lupin, Brona, Pashute, Duncharris, Andris, Jorge Stolﬁ, Tyir, Sundar, GGordonWorleyIII, Alan Au,

Bact, Knutux, APH, Tomruen, Tyler McHenry, Naerbnic, Peter Kwok, Robin klein, Ratiocinate, Andreas Kaufmann, Chmod007, Madewokherd, Discospinster, Solitude, Guanabot, Qutezuce, Mani1, Paul August, Bender235, Zaslav, Tompw, Diego UFCG~enwiki, Chalst,

Shanes, Renice, C S, Csl77, Jojit fb, Photonique, Jonsafari, Obradovic Goran, Jumbuck, Msh210, Alansohn, Liao, Mailer diablo, Marianocecowski, Aquae, Blair Azzopardi, Oleg Alexandrov, Youngster68, Linas, LOL, Ruud Koot, Tckma, Astrophil, Davidfstr, GregorB,

SCEhardt, Stochata, Xiong, Graham87, Magister Mathematicae, SixWingedSeraph, Rjwilmsi, Gmelli, George Burgess, Eugeneiiim, Arbor, Kalogeropoulos, Fred Bradstadt, FayssalF, FlaBot, PaulHoadley, RexNL, Vonkje, Chobot, Jinma, YurikBot, Wavelength, Michael

Slone, Gaius Cornelius, Alex Bakharev, Morphh, SEWilcoBot, Jaxl, Ino5hiro, Xdenizen, Daniel Mietchen, Shepazu, Rev3nant, Lt-wikibot, Jwissick, Arthur Rubin, LeonardoRob0t, Agro1986, Eric.weigle, Allens, Sardanaphalus, Melchoir, Brick Thrower, Ohnoitsjamie, Oli

Filth, OrangeDog, Taxipom, DHN-bot~enwiki, Tsca.bot, Onorem, GraphTheoryPwns, Lpgeﬀen, Jon Awbrey, Henning Makholm, Mlpkr,

SashatoBot, Whyﬁsh, Disavian, MynameisJayden, Idiosyncratic-bumblebee, Dicklyon, Quaeler, Lanem, Tawkerbot2, Ylloh, Mahlerite,

CRGreathouse, Dycedarg, Requestion, Bumbulski, Myasuda, RUVARD, The Isiah, Ntsimp, Abeg92, Corpx, DumbBOT, Anthonynow12,

Thijs!bot, Jheuristic, King Bee, Pstanton, Hazmat2, Headbomb, Marek69, Eleuther, AntiVandalBot, Whiteknox, Hannes Eder, Spacefarer, Myanw, JAnDbot, MER-C, Igodard, Restname, Sangak, Tmusgrove, Feeeshboy, Usien6, Ldecola, David Eppstein, Kope, DerHexer, Oroso, MartinBot, R'n'B, Uncle Dick, Joespiﬀ, Ignatzmice, Shikhar1986, Tarotcards, Policron, XxjwuxX, Yecril, JohnBlackburne,

Dggreen, Anonymous Dissident, Alcidesfonseca, Anna Lincoln, Ocolon, Magmi, PaulTanenbaum, Geometry guy, Fivelittlemonkeys, Sacredmint, Spitﬁre8520, Radagast3, SieBot, Dawn Bard, Toddst1, Jon har, Bananastalktome, Titanic4000, Beda42, Maxime.Debosschere,

Damien Karras, ClueBot, DFRussia, PipepBot, Justin W Smith, Vacio, Wraithful, Garyzx, Mild Bill Hiccup, DragonBot, Fchristo, Hans

Adler, Dafyddg, Razorﬂame, Rmiesen, Kruusamägi, Pugget, Darkicebot, BodhisattvaBot, Tangi-tamma, Addbot, Dr.S.Ramachandran,

Cerber, DOI bot, Ronhjones, Low-frequency internal, CanadianLinuxUser, Protonk, LaaknorBot, Smoke73, Delaszk, Favonian, Maurobio, Lightbot, Jarble, Ettrig, Luckas-bot, Yobot, Kilom691, Trinitrix, Jean.julius, AnomieBOT, Womiller99, Sonia, Jim1138, Piano non

troppo, Gragragra, RandomAct, Citation bot, Ayda D, Xqbot, Jerome zhu, Capricorn42, Nasnema, Miym, GiveAFishABone, RibotBOT,

Jalpar75, Aaditya 7, Ankitbhatt, FrescoBot, Mark Renier, SlumdogAramis, Citation bot 1, Sibian, Pinethicket, RobinK, Wsu-dm-jb,

D75304, Wsu-f, Xnn, Obankston, Andrea105, RjwilmsiBot, TjBot, Powerthirst123, Aaronzat, EmausBot, Domesticenginerd, EleferenBot, Jmencisom, Slawekb, Akutagawa10, D.Lazard, Netha Hussain, Tolly4bolly, ChuispastonBot, EdoBot, ClueBot NG, Watersmeetfreak, Matthiaspaul, MelbourneStar, Outraged duck, OMurgo, Bazuz, Aks1521, Masssly, Joel B. Lewis, Johnsopc, HMSSolent, 4368a,

BG19bot, Ajweinstein, Канеюку, MusikAnimal, AvocatoBot, Bereziny, Brad7777, Soﬁa karampataki, ChrisGualtieri, GoShow, Dexbot,

Cerabot~enwiki, Omgigotanaccount, Wikiisgreat123, Faizan, Maxwell bernard, Bg9989, Zsoftua, SakeUPenn, Yloreander, StaticElectricity, Gold4444, Cyborgbadger, Zachwaltman, Gr pbi, KasparBot and Anonymous: 378

• Loop (graph theory) Source: https://en.wikipedia.org/wiki/Loop_(graph_theory)?oldid=640385005 Contributors: Booyabazooka, McKay,

MathMartin, Paul August, Cburnett, Oliphaunt, Dmharvey, Dtrebbien, Gadget850, SmackBot, BiT, Tsca.bot, Lambiam, 16@r, CmdrObot, Letranova, Kylemahan, David Eppstein, Rovnet, ClueBot, JP.Martin-Flatin, Addbot, Twri, Asfarer, FrescoBot, Ricardo Ferreira

de Oliveira, Sinuhe20, MerlIwBot, Ibraheemmoosa and Anonymous: 6

• Mathematics Source: https://en.wikipedia.org/wiki/Mathematics?oldid=667759115 Contributors: AxelBoldt, Magnus Manske, LC~enwiki,

Brion VIBBER, Eloquence, Mav, Bryan Derksen, Zundark, The Anome, Tarquin, Koyaanis Qatsi, Ap, Gareth Owen, -- April, RK,

Iwnbap, LA2, Youssefsan, XJaM, Arvindn, Christian List, Matusz, Toby Bartels, PierreAbbat, Little guru, Miguel~enwiki, Rade Kutil,

DavidLevinson, FvdP, Daniel C. Boyer, David spector, Camembert, Netesq, Zippy, Olivier, Ram-Man, Stevertigo, Spiﬀ~enwiki, Edward,

Quintessent, Ghyll~enwiki, D, Chas zzz brown, JohnOwens, Michael Hardy, Booyabazooka, JakeVortex, Lexor, Isomorphic, Dominus,

120

CHAPTER 9. VERTEX (GRAPH THEORY)

Nixdorf, Grizzly, Kku, Mic, Ixfd64, Firebirth, Alireza Hashemi, Dcljr, Sannse, TakuyaMurata, Karada, Minesweeper, Alﬁo, Tregoweth,

Dgrant, CesarB, Ahoerstemeier, Cyp, Ronz, Muriel Gottrop~enwiki, Snoyes, Notheruser, Angela, Den fjättrade ankan~enwiki, Kingturtle, LittleDan, Kevin Baas, Salsa Shark, Glenn, Jschwa1, Bogdangiusca, BenKovitz, Poor Yorick, Rossami, Tim Retout, Rotem Dan,

Evercat, Rl, Jonik, Madir, Mxn, Smack, Silverﬁsh, Vargenau, Pizza Puzzle, Nikola Smolenski, Charles Matthews, Guaka, Timwi, Spacemonkey~enwiki, Nohat, Ralesk, MarcusVox, Dysprosia, Jitse Niesen, Fuzheado, Gutza, Piolinfax, Selket, DJ Clayworth, Markhurd, Vancouverguy, Tpbradbury, Maximus Rex, Hyacinth, Saltine, AndrewKepert, Fibonacci, Zero0000, Phys, Ed g2s, Wakka, Samsara, Bevo,

McKay, Traroth, Fvw, Babaloulou, Secretlondon, Jusjih, Cvaneg, Flockmeal, Guppy, Francs2000, Dmytro, Lumos3, Jni, PuzzletChung,

Donarreiskoﬀer, Robbot, Fredrik, RedWolf, Peak, Romanm, Lowellian, Gandalf61, Georg Muntingh, Merovingian, HeadCase, Sverdrup,

Henrygb, Academic Challenger, IIR, Thesilverbail, Hadal, Mark Krueger, Wereon, Robinh, Borislav, GarnetRChaney, Ilya (usurped),

Michael Snow, Fuelbottle, ElBenevolente, Lupo, PrimeFan, Zhymkus~enwiki, Dmn, Cutler, Dina, Mlk, Alan Liefting, Rock69~enwiki,

Cedars, Ancheta Wis, Fabiform, Centrx, Giftlite, Dbenbenn, Christopher Parham, Fennec, Markus Krötzsch, Mikez, Inter, Wolfkeeper,

Ævar Arnfjörð Bjarmason, Netoholic, Lethe, Tom harrison, Lupin, MathKnight, Bﬁnn, Ayman, Everyking, No Guru, Curps, Jorend, Ssd,

Niteowlneils, Gareth Wyn, Andris, Guanaco, Sundar, Daniel Brockman, Siroxo, Node ue, Eequor, Arne List, Matt Crypto, Python eggs,

Avala, Jackol, Marlonbraga, Bobblewik, Deus Ex, Golbez, Gubbubu, Kennethduncan, Cap601, Geoﬀspear, Utcursch, Andycjp, CryptoDerk, LucasVB, Quadell, Frogjim~enwiki, Antandrus, BozMo, Rajasekaran Deepak, Beland, WhiteDragon, Bcameron54, Kaldari,

PDH, Profvk, Jossi, Alexturse, Adamsan, CSTAR, Rdsmith4, APH, John Foley, Elektron, Pethan, Mysidia, Pmanderson, Elroch, Sam

Hocevar, Arcturus, Gscshoyru, Stephen j omalley, Jcw69, Ukexpat, Eduardoporcher, Qef, Random account 47, Zondor, Adashiel, Trevor

MacInnis, Grunt, Kate, Bluemask, PhotoBox, Mike Rosoft, Vesta~enwiki, Shahab, Oskar Sigvardsson, Brianjd, D6, CALR, DanielCD,

Olga Raskolnikova, EugeneZelenko, Discospinster, Rich Farmbrough, Guanabot, FiP, Clawed, Inkypaws, Spundun, Andrewferrier, ArnoldReinhold, HeikoEvermann, Smyth, Notinasnaid, AlanBarrett, Paul August, MarkS, DcoetzeeBot~enwiki, Bender235, ESkog, Geoking66, Ben Standeven, Tompw, GabrielAPetrie, RJHall, MisterSheik, Mr. Billion, El C, Chalst, Shanes, Haxwell, Briséis~enwiki, Art

LaPella, RoyBoy, Lyght, Jpgordon, JRM, Porton, Bobo192, Ntmatter, Fir0002, Mike Schwartz, Wood Thrush, Func, Teorth, Flxmghvgvk,

Archfalhwyl, Jung dalglish, Maurreen, Man vyi, Alphax, Rje, Sam Korn, Krellis, Sean Kelly, Jonathunder, Mdd, Tsirel, Passw0rd,

Lawpjc, Vesal, Storm Rider, Stephen G. Brown, Danski14, Msh210, Poweroid, Alansohn, Gary, JYolkowski, Anthony Appleyard,

Blackmail~enwiki, Mo0, Polarscribe, ChristopherWillis, Lordthees, Rgclegg, Jet57, Muﬃn~enwiki, Mmmready, Riana, AzaToth, Lectonar, Lightdarkness, Giant toaster, Cjnm, Mysdaao, Hu, Malo, Avenue, Blobglob, LavosBacons, Schapel, Orionix, BanyanTree, Saga

City, Knowledge Seeker, ReyBrujo, Danhash, Garzo, Huerlisi, Jon Cates, RainbowOfLight, CloudNine, TenOfAllTrades, Mcmillin24,

Bsadowski1, Itsmine, Blaxthos, HenryLi, Bookandcoﬀee, Kz8, Oleg Alexandrov, Ashujo, Stemonitis, Novacatz, Angr, DealPete, Kelly

Martin, Wikiworkerindividual***, TSP, OwenX, Woohookitty, Linas, Masterjamie, Yansa, Brunnock, Carcharoth, BillC, Ruud Koot,

WadeSimMiser, Orz, Hdante, MONGO, Mpatel, Abhilaa, Al E., Wikiklrsc, Bbatsell, Damicatz, Terence, MFH, Sengkang, Zzyzx11,

Noetica,

, Xiong Chiamiov, Gimboid13, Liface, Asdfdsa, PeregrineAY, Thirty-seven, Graham87, Magister Mathematicae,

BD2412, Chun-hian, FreplySpang, JIP, Island, Zoz, Icey, BorgHunter, Josh Parris, Paul13~enwiki, Rjwilmsi, Mayumashu, MJSkia1, Prateekrr, Vary, MarSch, Amire80, Tangotango, Staecker, Omnieiunium, Salix alba, Tawker, Zhurovai, Crazynas, Ligulem, Juan Marquez,

Slac, R.e.b., The wub, Sango123, Yamamoto Ichiro, Kasparov, Staples, Titoxd, Pruneau, RobertG, Latka, Mathbot, Harmil, Narxysus,

Andy85719, RexNL, Gurch, Short Verses, Quuxplusone, Celendin, Ichudov, Jagginess, Alphachimp, Malhonen, David H Braun (1964),

Snailwalker, Mongreilf, Chobot, Jersey Devil, DONZOR, DVdm, Cactus.man, John-Haggerty, Gwernol, Elfguy, Buggi22, Roboto de

Ajvol, Raelx, JPD, YurikBot, Wavelength, Karlscherer3, Jeremybub, Doug Alford, Grifter84, RobotE, Elapsed, Dmharvey, Gmackematix, 4C~enwiki, RussBot, Michael Slone, Geologician, Red Slash, Jtkiefer, Muchness, Anonymous editor, Albert Einsteins pipe,

Nobs01, Soltras, Bhny, Piet Delport, CanadianCaesar, Polyvios, Akamad, Stephenb, Yakuzai, Sacre, Bovineone, Tungsten, Ugur Basak,

David R. Ingham, NawlinWiki, Vanished user kjdioejh329io3rksdkj, Rick Norwood, Misos, SEWilcoBot, Wiki alf, Mipadi, Armindo,

Deskana, Johann Wolfgang, Trovatore, Joel7687, GrumpyTroll, LMSchmitt, Schlaﬂy, Eighty~enwiki, Herve661, JocK, Mccready, Tearlach, Apokryltaros, JDoorjam, Abb3w, Misza13, My Cat inn, Vikvik, Mvsmith, Brucevdk, DryaUnda, SFC9394, Font, Tachyon01,

Mgnbar, Jemebius, Nlu, Mike92591, Dna-webmaster, Tonywalton, Joshurtree, Wknight94, Pooryorick~enwiki, Avraham, Mkns, Googl,

Noosfractal, SimonMorgan, Tigershrike, FF2010, Cursive, Scheinwerfermann, Enormousdude, TheKoG, Donald Albury, Zsynopsis,

Skullﬁssion, Claygate, MaNeMeBasat, GraemeL, JoanneB, Bentong Isles, Donhalcon, JLaTondre, Jaranda, Spliﬀy, Flowersofnight, 158152-12-77, RunOrDie, Kungfuadam, Canadianism, Ben D., Greatal386, JDspeeder1, Saboteur~enwiki, Asterion, Shmm70, Pentasyllabic, Lunch, DVD R W, Finell, Capitalist, Sardanaphalus, Crystallina, JJL, SmackBot, RDBury, YellowMonkey, Selfworm, Smitz, Bobet,

Diggyba, Warhawkhalo101, Estoy Aquí, Reedy, Tarret, KnowledgeOfSelf, Royalguard11, Melchoir, McGeddon, Pavlovič, Masparasol,

Pgk, C.Fred, AndyZ, Kilo-Lima, Jagged 85, PizzaMargherita, CapitalSasha, Antibubbles, AnOddName, Canthusus, BiT, Nscheﬀey,

Amystreet, Ekilfeather, Papep, Jaichander, Ohnoitsjamie, Hmains, Skizzik, Richﬁfe, ERcheck, Hopper5, Squiddy, Armeria, Durova,

Qtoktok, Wigren, Keegan, Woofboy, Rmt2m, Fplay, Christopher denman, Miquonranger03, MalafayaBot, Silly rabbit, Alink, Dlohcierekim’s sock, Richard Woods, Kungming2, Go for it!, Baa, Rdt~enwiki, Spellchecker, Baronnet, Colonies Chris, Ulises Sarry~enwiki,

Nevada, Zachorious, Chendy, J•A•K, Can't sleep, clown will eat me, RyanEberhart, Timothy Clemans, Милан Јелисавчић, TheGerm,

HoodedMan, Chlewbot, Vanished User 0001, Joshua Boniface, TheKMan, Rrburke, Addshore, Mr.Z-man, SundarBot, AndySimpson,

Emre D., Iapetus, Jwy, CraigDesjardins, Daqu, Nakon, VegaDark, Jiddisch~enwiki, Maxwahrhaftig, Salt Yeung, Danielkwalsh, Diocles,

Pg2114, Jon Awbrey, Ruwanraj, Jklin, Xen 1986, Just plain Bill, Knuckles sonic8, Where, Bart v M, ScWizard, Pilotguy, Nov ialiste,

JoeTrumpet, Math hater, Lambiam, Nishkid64, TachyonP, ArglebargleIV, Doug Bell, Harryboyles, Srikeit, Dbtfz, Kuru, JackLumber, Simonkoldyk, Vgy7ujm, Nat2, Cronholm144, Heimstern, Gobonobo, Mﬁshergt, Coastergeekperson04, Sir Nicholas de Mimsy-Porpington,

Dumelow, Jazriel, Gnevin, Unterdenlinden, Ckatz, Loadmaster, Special-T, Dozing, Mr Stephen, Mudcower, AxG, Optakeover, SandyGeorgia, Mets501, Funnybunny, Markjdb, Ryulong, Gﬀ~enwiki, RichardF, Limaner, Jose77, Asyndeton, Stephen B Streater, Politepunk,

DabMachine, Levineps, Hetar, BranStark, Roland Deschain, Kevlar992, Iridescent, K, Kencf0618, Zootsuits, Onestone, Nilamdoc, C.

Lee, CzarB, Polymerbringer, Joseph Solis in Australia, Newone, White wolf753, Muéro, David Little, Igoldste, Amakuru, Marysunshine,

Maelor, Masshaj, Jatrius, Experiment123, Tawkerbot2, Daniel5127, Joshuagross, Emote, Pikminiman, Heyheyhey99, JForget, Smkumar0, Sakowski, Wolfdog, Sleeping123, CRGreathouse, Wafulz, Sir Vicious, Triage, Iced Kola, CBM, Page Up, Jester-Tester, Taylorhewitt, Nczempin, GHe, Green caterpillar, Phanu9000, Yarnalgo, Thomasmeeks, McVities, Requestion, FlyingToaster, MarsRover,

Tac-Tics, Some P. Erson, Tim1988, Tuluat, Alaymehta, MrFish, Oo7565, Gregbard, Captmog, El3m3nt09, Antiwiki~enwiki, Cydebot,

Meznaric, Cantras, Funwithbig, MC10, Meno25, Gogo Dodo, DVokes, ST47, Srinath555, Pascal.Tesson, Goldencako, Benjiboi, Andrewm1986, Michael C Price, Tawkerbot4, Dragomiloﬀ, Juansempere, M a s, Chrislk02, Brotown3, Mamounjo, 5300abc, Roccorossi,

Abtract, Daven200520, Omicronpersei8, Vanished User jdksfajlasd, Daniel Olsen, Ventifact, TAU710, Aditya Kabir, BetacommandBot,

Thijs!bot, Epbr123, Bezking, Jpark3591, Daemen, TheEmaciatedStilson, MCrawford, Opabinia regalis, Mattyboy500, Kilva, Daniel,

Loudsox, Ucanlookitup, Hazmat2, Wootwootwoot, Brian G. Wilson, Timo3, Mojo Hand, Djfeldman, Pjvpjv, West Brom 4ever, John254,

Alientraveller, Mnemeson, Ollyrobotham, BadKarma14, Sethdoe92, Dfrg.msc, RobHar, CharlotteWebb, Dawnseeker2000, RoboServien,

Escarbot, Itsfrankie1221, Thomaswgc, Thadius856, Sidasta, AntiVandalBot, Ais523, RobotG, Gioto, Luna Santin, Dark Load, DarkAu-

9.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES

121

dit, Ringleader1489, Dylan Lake, Doktor Who, Chill doubt, AxiomShell, Abc30, Matheor, Archmagusrm, Falconleaf, Labongo, Spacefarer, Chocolatepizza, JAnDbot, Kaobear, MyNamesLogan, MER-C, The Transhumanist, Db099221, AussieOzborn au, Thenub314,

Mosesroses, Hut 8.5, Kipholbeck, Xact, Twospoonfuls, .anacondabot, Yahel Guhan, Bencherlite, Yurei-eggtart, Bongwarrior, VoABot

II, JamesBWatson, Swpb, EdwardLockhart, SineWave, Charlielee111, Cic, Ryeterrell, Caesarjbsquitti, Wikiwhat?, Bubba hotep, KConWiki, Meb43, Faustnh, Hiplibrarianship, Johnbibby, Seberle, MetsBot, Pawl Kennedy, 28421u2232nfenfcenc, Systemlover, Bmeguru,

Hotmedal, Just James, EstebanF, Glen, Rajpaj, Memorymentor, TheRanger, Calltech, Gun Powder Ma, Welshleprechaun, Robin S,

Seba5618, SquidSK, 0612, J0equ1nn, Riccardobot, Jtir, Hdt83, MartinBot, Vladimir m, Arjun01, Quanticle, Nocklas, Rettetast, Fuzzyhair2, R'n'B, Pbroks13, Cmurphy au, Snozzer, Ben2then, PrestonH, Crazybobson, Thefutureschannel, RockMFR, Hrishikesh.24889,

J.delanoy, Nev1, Unlockitall, Phoenix1177, Numbo3, Sp3000, Maurice Carbonaro, Nigholith, Hellonicole, -jmac-, Boris Allen, 2boobies, Jerry, TheSeven, NerdyNSK, Syphertext, Yadar677, Taop, G. Campbell, Wayp123, Keesiewonder, Matt1314, Ksucemfof, Gzkn,

Ivelnaps, Smeira, DarkFalls, Thomas Larsen, Vishi-vie, Washington8785, Xyzaxis, Arkuski, JDQuimby, Batmanfan77, Alphapeta, Trd89,

HiLo48, The Transhumanist (AWB), NewEnglandYankee, RANDP, MKoltnow, MhordeXsnipa, Milogardner, Nacrha, Balaam42, Mviergujerghs89fhsdifds, Cfrehr, Elvisfan2095, Tiyoringo, Juliancolton, Cometstyles, DavidCBryant, SlightlyMad, Jamesontai, Remember

the dot, Ilya Voyager, Huzefahamid, Dandy mandy, Andreas2001, Ishap, Sarregouset, CANUTELOOL2, CANUTELOOL3, Devonboy69, Jeyarathan, Death blaze, Emo kid you?, Thedudester, Samlyn.josfyn, Mother69, Vinsfan368, Cartiod, Helldude99, Sternkampf,

Steel1943, CardinalDan, RJASE1, Idioma-bot, Remi0o, Lights, Tamillimat, Bandaidboy, C.lettingaAV, VolkovBot, Somebodyreallycool, Pleasantville, Jeﬀ G., JohnBlackburne, Hhjk, The Catcher in The Rye D:, Alexandria, AlnoktaBOT, Dboerstl, NikolaiLobachevsky,

Bangvang, 62 (number), Tseay11, Soliloquial, Headforaheadeyeforaneye, Barneca, Sześćsetsześćdziesiątsześć, Zeuron, Yoyoyo9, Trehansiddharth, TXiKiBoT, Katoa, Jacob Lundberg, Candy-Panda, Chickenclucker, Antoni Barau, Walor, Anonymous Dissident, Qxz, Nukemason4, Retiono Virginian, Ocolon, Savagepine, DennyColt, Digby Tantrum, JhsBot, Leafyplant, Beanai, 20em89.01, Cremepuﬀ222,

Geometry guy, Canyonsupreme, Natural Philosopher, Teller33, Mathsmad, Unknown 987, Tarten5, Nickmuller, Robomonster, Wolfrock,

Jacob501, Kreemy, Synthebot, Tomaxer, Careercornerstone, Enviroboy, Rurik3, Sardonicone, Evanbrown326, Alliashax, Sylent, Rubentimothy, SMIE SMIE, Gamahucher, Braindamage3, Animalalley12895, Moohahaha, Thanatos666, Dillydumdum, AlleborgoBot, Voicework, Symane, Katzmik, Monkeynuts27, Demmy, Cam275, GoonerDP, SieBot, Mikemoral, James Banogon, BotMultichill, Timgregg96,

Triwbe, 5150pacer, Soler97, Andersmusician, Anubhav29, Keilana, Tiptoety, Arbor to SJ, Undead Herle King, Richardcraig, Paolo.dL,

Boogster, Oxymoron83, Henry Delforn (old), Avnjay, MiNombreDeGuerra, RW Marloe, SH84, Deejaye6, Musse-kloge, Jorgen W,

Kumioko, Correogsk, MadmanBot, Nomoneynotime, Nickm4c, Darkmyst932, Anchor Link Bot, Jacob.jose, Randomblue, Melcombe,

CaptainIron555, Yhkhoo, Dabomb87, Jat99, Pinkadelica, Francvs, Athenean, Ooswesthoesbes, ClueBot, Volcom5347, Gladysamuel,

GPdB, Bwfrank, DFRussia, PipepBot, Foxj, Dobermanji, C1932, Remus John Lupin, Chocoforfriends, Smithpith, ArdClose, IceUnshattered, Plastikspork, Lawrence Cohen, Gawaxay, Nnemo, Ukabia, Michael.Urban, Niceguyedc, Xenon54, Mspraveen, DragonBot,

Isaac25, 4pario, Donkeyboya, Excirial, CBOrgatrope, Bedsandbellies, Soccermaster3112, Alexbot, TonyBallioni, Pjb14, 0na01der, Andy

pyro, Wikibobspider, BrentLeah, Eeekster, Anonymous1324354657687980897867564534231, Mycatiscool, Greenjuice, Chance Jeong,

Arunta007, Greenjuice3.0, Greenjuice4, AnimeFan7, MacedonianBoy, ZuluPapa5, NuclearWarfare, JoelDick, Honeyspots3121, Blondeychck7, Faty148, Jotterbot, RC-0722, Wulfric1, Thingg, Franklin.vp, Aitias, DerBorg, Versus22, Hwalee76, SoxBot III, Apparition11,

Mofeed.sawan, Slayerteez, XLinkBot, Marc van Leeuwen, Moocow444, Joejill67~enwiki, Little Mountain 5, Drumbeatsofeden, SilvonenBot, Planb 89, Alexius08, Vianello, MystBot, Zodon, RyanCross, Aetherealize, Zoltan808, T.M.M. Dowd, Aceleo, Jetsboy101, Willking1979, Mattguzy, 3Nigma, DOI bot, Cdt laurence, Fgnievinski, Yobmod, Aaronthegr8, CanadianLinuxUser, Potatoscrub, Download,

Protonk, Chamal N, CarsracBot, Favonian, LinkFA-Bot, ViskonBot, Barak Sh, Aldermalhir, Jubeidono, PRL42, Lightbot, Ann Logsdon,

Floccinocin123, Matěj Grabovský, Fivexthethird, TeH nOmInAtOr, Jarble, Herve1729, Sitehut, Ptbotgourou, Senator Palpatine, TaBOTzerem, Legobot II, Kan8eDie, Nirvana888, Gugtup, Washburnmav, Mikeedla, THEN WHO WAS PHONE?, Skyeliam, MeatJustice,

Wierdox, AnomieBOT, Nastor, ThaddeusB, Connectonline, Taskualads, Themantheman, Galoubet, Neko85, Noahschultz, JackieBot,

Commander Shepard, Chingchangriceball, Piano non troppo, Supersmashballs123, Agroose, Pm11189, Riekuh, Hamletö, Deverenn,

Frank2710, Chief Heath, Easton12, Codycash33, Archaeopteryx, Citation bot, Merlissimo, ArthurBot, Tatarian, MauritsBot, Xqbot,

TinucherianBot II, Sketchmoose, Timir2, Capricorn42, Johnferrer, Jmundo, Locos epraix, Br77rino, Isheden, Inferno, Lord of Penguins,

Uarrin, LevenBoy, Quixotex, GrouchoBot, Resident Mario, ProtectionTaggingBot, Omnipaedista, Point-set topologist, Gott wisst, RibotBOT, Charvest, KrazyKosbyKidz, MarilynCP, Gingerninja12, Caleb7693, Deathiscomin90919, VictorPorton, Grg222, Daryl7569,

Petes2176, GhalyBot, ThibautLienart, Prozo3190, Family400005, Bupsiij, Aaron Kauppi, Har56, Dr. Klim, Velblod, CES1596, GliderMaven, Thomascjackson, FrescoBot, RTFVerterra, Triwikanto, Tobby72, Mark Renier, Oneﬁve15, VS6507, Alpboyraz, ParaDoxus,

Sławomir Biały, Xefer, Zhentmdfan, Tzurvah MeRabannan, Citation bot 1, Amplitude101, Tkuvho, Rotje66, Kiefer.Wolfowitz, AwesomeHersh, ElNuevoEinstein, Gamewizard71, FoxBot, TobeBot, DixonDBot, Burritoburritoburrito, Lotje, Dinamik-bot, Raiden09, Mrjames99, DJTrickyM, Stephen MUFC, Tbhotch, RjwilmsiBot, TjBot, Ripchip Bot, Galois fu, Alphanumeric Sheep Pig, BertSeghers,

Mr magnolias, DarkLightA, LibertyDodzo, EmausBot, PrisonerOfIce, Nima1024, WikitanvirBot, Surlyduﬀ50, AThornyKoanz, Mehdiirfani, Legajoe, Wham Bam Rock II, Bethnim, ZéroBot, John Cline, Josve05a, Leaﬁest of Futures, Battoe19, Anmol9999, Scythia,

Brandmeister, Vanished user ﬁjtji34toksdcknqrjn54yoimascj, Ain92, Agatecat2700, Herk1955, Teapeat, Mjbmrbot, Liuthar, ClueBot

NG, Incompetence, Wcherowi, Movses-bot, Kindyin, LJosil, SilentResident, Braincricket, Rbellini, Zackaback, MillingMachine, Helpful

Pixie Bot, Thisthat2011, Curb Chain, AnandVivekSatpathi, Nashhinton, EmilyREditor, Ariel C.M.K., Fraqtive42, AvocatoBot, Davidiad,

Ropestring, Edward Gordon Gey, EliteforceMMA, Karthickraj007, VirusKA, MYustin, Brad7777, Idresjafary, Nbrothers, IkamusumeFan, Kavy32, Sklange, Blevintron, BlevintronBot, Sulphuric Glue, Dexbot, Rezonansowy, Mudcap, Augustus Leonhardus Cartesius,

Pankaj Jyoti Mahanta, Ybidzian, TycoonSaad, Jarash, Chern038, FireﬂySixtySeven, Kind Tennis Fan, Justin86789, 12visakhva, Dodi

8238, Rcehy, Vanisheduser00348374562342, 115ash, AdditionSubtraction, Mario Castelán Castro, Arvind asia, Rctillinghast, KasparBot, Kaﬁshabbir and Anonymous: 1222

• Matrix (mathematics) Source: https://en.wikipedia.org/wiki/Matrix_(mathematics)?oldid=667651227 Contributors: AxelBoldt, Tarquin, Tbackstr, Hajhouse, XJaM, Ramin Nakisa, Stevertigo, Patrick, Michael Hardy, Wshun, Cole Kitchen, SGBailey, Chinju, Zeno

Gantner, Dcljr, Ejrh, Looxix~enwiki, Muriel Gottrop~enwiki, Angela, Александър, Poor Yorick, Rmilson, Andres, Schneelocke, Charles

Matthews, Dysprosia, Jitse Niesen, Lou Sander, Dtgm, Bevo, Francs2000, Robbot, Mazin07, Sander123, Chrism, Fredrik, R3m0t, Gandalf61, MathMartin, Sverdrup, Rasmus Faber, Bkell, Paul Murray, Neckro, Tobias Bergemann, Tosha, Giftlite, Jao, Arved, BenFrantzDale, Netoholic, Dissident, Dratman, Michael Devore, Waltpohl, Duncharris, Macrakis, Utcursch, Alexf, MarkSweep, Profvk, Wiml,

Urhixidur, Sam nead, Azuredu, Barnaby dawson, Porges, PhotoBox, Shahab, Rich Farmbrough, FiP, ArnoldReinhold, Pavel Vozenilek,

Paul August, ZeroOne, El C, Rgdboer, JRM, NetBot, The strategy freak, La goutte de pluie, Obradovic Goran, Mdd, Tsirel, LutzL,

Landroni, Jumbuck, Jigen III, Alansohn, ABCD, Fritzpoll, Wanderingstan, Mlm42, Jheald, Simone, RJFJR, Dirac1933, AN(Ger),

Adrian.benko, Oleg Alexandrov, Nessalc, Woohookitty, Igny, LOL, Webdinger, David Haslam, UbiquitousUK, Username314, Tabletop, Waldir, Prashanthns, Mandarax, SixWingedSeraph, Grammarbot, Porcher, Sjakkalle, Koavf, Joti~enwiki, Watcharakorn, SchuminWeb, Old Moonraker, RexNL, Jrtayloriv, Krun, Fresheneesz, Srleﬄer, Vonkje, Masnevets, NevilleDNZ, Chobot, Krishnavedala, Karch,

122

CHAPTER 9. VERTEX (GRAPH THEORY)

DVdm, Bgwhite, YurikBot, Wavelength, Borgx, RussBot, Michael Slone, Bhny, NawlinWiki, Rick Norwood, Jfheche, 48v, Bayle Shanks,

Jimmyre, Misza13, Samuel Huang, Merosonox, DeadEyeArrow, Bota47, Glich, Szhaider, Jezzabr, Leptictidium, Mythobeast, Spondoolicks, Alasdair, Lunch, Sardanaphalus, SmackBot, RDBury, CyclePat, KocjoBot~enwiki, Jagged 85, GoonerW, Minglai, Scott Paeth,

Gilliam, Skizzik, Saros136, Chris the speller, Optikos, Bduke, Silly rabbit, DHN-bot~enwiki, Darth Panda, Foxjwill, Can't sleep, clown

will eat me, Smallbones, KaiserbBot, Rrburke, Mhym, SundarBot, Jon Awbrey, Tesseran, Aghitza, The undertow, Lambiam, Wvbailey, Attys, Nat2, Cronholm144, Terry Bollinger, Nijdam, Aleenf1, Jacobdyer, WhiteHatLurker, Beetstra, Kaarebrandt, Mets501, Neddyseagoon, Dr.K., P199, MTSbot~enwiki, Quaeler, Rschwieb, Levineps, JMK, Tawkerbot2, Dlohcierekim, DKqwerty, AbsolutDan,

Propower, CRGreathouse, JohnCD, INVERTED, SelfStudyBuddy, HalJor, MC10, Pascal.Tesson, Bkgoodman, Alucard (Dr.), Juansempere, Codetiger, Bellayet, הסרפד, Epbr123, Paragon12321, Markus Pössel, Aeriform, Gamer007, Headbomb, Marek69, RobHar, Urdutext, AntiVandalBot, Lself, Jj137, Hermel, Oatmealcookiemon, JAnDbot, Fullverse, MER-C, Yanngeﬀrotin~enwiki, Bennybp, VoABot

II, Fusionmix, T@nn, JNW, Jakob.scholbach, Rivertorch, EagleFan, JJ Harrison, Sullivan.t.j, David Eppstein, User A1, ANONYMOUS

COWARD0xC0DE, JoergenB, Philg88, Nevit, Hbent, Gjd001, Doccolinni, Yodalee327, R'n'B, Alfred Legrand, J.delanoy, Rlsheehan, Maurice Carbonaro, Richard777, Wayp123, Toghrul Talibzadeh, Aqwis, It Is Me Here, Cole the ninja, TomyDuby, Peskydan,

AntiSpamBot, JonMcLoone, Policron, Doug4, Fylwind, Kevinecahill, Ben R. Thomas, CardinalDan, OktayD, Egghead06, X!, Malik

Shabazz, UnicornTapestry, Shiggity, VolkovBot, Dark123, JohnBlackburne, LokiClock, VasilievVV, DoorsAjar, TXiKiBoT, Hlevkin,

Rei-bot, Anonymous Dissident, D23042304, PaulTanenbaum, LeaveSleaves, BigDunc, Wolfrock, Wdrev, Brianga, Dmcq, KjellG, AlleborgoBot, Symane, Anoko moonlight, W4chris, Typoﬁer, Neparis, T-9000, D. Recorder, ChrisMiddleton, GirasoleDE, Dogah, SieBot,

Ivan Štambuk, Bachcell, Gerakibot, Cwkmail, Yintan, Radon210, Elcobbola, Paolo.dL, Oxymoron83, Ddxc, Oculi, Manway, AlanUS,

Anchor Link Bot, Rinconsoleao, Denisarona, Canglesea, Myrvin, DEMcAdams, ClueBot, Sural, Wpoely86, Remag Kee, SuperHamster, LizardJr8, Masterpiece2000, Excirial, Da rulz07, Bender2k14, Ftbhrygvn, Muhandes, Brews ohare, Tyler, Livius3, Jotterbot, Hans

Adler, Manco Capac, MiraiWarren, Qwfp, Johnuniq, TimothyRias, Lakeworks, XLinkBot, Marc van Leeuwen, Rror, AndreNatas, Jaan

Vajakas, Porphyro, Stephen Poppitt, Addbot, Proofreader77, Deepmath, RPHv, Steve.jaramillov~enwiki, WardenWalk, Jccwiki, CactusWriter, Mohamed Magdy, MrOllie, Tide rolls, Gail, Jarble, CountryBot, LuK3, Luckas-bot, Yobot, Senator Palpatine, QueenCake,

TestEditBot, AnomieBOT, Autarkaw, Gazzawi, IDangerMouse, MattTait, Kingpin13, Materialscientist, Citation bot, Wrelwser43, LilHelpa, FactSpewer, Xqbot, Capricorn42, Drilnoth, HHahn, El Caro, BrainFRZ, J04n, Nickmn, RibotBOT, Cerniagigante, Smallman12q,

WaysToEscape, Much noise, LucienBOT, Tobby72, VS6507, Recognizance, Sławomir Biały, Izzedine, IT2000, HJ Mitchell, Sae1962,

Jamesooders, Cafreen, Citation bot 1, Swordsmankirby, I dream of horses, Kiefer.Wolfowitz, MarcelB612, NoFlyingCars, RedBot,

RobinK, Kallikanzarid, Jordgette, ItsZippy, Vairoj, SeoMac, MathInclined, The last username left was taken, Birat lamichhane, Katovatzschyn, Soupjvc, Sfbaldbear, Salvio giuliano, Mandolinface, EmausBot, Lkh2099, Nurath224, DesmondSteppe, RIS cody, Slawekb,

Quondum, Chocochipmuﬃn, U+003F, Rcorcs, තඹරු විජේසේකර, Maschen, Babababoshka, Adjointh, Donner60, Puﬃn, JFB80,

Anita5192, Petrb, ClueBot NG, Wcherowi, Michael P. Barnett, Rtucker913, Satellizer, Rank Penguin, Tyrantbrian, Dsperlich, Helpful

Pixie Bot, Rxnt, Christian Matt, MarcoPotok, BG19bot, Wiki13, Muscularmussel, MusikAnimal, Brad7777, René Vápeník, Soﬁa karampataki, BattyBot, Freesodas, IkamusumeFan, Lucaspentzlp, OwenGage, APerson, Dexbot, Mark L MacDonald, Numbermaniac, Frosty,

JustAMuggle, Reatlas, Acetotyce, Debouch, Wamiq, Ugog Nizdast, Zenibus, SwimmerOfAwesome, Jianhui67, OrthogonalFrog, Airwoz, Derpghvdyj, Mezafo, CarnivorousBunny, Xxhihi, Sordin, Username89911998, Gronk Oz, Hidrolandense, Kellywacko, JArnold99,

Kavya l and Anonymous: 624

• Vertex (graph theory) Source: https://en.wikipedia.org/wiki/Vertex_(graph_theory)?oldid=628902495 Contributors: XJaM, Altenmann,

MathMartin, Giftlite, Dbenbenn, Purestorm, Rich Farmbrough, Cburnett, RussBot, Anomalocaris, InverseHypercube, BiT, Chetvorno,

Ylloh, Univer, Escarbot, David Eppstein, JaGa, Mange01, Hans Dunkelberg, Geekdiva, VolkovBot, TXiKiBoT, Synthebot, Anoko moonlight, SieBot, Hxhbot, Kl4m, JP.Martin-Flatin, Niemeyerstein en, Albambot, DOI bot, Zorrobot, Luckas-bot, TaBOT-zerem, KamikazeBot, Ciphers, Citation bot, Twri, Xqbot, Miym, Amaury, Phil1881, Trappist the monk, WillNess, DARTH SIDIOUS 2, TjBot, Orphan

Wiki, ZéroBot, ClueBot NG, Gchrupala, Maxwell bernard, SamX and Anonymous: 16

9.5.2

Images

• File:1u04-argonaute.png Source: https://upload.wikimedia.org/wikipedia/commons/0/02/1u04-argonaute.png License: CC-BY-SA3.0 Contributors: Self created from PDB entry 1U04 using the freely available visualization and analysis package VMD raytraced with

POV-Ray 3.6 Original artist: Opabinia regalis

• File:3-Tastenmaus_Microsoft.jpg Source: https://upload.wikimedia.org/wikipedia/commons/a/aa/3-Tastenmaus_Microsoft.jpg License: CC BY-SA 2.5 Contributors: Own work Original artist: Darkone

• File:4-critical_graph.png Source: https://upload.wikimedia.org/wikipedia/commons/7/73/4-critical_graph.png License: CC0 Contributors: Own work Original artist: Jmerm

• File:6n-graf.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/5b/6n-graf.svg License: Public domain Contributors: Image:

6n-graf.png simlar input data Original artist: User:AzaToth

• File:6n-graph2.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/28/6n-graph2.svg License: Public domain Contributors: ? Original artist: ?

• File:Abacus_6.png Source: https://upload.wikimedia.org/wikipedia/commons/a/af/Abacus_6.png License: Public domain Contributors:

• Article for “abacus”, 9th edition Encyclopedia Britannica, volume 1 (1875); scanned and uploaded by Malcolm Farmer Original artist:

Encyclopædia Britannica

• File:Ada_lovelace.jpg Source: https://upload.wikimedia.org/wikipedia/commons/0/0f/Ada_lovelace.jpg License: Public domain Contributors: www.fathom.com Original artist: Alfred Edward Chalon

• File:Arbitrary-gametree-solved.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/d7/Arbitrary-gametree-solved.svg License: Public domain Contributors:

• Arbitrary-gametree-solved.png Original artist:

• derivative work: Qef (talk)

• File:Area_parallellogram_as_determinant.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/ad/Area_parallellogram_

as_determinant.svg License: Public domain Contributors: Own work, created with Inkscape Original artist: Jitse Niesen

9.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES

123

• File:Babbage40.png Source: https://upload.wikimedia.org/wikipedia/commons/6/67/Babbage40.png License: Public domain Contributors: The Mechanic’s Magazine, Museum, Register, Journal and Gazette, October 6, 1832-March 31, 1833. Vol. XVIII. Original artist:

AGoon, derivative work, original was 'Engraved by Roﬀe, by permifsion from an original Family Painting' 1833

• File:BernoullisLawDerivationDiagram.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/20/BernoullisLawDerivationDiagram.

svg License: CC-BY-SA-3.0 Contributors: Image:BernoullisLawDerivationDiagram.png Original artist: MannyMax (original)

• File:Blochsphere.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/f3/Blochsphere.svg License: CC-BY-SA-3.0 Contributors: Transferred from en.wikipedia to Commons. Original artist: MuncherOfSpleens at English Wikipedia

• File:Braid-modular-group-cover.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/da/Braid-modular-group-cover.svg

License: Public domain Contributors: Own work, created as per: en:meta:Help:Displaying a formula#Commutative diagrams; source code

below. Original artist: Nils R. Barth

• File:CH4-structure.svg Source: https://upload.wikimedia.org/wikipedia/commons/0/0f/CH4-structure.svg License: ? Contributors:

File:Ch4-structure.png Original artist: Own work

• File:Caesar3.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/2b/Caesar3.svg License: Public domain Contributors: Own

work Original artist: Cepheus

• File:Carl_Friedrich_Gauss.jpg Source: https://upload.wikimedia.org/wikipedia/commons/9/9b/Carl_Friedrich_Gauss.jpg License: Public domain Contributors: Gauß-Gesellschaft Göttingen e.V. (Foto: A. Wittmann). Original artist: Gottlieb Biermann

A. Wittmann (photo)

• File:Commons-logo.svg Source: https://upload.wikimedia.org/wikipedia/en/4/4a/Commons-logo.svg License: ? Contributors: ? Original artist: ?

• File:Commutative_diagram_for_morphism.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/ef/Commutative_diagram_

for_morphism.svg License: Public domain Contributors: Own work, based on en:Image:MorphismComposition-01.png Original artist:

User:Cepheus

• File:Compiler.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/6b/Compiler.svg License: CC-BY-SA-3.0 Contributors:

self-made SVG version of Image:Ideal compiler.png by User:Raul654. Incorporates Image:Computer n screen.svg and Image:Nuvola

mimetypes source.png. Original artist: Surachit

• File:Complete_graph_K5.svg Source: https://upload.wikimedia.org/wikipedia/commons/c/cf/Complete_graph_K5.svg License: Public domain Contributors: Own work Original artist: David Benbennick wrote this ﬁle.

• File:Composite_trapezoidal_rule_illustration_small.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/dd/Composite_

trapezoidal_rule_illustration_small.svg License: Attribution Contributors:

• Composite_trapezoidal_rule_illustration_small.png Original artist:

• derivative work: Pbroks13 (talk)

• File:Conformal_grid_after_Möbius_transformation.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/3f/Conformal_

grid_after_M%C3%B6bius_transformation.svg License: CC BY-SA 2.5 Contributors: By Lokal_Proﬁl Original artist: Lokal_Proﬁl

• File:Corner.png Source: https://upload.wikimedia.org/wikipedia/commons/5/5f/Corner.png License: Public domain Contributors: http:

//en.wikipedia.org/wiki/File:Corner.png Original artist: Retardo

• File:DFAexample.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/9d/DFAexample.svg License: Public domain Contributors: Own work Original artist: Cepheus

• File:Determinant_example.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a7/Determinant_example.svg License: CC

BY-SA 3.0 Contributors: Own work Original artist: Krishnavedala

• File:Directed.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a2/Directed.svg License: Public domain Contributors: ?

Original artist: ?

• File:Directed_cycle.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/dc/Directed_cycle.svg License: Public domain Contributors: en:Image:Directed cycle.png Original artist: en:User:Dcoetzee, User:Stannered

• File:Earth.png Source: https://upload.wikimedia.org/wikipedia/commons/1/1e/Earth.png License: Public domain Contributors: ? Original artist: ?

• File:Ellipse_in_coordinate_system_with_semi-axes_labelled.svg Source: https://upload.wikimedia.org/wikipedia/commons/8/8e/Ellipse_

in_coordinate_system_with_semi-axes_labelled.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Jakob.scholbach

• File:Elliptic_curve_simple.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/da/Elliptic_curve_simple.svg License: CCBY-SA-3.0 Contributors:

• Elliptic_curve_simple.png Original artist:

• derivative work: Pbroks13 (talk)

• File:Emp_Tables_(Database).PNG Source: https://upload.wikimedia.org/wikipedia/commons/8/87/Emp_Tables_%28Database%29.

PNG License: Public domain Contributors: Own work Original artist: Jamesssss

• File:English.png Source: https://upload.wikimedia.org/wikipedia/commons/0/0a/English.png License: Public domain Contributors: ?

Original artist: ?

• File:Enigma.jpg Source: https://upload.wikimedia.org/wikipedia/commons/a/ae/Enigma.jpg License: Public domain Contributors: User:

Jszigetvari Original artist: ?

• File:Euclid.jpg Source: https://upload.wikimedia.org/wikipedia/commons/2/21/Euclid.jpg License: Public domain Contributors: ? Original artist: ?

• File:Fibonacci.jpg Source: https://upload.wikimedia.org/wikipedia/commons/a/a2/Fibonacci.jpg License: Public domain Contributors:

Scan from “Mathematical Circus” by Martin Gardner, published 1981 Original artist: unknown medieval artist

124

CHAPTER 9. VERTEX (GRAPH THEORY)

• File:Fivestagespipeline.png Source: https://upload.wikimedia.org/wikipedia/commons/2/21/Fivestagespipeline.png License: CC-BYSA-3.0 Contributors: ? Original artist: ?

• File:Flip_map.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/3f/Flip_map.svg License: CC BY-SA 3.0 Contributors:

derived from File:Rotation_by_pi_over_6.svg Original artist: Jakob.scholbach

• File:Flowchart.png Source: https://upload.wikimedia.org/wikipedia/commons/9/9d/Flowchart.png License: CC SA 1.0 Contributors: ?

Original artist: ?

• File:Folder_Hexagonal_Icon.svg Source: https://upload.wikimedia.org/wikipedia/en/4/48/Folder_Hexagonal_Icon.svg License: Ccby-sa-3.0 Contributors: ? Original artist: ?

• File:Four_Colour_Map_Example.svg Source: https://upload.wikimedia.org/wikipedia/commons/8/8a/Four_Colour_Map_Example.

svg License: CC-BY-SA-3.0 Contributors: Based on a this raster image by chas zzz brown on en.wikipedia. Original artist: Inductiveload

• File:GDP_PPP_Per_Capita_IMF_2008.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/d4/GDP_PPP_Per_Capita_

IMF_2008.svg License: CC BY 3.0 Contributors: Sbw01f’s work, but converted to an SVG ﬁle instead. Data from International Monetary

Fund World Economic Outlook Database April 2009 Original artist: Powerkeys

• File:GodfreyKneller-IsaacNewton-1689.jpg Source: https://upload.wikimedia.org/wikipedia/commons/3/39/GodfreyKneller-IsaacNewton-1689.

jpg License: Public domain Contributors: http://www.newton.cam.ac.uk/art/portrait.html Original artist: Sir Godfrey Kneller

• File:Gottfried_Wilhelm_von_Leibniz.jpg Source: https://upload.wikimedia.org/wikipedia/commons/6/6a/Gottfried_Wilhelm_von_

Leibniz.jpg License: Public domain Contributors: /gbrown/philosophers/leibniz/BritannicaPages/Leibniz/LeibnizGif.html Original artist:

Christoph Bernhard Francke

• File:Gravitation_space_source.png Source: https://upload.wikimedia.org/wikipedia/commons/2/26/Gravitation_space_source.png License: CC-BY-SA-3.0 Contributors: ? Original artist: ?

• File:Group_diagdram_D6.svg Source: https://upload.wikimedia.org/wikipedia/commons/0/0e/Group_diagdram_D6.svg License: Public domain Contributors: Own work Original artist: User:Cepheus

• File:HONDA_ASIMO.jpg Source: https://upload.wikimedia.org/wikipedia/commons/0/05/HONDA_ASIMO.jpg License: CC-BYSA-3.0 Contributors: ? Original artist: ?

• File:Human_eye,_rendered_from_Eye.png Source: https://upload.wikimedia.org/wikipedia/commons/5/51/Human_eye%2C_rendered_

from_Eye.png License: CC-BY-SA-3.0 Contributors: Own work by the original uploader This ﬁle was derived from: Eye.svg

Original artist: Kenny sh at English Wikipedia

• File:Hyperbola2_SVG.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/d9/Hyperbola2_SVG.svg License: CC BY-SA

3.0 Contributors: Own work Original artist: IkamusumeFan

• File:Hyperbolic_triangle.svg Source: https://upload.wikimedia.org/wikipedia/commons/8/89/Hyperbolic_triangle.svg License: Public

domain Contributors: ? Original artist: ?

• File:Ideal_compiler.png Source: https://upload.wikimedia.org/wikipedia/commons/2/20/Ideal_compiler.png License: CC-BY-SA-3.0

Contributors: ? Original artist: ?

• File:Illustration_to_Euclid’{}s_proof_of_the_Pythagorean_theorem.svg Source: https://upload.wikimedia.org/wikipedia/commons/

2/26/Illustration_to_Euclid%27s_proof_of_the_Pythagorean_theorem.svg License: WTFPL Contributors: ? Original artist: ?

• File:Integral_as_region_under_curve.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/f2/Integral_as_region_under_

curve.svg License: CC-BY-SA-3.0 Contributors: Own work, based on JPG version Original artist: 4C

• File:Internet_map_1024.jpg Source: https://upload.wikimedia.org/wikipedia/commons/d/d2/Internet_map_1024.jpg License: CC BY

2.5 Contributors: Originally from the English Wikipedia; description page is/was here. Original artist: The Opte Project

• File:Jordan_blocks.svg Source: https://upload.wikimedia.org/wikipedia/commons/4/4f/Jordan_blocks.svg License: CC BY-SA 3.0

Contributors: Own work Original artist: Jakob.scholbach

• File:Julia_iteration_data.png Source: https://upload.wikimedia.org/wikipedia/commons/4/47/Julia_iteration_data.png License: GFDL

Contributors: Own work Original artist: Adam majewski

• File:Kapitolinischer_Pythagoras_adjusted.jpg Source: https://upload.wikimedia.org/wikipedia/commons/1/1a/Kapitolinischer_Pythagoras_

adjusted.jpg License: CC-BY-SA-3.0 Contributors: First upload to Wikipedia: de.wikipedia; description page is/was here.

Original artist: The original uploader was Galilea at German Wikipedia

• File:KnnClassification.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/e7/KnnClassification.svg License: CC-BYSA-3.0 Contributors: Own work Original artist: Antti Ajanki AnAj

• File:Konigsberg_bridges.png Source: https://upload.wikimedia.org/wikipedia/commons/5/5d/Konigsberg_bridges.png License: CCBY-SA-3.0 Contributors: Public domain (PD), based on the image

• <a href='//commons.wikimedia.org/wiki/File:Image-Koenigsberg,_Map_by_Merian-Erben_1652.jpg' class='image'><img alt='ImageKoenigsberg, Map by Merian-Erben 1652.jpg' src='//upload.wikimedia.org/wikipedia/commons/thumb/1/15/Image-Koenigsberg%

2C_Map_by_Merian-Erben_1652.jpg/120px-Image-Koenigsberg%2C_Map_by_Merian-Erben_1652.jpg' width='120' height='84'

srcset='//upload.wikimedia.org/wikipedia/commons/thumb/1/15/Image-Koenigsberg%2C_Map_by_Merian-Erben_1652.jpg/180px-Image-Koenigsbe

2C_Map_by_Merian-Erben_1652.jpg 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/1/15/Image-Koenigsberg%2C_

Map_by_Merian-Erben_1652.jpg/240px-Image-Koenigsberg%2C_Map_by_Merian-Erben_1652.jpg 2x' data-ﬁle-width='628' dataﬁle-height='437' /></a>

Original artist: Bogdan Giuşcă

• File:Labelled_undirected_graph.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a5/Labelled_undirected_graph.svg

License: CC BY-SA 3.0 Contributors: derived from http://en.wikipedia.org/wiki/File:6n-graph2.svg Original artist: Jakob.scholbach

• File:Lambda_lc.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/39/Lambda_lc.svg License: Public domain Contributors: The Greek alphabet Original artist: User:Luks

9.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES

125

• File:Lattice_of_the_divisibility_of_60.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/51/Lattice_of_the_divisibility_

of_60.svg License: CC-BY-SA-3.0 Contributors: ? Original artist: ?

• File:Leonhard_Euler_2.jpg Source: https://upload.wikimedia.org/wikipedia/commons/6/60/Leonhard_Euler_2.jpg License: Public

domain Contributors:

• 2011-12-22 (upload, according to EXIF data)

Original artist: Jakob Emanuel Handmann

• File:Limitcycle.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/91/Limitcycle.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Gargan

• File:Lorenz_attractor.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/f4/Lorenz_attractor.svg License: CC BY 2.5

Contributors: ? Original artist: ?

• File:Lorenz_attractor_yb.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/5b/Lorenz_attractor_yb.svg License: CCBY-SA-3.0 Contributors: ? Original artist: ?

• File:Mandel_zoom_07_satellite.jpg Source: https://upload.wikimedia.org/wikipedia/commons/b/b3/Mandel_zoom_07_satellite.jpg License: CC-BY-SA-3.0 Contributors: ? Original artist: ?

• File:Market_Data_Index_NYA_on_20050726_202628_UTC.png Source: https://upload.wikimedia.org/wikipedia/commons/4/46/Market_

Data_Index_NYA_on_20050726_202628_UTC.png License: Public domain Contributors: ? Original artist: ?

• File:Markov_chain_SVG.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/29/Markov_chain_SVG.svg License: CC

BY-SA 3.0 Contributors: This graphic was created with matplotlib. Original artist: IkamusumeFan

• File:Matrix.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/bb/Matrix.svg License: GFDL Contributors: Own work

Original artist: Lakeworks

• File:Matrix_multiplication_diagram_2.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/eb/Matrix_multiplication_diagram_

2.svg License: CC-BY-SA-3.0 Contributors: This ﬁle was derived from: Matrix multiplication diagram.svg

Original artist: File:Matrix multiplication diagram.svg:User:Bilou

• File:Maximum_boxed.png Source: https://upload.wikimedia.org/wikipedia/commons/1/1a/Maximum_boxed.png License: Public domain Contributors: Created with the help of GraphCalc Original artist: Freiddy

• File:Maya.svg Source: https://upload.wikimedia.org/wikipedia/commons/1/1b/Maya.svg License: CC-BY-SA-3.0 Contributors: Image:

Maya.png Original artist: Bryan Derksen

• File:Measure_illustration.png Source: https://upload.wikimedia.org/wikipedia/commons/a/a6/Measure_illustration.png License: Public domain Contributors: self-made with en:Inkscape Original artist: Oleg Alexandrov

• File:MeningiomaMRISegmentation.png Source: https://upload.wikimedia.org/wikipedia/commons/e/e4/MeningiomaMRISegmentation.

png License: CC BY-SA 3.0 Contributors:

• Own work by the original uploader

• I used the open source package 3D slicer to segment the meningiom in a data set

Original artist: Rkikinis

• File:Multi-pseudograph.svg Source: https://upload.wikimedia.org/wikipedia/commons/c/c9/Multi-pseudograph.svg License: CC BYSA 3.0 Contributors: Own work Original artist: 0x24a537r9

• File:NOR_ANSI.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/6c/NOR_ANSI.svg License: Public domain Contributors: Own Drawing, made in Inkscape 0.43 Original artist: jjbeard

• File:Naphthalene-3D-balls.png Source: https://upload.wikimedia.org/wikipedia/commons/3/3e/Naphthalene-3D-balls.png License: Public domain Contributors: ? Original artist: ?

• File:Navier_Stokes_Laminar.svg Source: https://upload.wikimedia.org/wikipedia/commons/7/73/Navier_Stokes_Laminar.svg License:

CC BY-SA 4.0 Contributors: Own work Original artist: IkamusumeFan

• File:Network_Library_LAN.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/b9/Network_Library_LAN.svg License:

CC BY-SA 4.0 Contributors: <a href='//commons.wikimedia.org/wiki/File:NETWORK-Library-LAN.png' class='image'><img alt='NETWORKLibrary-LAN.png' src='//upload.wikimedia.org/wikipedia/commons/thumb/2/27/NETWORK-Library-LAN.png/100px-NETWORK-Library-LAN.

png' width='100' height='74' srcset='//upload.wikimedia.org/wikipedia/commons/thumb/2/27/NETWORK-Library-LAN.png/150px-NETWORK-Library-LA

png 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/2/27/NETWORK-Library-LAN.png/200px-NETWORK-Library-LAN.

png 2x' data-ﬁle-width='1465' data-ﬁle-height='1090' /></a> Original artist: Fred the Oyster

• File:Neuron.png Source: https://upload.wikimedia.org/wikipedia/commons/6/67/Neuron.png License: Public domain Contributors: Originally Neuron.svg -> originally Neuron.jpg taken from the US Federal (public domain) Original artist: Caiguanhao

• File:Neuron.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/b5/Neuron.svg License: CC-BY-SA-3.0 Contributors: ?

Original artist: ?

• File:Nicolas_P._Rougier’{}s_rendering_of_the_human_brain.png Source: https://upload.wikimedia.org/wikipedia/commons/7/73/

Nicolas_P._Rougier%27s_rendering_of_the_human_brain.png License: GPL Contributors: http://www.loria.fr/~{}rougier Original artist:

Nicolas Rougier

• File:Nuvola_apps_atlantik.png Source: https://upload.wikimedia.org/wikipedia/commons/7/77/Nuvola_apps_atlantik.png License: LGPL

Contributors: http://icon-king.com Original artist: David Vignoni / ICON KING

• File:Nuvola_apps_edu_mathematics_blue-p.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/3e/Nuvola_apps_edu_

mathematics_blue-p.svg License: GPL Contributors: Derivative work from Image:Nuvola apps edu mathematics.png and Image:Nuvola

apps edu mathematics-p.svg Original artist: David Vignoni (original icon); Flamurai (SVG convertion); bayo (color)

126

CHAPTER 9. VERTEX (GRAPH THEORY)

• File:Nuvola_apps_kaboodle.svg Source: https://upload.wikimedia.org/wikipedia/commons/1/1b/Nuvola_apps_kaboodle.svg License:

LGPL Contributors: http://ftp.gnome.org/pub/GNOME/sources/gnome-themes-extras/0.9/gnome-themes-extras-0.9.0.tar.gz Original artist:

David Vignoni / ICON KING

• File:Oldfaithful3.png Source: https://upload.wikimedia.org/wikipedia/commons/0/0f/Oldfaithful3.png License: Public domain Contributors: ? Original artist: ?

• File:Open_book_nae_02.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/92/Open_book_nae_02.svg License: CC0

Contributors: OpenClipart Original artist: nae

• File:Operating_system_placement.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/e1/Operating_system_placement.

svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Golftheman

• File:Padlock.svg Source: https://upload.wikimedia.org/wikipedia/en/5/59/Padlock.svg License: PD Contributors: ? Original artist: ?

• File:People_icon.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/37/People_icon.svg License: CC0 Contributors: OpenClipart Original artist: OpenClipart

• File:Pert_chart_colored.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/37/Pert_chart_colored.svg License: Public

domain Contributors: This ﬁle was derived from: Pert chart colored.gif: <a href='//commons.wikimedia.org/wiki/File:Pert_chart_colored.

gif' class='image'><img alt='Pert chart colored.gif' src='//upload.wikimedia.org/wikipedia/commons/thumb/b/b9/Pert_chart_colored.

gif/50px-Pert_chart_colored.gif' width='50' height='31' srcset='//upload.wikimedia.org/wikipedia/commons/thumb/b/b9/Pert_chart_colored.

gif/75px-Pert_chart_colored.gif 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/b/b9/Pert_chart_colored.gif/100px-Pert_chart_

colored.gif 2x' data-ﬁle-width='309' data-ﬁle-height='190' /></a>

Original artist: Pert_chart_colored.gif: Original uploader was Jeremykemp at en.wikipedia

• File:Portal-puzzle.svg Source: https://upload.wikimedia.org/wikipedia/en/f/fd/Portal-puzzle.svg License: Public domain Contributors:

? Original artist: ?

• File:Python_add5_syntax.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/e1/Python_add5_syntax.svg License: Copyrighted free use Contributors: http://en.wikipedia.org/wiki/Image:Python_add5_syntax.png Original artist: Xander89

• File:Quark_wiki.jpg Source: https://upload.wikimedia.org/wikipedia/commons/c/cb/Quark_wiki.jpg License: CC BY-SA 3.0 Contributors: Own work Original artist: Brianzero

• File:Question_book-new.svg Source: https://upload.wikimedia.org/wikipedia/en/9/99/Question_book-new.svg License: Cc-by-sa-3.0

Contributors:

Created from scratch in Adobe Illustrator. Based on Image:Question book.png created by User:Equazcion Original artist:

Tkgd2007

• File:Roomba_original.jpg Source: https://upload.wikimedia.org/wikipedia/commons/f/f5/Roomba_original.jpg License: CC BY-SA

3.0 Contributors: © 2006 Larry D. Moore Original artist: Larry D. Moore

• File:Rotation_by_pi_over_6.svg Source: https://upload.wikimedia.org/wikipedia/commons/8/8e/Rotation_by_pi_over_6.svg License:

Public domain Contributors: Own work using Inkscape Original artist: RobHar

• File:Rubik’{}s_cube.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a6/Rubik%27s_cube.svg License: CC-BY-SA3.0 Contributors: Based on Image:Rubiks cube.jpg Original artist: This image was created by me, Booyabazooka

• File:SIMD.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/21/SIMD.svg License: CC-BY-SA-3.0 Contributors: Own

work in Inkscape Original artist: en:User:Cburnett

• File:Saddle_Point_SVG.svg Source: https://upload.wikimedia.org/wikipedia/commons/0/0d/Saddle_Point_SVG.svg License: CC BYSA 3.0 Contributors: This graphic was created with matplotlib. Original artist: IkamusumeFan

• File:Scaling_by_1.5.svg Source: https://upload.wikimedia.org/wikipedia/commons/c/c7/Scaling_by_1.5.svg License: Public domain

Contributors: Own work using Inkscape Original artist: RobHar

• File:Signal_transduction_pathways.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/b0/Signal_transduction_pathways.

svg License: CC BY-SA 3.0 Contributors: http://en.wikipedia.org/wiki/File:Signal_transduction_v1.png Original artist: cybertory

• File:Simple_feedback_control_loop2.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/90/Simple_feedback_control_

loop2.svg License: CC BY-SA 3.0 Contributors: This ﬁle was derived from: Simple feedback control loop2.png: <a href='//commons.

wikimedia.org/wiki/File:Simple_feedback_control_loop2.png' class='image'><img alt='Simple feedback control loop2.png' src='//upload.

wikimedia.org/wikipedia/commons/thumb/4/45/Simple_feedback_control_loop2.png/50px-Simple_feedback_control_loop2.png' width='50'

height='17' srcset='//upload.wikimedia.org/wikipedia/commons/thumb/4/45/Simple_feedback_control_loop2.png/75px-Simple_feedback_

control_loop2.png 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/4/45/Simple_feedback_control_loop2.png/100px-Simple_

feedback_control_loop2.png 2x' data-ﬁle-width='439' data-ﬁle-height='150' /></a>

Original artist: Simple_feedback_control_loop2.png: Corona

• File:SimplexRangeSearching.png Source: https://upload.wikimedia.org/wikipedia/commons/4/48/SimplexRangeSearching.png License:

Public domain Contributors: Transferred from en.wikipedia Original artist: Original uploader was Gfonsecabr at en.wikipedia. Later version(s) were uploaded by McLoaf at en.wikipedia.

• File:Singly_linked_list.png Source: https://upload.wikimedia.org/wikipedia/commons/3/37/Singly_linked_list.png License: Public domain Contributors: Copied from en. Originally uploaded by Dcoetzee. Original artist: Derrick Coetzee (User:Dcoetzee)

• File:Sinusvåg_400px.png Source: https://upload.wikimedia.org/wikipedia/commons/8/8c/Sinusv%C3%A5g_400px.png License: Public domain Contributors: ? Original artist: User Solkoll on sv.wikipedia

• File:Sky.png Source: https://upload.wikimedia.org/wikipedia/commons/0/08/Sky.png License: CC BY-SA 2.5 Contributors: selbst gemacht.

own work. Original artist: Manuel Strehl

• File:Sorting_quicksort_anim.gif Source: https://upload.wikimedia.org/wikipedia/commons/6/6a/Sorting_quicksort_anim.gif License:

CC-BY-SA-3.0 Contributors: originally upload on the English Wikipedia Original artist: Wikipedia:en:User:RolandH

• File:Sorting_quicksort_anim_frame.png Source: https://upload.wikimedia.org/wikipedia/commons/1/1e/Sorting_quicksort_anim_frame.

png License: CC-BY-SA-3.0 Contributors: Image:Sorting quicksort anim.gif Original artist: en:User:RolandH

9.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES

127

• File:Squeeze_r=1.5.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/67/Squeeze_r%3D1.5.svg License: Public domain

Contributors: Own work Original artist: RobHar

• File:Symbol_book_class2.svg Source: https://upload.wikimedia.org/wikipedia/commons/8/89/Symbol_book_class2.svg License: CC

BY-SA 2.5 Contributors: Mad by Lokal_Proﬁl by combining: Original artist: Lokal_Proﬁl

• File:TSP_Deutschland_3.png Source: https://upload.wikimedia.org/wikipedia/commons/c/c4/TSP_Deutschland_3.png License: Public domain Contributors: https://www.cia.gov/cia/publications/factbook/maps/gm-map.gif Original artist: The original uploader was

Kapitän Nemo at German Wikipedia

• File:Text_document_with_red_question_mark.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a4/Text_document_

with_red_question_mark.svg License: Public domain Contributors: Created by bdesham with Inkscape; based upon Text-x-generic.svg

from the Tango project. Original artist: Benjamin D. Esham (bdesham)

• File:Torus.png Source: https://upload.wikimedia.org/wikipedia/commons/1/17/Torus.png License: Public domain Contributors: ? Original artist: ?

• File:Tree_graph.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/24/Tree_graph.svg License: Public domain Contributors: ? Original artist: ?

• File:TruncatedTetrahedron.gif Source: https://upload.wikimedia.org/wikipedia/commons/0/0f/TruncatedTetrahedron.gif License: Public domain Contributors: Own work Original artist: Radagast3

• File:Two_red_dice_01.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/36/Two_red_dice_01.svg License: CC0 Contributors: Open Clip Art Library Original artist: Stephen Silver

• File:Ulam_1.png Source: https://upload.wikimedia.org/wikipedia/commons/6/69/Ulam_1.png License: CC-BY-SA-3.0 Contributors:

Transferred from en.wikipedia to Commons. Original artist: Grontesca at English Wikipedia

• File:Undirected.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/bf/Undirected.svg License: Public domain Contributors: ? Original artist: ?

• File:User-FastFission-brain.gif Source: https://upload.wikimedia.org/wikipedia/commons/c/c7/User-FastFission-brain.gif License: CCBY-SA-3.0 Contributors: ? Original artist: ?

• File:Utah_teapot_simple_2.png Source: https://upload.wikimedia.org/wikipedia/commons/5/5f/Utah_teapot_simple_2.png License:

CC BY-SA 3.0 Contributors: Own work Original artist: Dhatﬁeld

• File:Vector_field.svg Source: https://upload.wikimedia.org/wikipedia/commons/c/c2/Vector_field.svg License: Public domain Contributors: Own work Original artist: Fibonacci.

• File:Venn_A_intersect_B.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/6d/Venn_A_intersect_B.svg License: Public domain Contributors: Own work Original artist: Cepheus

• File:VerticalShear_m=1.25.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/92/VerticalShear_m%3D1.25.svg License:

Public domain Contributors: Own work using Inkscape Original artist: RobHar

• File:Wacom_graphics_tablet_and_pen.png Source: https://upload.wikimedia.org/wikipedia/commons/d/d4/Wacom_graphics_tablet_

and_pen.png License: CC BY-SA 3.0 Contributors:

• Wacom_Pen-tablet_without_mouse.jpg Original artist: Wacom_Pen-tablet_without_mouse.jpg: *Wacom_Pen-tablet.jpg: photographed

by Tobias Rütten,Metoc

• File:Wang_tiles.png Source: https://upload.wikimedia.org/wikipedia/commons/0/06/Wang_tiles.png License: Public domain Contributors: ? Original artist: ?

• File:Wikibooks-logo-en-noslogan.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/df/Wikibooks-logo-en-noslogan.

svg License: CC BY-SA 3.0 Contributors: Own work Original artist: User:Bastique, User:Ramac et al.

• File:Wikibooks-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/fa/Wikibooks-logo.svg License: CC BY-SA 3.0

Contributors: Own work Original artist: User:Bastique, User:Ramac et al.

• File:Wikinews-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/24/Wikinews-logo.svg License: CC BY-SA 3.0

Contributors: This is a cropped version of Image:Wikinews-logo-en.png. Original artist: Vectorized by Simon 01:05, 2 August 2006

(UTC) Updated by Time3000 17 April 2007 to use oﬃcial Wikinews colours and appear correctly on dark backgrounds. Originally

uploaded by Simon.

• File:WikipediaBinary.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/bb/WikipediaBinary.svg License: CC BY 2.5

Contributors: Transferred from en.wikipedia; transferred to Commons by User:Sfan00_IMG using CommonsHelper.

Original artist: User:Spinningspark.

• File:Wikipedia_multilingual_network_graph_July_2013.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/5b/Wikipedia_

multilingual_network_graph_July_2013.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Computermacgyver

• File:Wikiquote-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/fa/Wikiquote-logo.svg License: Public domain

Contributors: ? Original artist: ?

• File:Wikisource-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/4/4c/Wikisource-logo.svg License: CC BY-SA

3.0 Contributors: Rei-artur Original artist: Nicholas Moreau

• File:Wikiversity-logo-Snorky.svg Source: https://upload.wikimedia.org/wikipedia/commons/1/1b/Wikiversity-logo-en.svg License:

CC BY-SA 3.0 Contributors: Own work Original artist: Snorky

• File:Wikiversity-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/91/Wikiversity-logo.svg License: CC BY-SA

3.0 Contributors: Snorky (optimized and cleaned up by verdy_p) Original artist: Snorky (optimized and cleaned up by verdy_p)

• File:Wiktionary-logo-en.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/f8/Wiktionary-logo-en.svg License: Public

domain Contributors: Vector version of Image:Wiktionary-logo-en.png. Original artist: Vectorized by Fvasconcellos (talk · contribs),

based on original logo tossed together by Brion Vibber

128

CHAPTER 9. VERTEX (GRAPH THEORY)

9.5.3

Content license

• Creative Commons Attribution-Share Alike 3.0

From Wikipedia, the free encyclopedia

Contents

1

Computer science

1

1.1

History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

1

1.1.1

Contributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

4

Philosophy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

5

1.2.1

Name of the ﬁeld . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

5

Areas of computer science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

6

1.3.1

Theoretical computer science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

7

1.3.2

Applied computer science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

8

1.4

The great insights of computer science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

10

1.5

Academia . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

11

1.5.1

Conferences . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

11

1.5.2

Journals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

11

1.6

Education . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

11

1.7

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

12

1.8

Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

13

1.9

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

13

1.10 Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

15

1.11 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

16

Discrete mathematics

18

2.1

Grand challenges, past and present . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

19

2.2

Topics in discrete mathematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

19

2.2.1

Theoretical computer science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

19

2.2.2

Information theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

21

2.2.3

Logic . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

21

2.2.4

Set theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

21

2.2.5

Combinatorics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

23

2.2.6

Graph theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

23

2.2.7

Probability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

24

2.2.8

Number theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

24

2.2.9

Algebra . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

25

2.2.10 Calculus of ﬁnite diﬀerences, discrete calculus or discrete analysis . . . . . . . . . . . . . .

25

2.2.11 Geometry . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

25

1.2

1.3

2

i

ii

3

CONTENTS

2.2.12 Topology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

25

2.2.13 Operations research . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

25

2.2.14 Game theory, decision theory, utility theory, social choice theory . . . . . . . . . . . . . .

26

2.2.15 Discretization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

27

2.2.16 Discrete analogues of continuous mathematics . . . . . . . . . . . . . . . . . . . . . . . .

27

2.2.17 Hybrid discrete and continuous mathematics . . . . . . . . . . . . . . . . . . . . . . . . .

27

2.3

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

28

2.4

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

28

2.5

Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

28

2.6

External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

29

Glossary of graph theory

30

3.1

Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

30

3.1.1

Subgraphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

32

3.1.2

Walks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

33

3.1.3

Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

34

3.1.4

Cliques . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

34

3.1.5

Strongly connected component . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

35

3.1.6

Hypercubes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

36

3.1.7

Knots . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

36

3.1.8

Minors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

36

3.1.9

Embedding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

37

Adjacency and degree . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

37

3.2.1

Independence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

38

3.3

Complexity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

38

3.4

Connectivity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

38

3.5

Distance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

39

3.6

Genus . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

39

3.7

Weighted graphs and networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

40

3.8

Direction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

40

3.8.1

Directed acyclic graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

41

Colouring . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

41

3.10 Various . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

42

3.11 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

42

3.12 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

42

Graph (mathematics)

44

4.1

Deﬁnitions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

45

4.1.1

Graph . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

45

4.1.2

Adjacency relation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

45

Types of graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

45

4.2.1

45

3.2

3.9

4

4.2

Distinction in terms of the main deﬁnition . . . . . . . . . . . . . . . . . . . . . . . . . .

CONTENTS

4.2.2

5

iii

Important graph classes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

47

4.3

Properties of graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

48

4.4

Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

49

4.5

Important graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

50

4.6

Operations on graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

50

4.7

Generalizations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

51

4.8

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

51

4.9

Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

52

4.10 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

52

4.11 Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

53

4.12 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

53

Graph theory

54

5.1

Deﬁnitions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

55

5.1.1

Graph . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

55

5.2

Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

55

5.3

History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

57

5.4

Graph drawing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

58

5.5

Graph-theoretic data structures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

58

5.6

Problems in graph theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

59

5.6.1

Enumeration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

59

5.6.2

Subgraphs, induced subgraphs, and minors . . . . . . . . . . . . . . . . . . . . . . . . . .

59

5.6.3

Graph coloring . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

59

5.6.4

Subsumption and uniﬁcation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

60

5.6.5

Route problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

60

5.6.6

Network ﬂow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

60

5.6.7

Visibility problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

60

5.6.8

Covering problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

60

5.6.9

Decomposition problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

61

5.6.10 Graph classes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

61

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

61

5.7.1

Related topics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

61

5.7.2

Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

62

5.7.3

Subareas . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

63

5.7.4

Related areas of mathematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

63

5.7.5

Generalizations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

63

5.7.6

Prominent graph theorists . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

63

5.8

Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

64

5.9

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

65

5.10 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

65

5.10.1 Online textbooks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

65

5.7

iv

6

7

CONTENTS

Loop (graph theory)

66

6.1

Degree . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

67

6.2

Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

67

6.3

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

67

6.4

External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

67

6.5

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

67

Mathematics

69

7.1

History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

70

7.1.1

Evolution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

70

7.1.2

Etymology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

72

Deﬁnitions of mathematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

73

7.2.1

Mathematics as science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

73

7.3

Inspiration, pure and applied mathematics, and aesthetics . . . . . . . . . . . . . . . . . . . . . . .

76

7.4

Notation, language, and rigor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

77

7.5

Fields of mathematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

77

7.5.1

Foundations and philosophy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

78

7.5.2

Pure mathematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

79

7.5.3

Applied mathematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

81

7.6

Mathematical awards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

81

7.7

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

82

7.8

Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

82

7.9

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

84

7.10 Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

85

7.11 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

85

Matrix (mathematics)

87

8.1

Deﬁnition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

88

8.1.1

Size . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

88

8.2

Notation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

89

8.3

Basic operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

89

8.3.1

Addition, scalar multiplication and transposition . . . . . . . . . . . . . . . . . . . . . . .

89

8.3.2

Matrix multiplication . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

90

8.3.3

Row operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

91

8.3.4

Submatrix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

91

8.4

Linear equations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

92

8.5

Linear transformations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

92

8.6

Square matrices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

92

8.6.1

Main types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

93

8.6.2

Main operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

95

8.7

Computational aspects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

96

8.8

Decomposition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

97

7.2

8

CONTENTS

v

8.9

Abstract algebraic aspects and generalizations . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

98

8.9.1

Matrices with more general entries . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

98

8.9.2

Relationship to linear maps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

99

8.9.3

Matrix groups . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

99

8.9.4

Inﬁnite matrices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100

8.9.5

Empty matrices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100

8.10 Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101

8.10.1 Graph theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101

8.10.2 Analysis and geometry . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101

8.10.3 Probability theory and statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102

8.10.4 Symmetries and transformations in physics . . . . . . . . . . . . . . . . . . . . . . . . . . 103

8.10.5 Linear combinations of quantum states . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104

8.10.6 Normal modes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104

8.10.7 Geometrical optics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105

8.10.8 Electronics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105

8.11 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105

8.11.1 Other historical usages of the word “matrix” in mathematics . . . . . . . . . . . . . . . . . 106

8.12 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106

8.13 Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107

8.14 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110

8.14.1 Physics references . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112

8.14.2 Historical references . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113

8.15 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113

9

Vertex (graph theory)

115

9.1

Types of vertices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116

9.2

See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116

9.3

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116

9.4

External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116

9.5

Text and image sources, contributors, and licenses . . . . . . . . . . . . . . . . . . . . . . . . . . 117

9.5.1

Text . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117

9.5.2

Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122

9.5.3

Content license . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128

Chapter 1

Computer science

Computer science deals with the theoretical foundations of information and computation, together with practical

techniques for the implementation and application of these foundations

Computer science is the scientiﬁc and practical approach to computation and its applications. It is the systematic

study of the feasibility, structure, expression, and mechanization of the methodical procedures (or algorithms) that

underlie the acquisition, representation, processing, storage, communication of, and access to information, whether

such information is encoded as bits in a computer memory or transcribed in genes and protein structures in a biological

cell.[1] An alternate, more succinct deﬁnition of computer science is the study of automating algorithmic processes

that scale. A computer scientist specializes in the theory of computation and the design of computational systems.[2]

Its subﬁelds can be divided into a variety of theoretical and practical disciplines. Some ﬁelds, such as computational

complexity theory (which explores the fundamental properties of computational and intractable problems), are highly

abstract, while ﬁelds such as computer graphics emphasize real-world visual applications. Still other ﬁelds focus on the

challenges in implementing computation. For example, programming language theory considers various approaches

to the description of computation, while the study of computer programming itself investigates various aspects of

the use of programming language and complex systems. Human–computer interaction considers the challenges in

making computers and computations useful, usable, and universally accessible to humans.

1.1 History

Main article: History of computer science

The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating ﬁxed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division. Further, algorithms for performing computations have existed since

antiquity, even before sophisticated computing equipment were created. The ancient Sanskrit treatise Shulba Sutras,

or “Rules of the Chord”, is a book of algorithms written in 800 BCE for constructing geometric objects like altars

using a peg and chord, an early precursor of the modern ﬁeld of computational geometry.

Blaise Pascal designed and constructed the ﬁrst working mechanical calculator, Pascal’s calculator, in 1642.[3] In

1673 Gottfried Leibniz demonstrated a digital mechanical calculator, called the 'Stepped Reckoner'.[4] He may be

considered the ﬁrst computer scientist and information theorist, for, among other reasons, documenting the binary

number system. In 1820, Thomas de Colmar launched the mechanical calculator industry[note 1] when he released

his simpliﬁed arithmometer, which was the ﬁrst calculating machine strong enough and reliable enough to be used

daily in an oﬃce environment. Charles Babbage started the design of the ﬁrst automatic mechanical calculator, his

diﬀerence engine, in 1822, which eventually gave him the idea of the ﬁrst programmable mechanical calculator, his

Analytical Engine.[5] He started developing this machine in 1834 and “in less than two years he had sketched out

many of the salient features of the modern computer. A crucial step was the adoption of a punched card system

derived from the Jacquard loom”[6] making it inﬁnitely programmable.[note 2] In 1843, during the translation of a

French article on the analytical engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm

to compute the Bernoulli numbers, which is considered to be the ﬁrst computer program.[7] Around 1885, Herman

Hollerith invented the tabulator, which used punched cards to process statistical information; eventually his company

1

2

CHAPTER 1. COMPUTER SCIENCE

Charles Babbage is credited with inventing the ﬁrst mechanical computer.

became part of IBM. In 1937, one hundred years after Babbage’s impossible dream, Howard Aiken convinced IBM,

which was making all kinds of punched card equipment and was also in the calculator business[8] to develop his giant

programmable calculator, the ASCC/Harvard Mark I, based on Babbage’s analytical engine, which itself used cards

and a central computing unit. When the machine was ﬁnished, some hailed it as “Babbage’s dream come true”.[9]

During the 1940s, as new and more powerful computing machines were developed, the term computer came to

refer to the machines rather than their human predecessors.[10] As it became clear that computers could be used for

more than just mathematical calculations, the ﬁeld of computer science broadened to study computation in general.

Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s.[11][12] The

world’s ﬁrst computer science degree program, the Cambridge Diploma in Computer Science, began at the University

of Cambridge Computer Laboratory in 1953. The ﬁrst computer science degree program in the United States was

formed at Purdue University in 1962.[13] Since practical computers became available, many applications of computing

have become distinct areas of study in their own rights.

Although many initially believed it was impossible that computers themselves could actually be a scientiﬁc ﬁeld of

study, in the late ﬁfties it gradually became accepted among the greater academic population.[14][15] It is the now wellknown IBM brand that formed part of the computer science revolution during this time. IBM (short for International

Business Machines) released the IBM 704[16] and later the IBM 709[17] computers, which were widely used during the

exploration period of such devices. “Still, working with the IBM [computer] was frustrating ... if you had misplaced

as much as one letter in one instruction, the program would crash, and you would have to start the whole process

1.1. HISTORY

3

Ada Lovelace is credited with writing the ﬁrst algorithm intended for processing on a computer.

over again”.[14] During the late 1950s, the computer science discipline was very much in its developmental stages,

and such issues were commonplace.[15]

Time has seen signiﬁcant improvements in the usability and eﬀectiveness of computing technology. Modern society

has seen a signiﬁcant shift in the users of computer technology, from usage only by experts and professionals, to

a near-ubiquitous user base. Initially, computers were quite costly, and some degree of human aid was needed for

4

CHAPTER 1. COMPUTER SCIENCE

eﬃcient use - in part from professional computer operators. As computer adoption became more widespread and

aﬀordable, less human assistance was needed for common usage.

1.1.1

Contributions

The German military used the Enigma machine (shown here) during World War II for communications they wanted kept secret. The

large-scale decryption of Enigma traﬃc at Bletchley Park was an important factor that contributed to Allied victory in WWII.[18]

1.2. PHILOSOPHY

5

Despite its short history as a formal academic discipline, computer science has made a number of fundamental

contributions to science and society - in fact, along with electronics, it is a founding science of the current epoch of

human history called the Information Age and a driver of the Information Revolution, seen as the third major leap

in human technological progress after the Industrial Revolution (1750-1850 CE) and the Agricultural Revolution

(8000-5000 BCE).

These contributions include:

• The start of the "digital revolution", which includes the current Information Age and the Internet.[19]

• A formal deﬁnition of computation and computability, and proof that there are computationally unsolvable and

intractable problems.[20]

• The concept of a programming language, a tool for the precise expression of methodological information at

various levels of abstraction.[21]

• In cryptography, breaking the Enigma code was an important factor contributing to the Allied victory in World

War II.[18]

• Scientiﬁc computing enabled practical evaluation of processes and situations of great complexity, as well as

experimentation entirely by software. It also enabled advanced study of the mind, and mapping of the human genome became possible with the Human Genome Project.[19] Distributed computing projects such as

Folding@home explore protein folding.

• Algorithmic trading has increased the eﬃciency and liquidity of ﬁnancial markets by using artiﬁcial intelligence,

machine learning, and other statistical and numerical techniques on a large scale.[22] High frequency algorithmic

trading can also exacerbate volatility.[23]

• Computer graphics and computer-generated imagery have become ubiquitous in modern entertainment, particularly in television, cinema, advertising, animation and video games. Even ﬁlms that feature no explicit CGI

are usually “ﬁlmed” now on digital cameras, or edited or postprocessed using a digital video editor.

• Simulation of various processes, including computational ﬂuid dynamics, physical, electrical, and electronic

systems and circuits, as well as societies and social situations (notably war games) along with their habitats,

among many others. Modern computers enable optimization of such designs as complete aircraft. Notable

in electrical and electronic circuit design are SPICE, as well as software for physical realization of new (or

modiﬁed) designs. The latter includes essential design software for integrated circuits.

• Artiﬁcial intelligence is becoming increasingly important as it gets more eﬃcient and complex. There are many

applications of AI, some of which can be seen at home, such as robotic vacuum cleaners. It is also present in

video games and on the modern battleﬁeld in drones, anti-missile systems, and squad support robots.

1.2 Philosophy

Main article: Philosophy of computer science

A number of computer scientists have argued for the distinction of three separate paradigms in computer science.

Peter Wegner argued that those paradigms are science, technology, and mathematics.[24] Peter Denning's working

group argued that they are theory, abstraction (modeling), and design.[25] Amnon H. Eden described them as the

“rationalist paradigm” (which treats computer science as a branch of mathematics, which is prevalent in theoretical

computer science, and mainly employs deductive reasoning), the “technocratic paradigm” (which might be found in

engineering approaches, most prominently in software engineering), and the “scientiﬁc paradigm” (which approaches

computer-related artifacts from the empirical perspective of natural sciences, identiﬁable in some branches of artiﬁcial

intelligence).[26]

1.2.1

Name of the ﬁeld

Although ﬁrst proposed in 1956,[15] the term “computer science” appears in a 1959 article in Communications of the

ACM,[27] in which Louis Fein argues for the creation of a Graduate School in Computer Sciences analogous to the

6

CHAPTER 1. COMPUTER SCIENCE

creation of Harvard Business School in 1921,[28] justifying the name by arguing that, like management science, the

subject is applied and interdisciplinary in nature, while having the characteristics typical of an academic discipline.[27]

His eﬀorts, and those of others such as numerical analyst George Forsythe, were rewarded: universities went on to

create such programs, starting with Purdue in 1962.[29] Despite its name, a signiﬁcant amount of computer science

does not involve the study of computers themselves. Because of this, several alternative names have been proposed.[30]

Certain departments of major universities prefer the term computing science, to emphasize precisely that diﬀerence.

Danish scientist Peter Naur suggested the term datalogy,[31] to reﬂect the fact that the scientiﬁc discipline revolves

around data and data treatment, while not necessarily involving computers. The ﬁrst scientiﬁc institution to use the

term was the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur being

the ﬁrst professor in datalogy. The term is used mainly in the Scandinavian countries. Also, in the early days of

computing, a number of terms for the practitioners of the ﬁeld of computing were suggested in the Communications

of the ACM – turingineer, turologist, ﬂow-charts-man, applied meta-mathematician, and applied epistemologist.[32]

Three months later in the same journal, comptologist was suggested, followed next year by hypologist.[33] The term

computics has also been suggested.[34] In Europe, terms derived from contracted translations of the expression “automatic information” (e.g. “informazione automatica” in Italian) or “information and mathematics” are often used,

e.g. informatique (French), Informatik (German), informatica (Italy, The Netherlands), informática (Spain, Portugal), informatika (Slavic languages and Hungarian) or pliroforiki (πληροφορική, which means informatics) in Greek.

Similar words have also been adopted in the UK (as in the School of Informatics of the University of Edinburgh).[35]

A folkloric quotation, often attributed to—but almost certainly not ﬁrst formulated by—Edsger Dijkstra, states that

“computer science is no more about computers than astronomy is about telescopes.”[note 3] The design and deployment of computers and computer systems is generally considered the province of disciplines other than computer

science. For example, the study of computer hardware is usually considered part of computer engineering, while the

study of commercial computer systems and their deployment is often called information technology or information

systems. However, there has been much cross-fertilization of ideas between the various computer-related disciplines.

Computer science research also often intersects other disciplines, such as philosophy, cognitive science, linguistics,

mathematics, physics, biology, statistics, and logic.

Computer science is considered by some to have a much closer relationship with mathematics than many scientiﬁc

disciplines, with some observers saying that computing is a mathematical science.[11] Early computer science was

strongly inﬂuenced by the work of mathematicians such as Kurt Gödel and Alan Turing, and there continues to be

a useful interchange of ideas between the two ﬁelds in areas such as mathematical logic, category theory, domain

theory, and algebra.[15]

The relationship between computer science and software engineering is a contentious issue, which is further muddied by disputes over what the term “software engineering” means, and how computer science is deﬁned.[36] David

Parnas, taking a cue from the relationship between other engineering and science disciplines, has claimed that the

principal focus of computer science is studying the properties of computation in general, while the principal focus of

software engineering is the design of speciﬁc computations to achieve practical goals, making the two separate but

complementary disciplines.[37]

The academic, political, and funding aspects of computer science tend to depend on whether a department formed

with a mathematical emphasis or with an engineering emphasis. Computer science departments with a mathematics

emphasis and with a numerical orientation consider alignment with computational science. Both types of departments

tend to make eﬀorts to bridge the ﬁeld educationally if not across all research.

1.3 Areas of computer science

As a discipline, computer science spans a range of topics from theoretical studies of algorithms and the limits of

computation to the practical issues of implementing computing systems in hardware and software.[38][39] CSAB,

formerly called Computing Sciences Accreditation Board – which is made up of representatives of the Association for

Computing Machinery (ACM), and the IEEE Computer Society (IEEE-CS)[40] – identiﬁes four areas that it considers

crucial to the discipline of computer science: theory of computation, algorithms and data structures, programming

methodology and languages, and computer elements and architecture. In addition to these four areas, CSAB also

identiﬁes ﬁelds such as software engineering, artiﬁcial intelligence, computer networking and telecommunications,

database systems, parallel computation, distributed computation, computer-human interaction, computer graphics,

operating systems, and numerical and symbolic computation as being important areas of computer science.[38]

1.3. AREAS OF COMPUTER SCIENCE

1.3.1

7

Theoretical computer science

Main article: Theoretical computer science

The broader ﬁeld of theoretical computer science encompasses both the classical theory of computation and a wide

range of other topics that focus on the more abstract, logical, and mathematical aspects of computing.

Theory of computation

Main article: Theory of computation

According to Peter J. Denning, the fundamental question underlying computer science is, “What can be (eﬃciently)

automated?" [11] The study of the theory of computation is focused on answering fundamental questions about what

can be computed and what amount of resources are required to perform those computations. In an eﬀort to answer

the ﬁrst question, computability theory examines which computational problems are solvable on various theoretical

models of computation. The second question is addressed by computational complexity theory, which studies the

time and space costs associated with diﬀerent approaches to solving a multitude of computational problems.

The famous "P=NP?" problem, one of the Millennium Prize Problems,[41] is an open problem in the theory of

computation.

Information and coding theory

Main articles: Information theory and Coding theory

Information theory is related to the quantiﬁcation of information. This was developed by Claude E. Shannon to ﬁnd

fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data.[42] Coding theory is the study of the properties of codes (systems for converting information from one

form to another) and their ﬁtness for a speciﬁc application. Codes are used for data compression, cryptography, error

detection and correction, and more recently also for network coding. Codes are studied for the purpose of designing

eﬃcient and reliable data transmission methods.

Algorithms and data structures

Algorithms and data structures is the study of commonly used computational methods and their computational eﬃciency.

Programming language theory

Main article: Programming language theory

Programming language theory is a branch of computer science that deals with the design, implementation, analysis,

characterization, and classiﬁcation of programming languages and their individual features. It falls within the discipline of computer science, both depending on and aﬀecting mathematics, software engineering and linguistics. It is

an active research area, with numerous dedicated academic journals.

Formal methods

Main article: Formal methods

Formal methods are a particular kind of mathematically based technique for the speciﬁcation, development and

veriﬁcation of software and hardware systems. The use of formal methods for software and hardware design is

motivated by the expectation that, as in other engineering disciplines, performing appropriate mathematical analysis

can contribute to the reliability and robustness of a design. They form an important theoretical underpinning for

8

CHAPTER 1. COMPUTER SCIENCE

software engineering, especially where safety or security is involved. Formal methods are a useful adjunct to software

testing since they help avoid errors and can also give a framework for testing. For industrial use, tool support is

required. However, the high cost of using formal methods means that they are usually only used in the development

of high-integrity and life-critical systems, where safety or security is of utmost importance. Formal methods are best

described as the application of a fairly broad variety of theoretical computer science fundamentals, in particular logic

calculi, formal languages, automata theory, and program semantics, but also type systems and algebraic data types to

problems in software and hardware speciﬁcation and veriﬁcation.

1.3.2

Applied computer science

Applied computer science aims at identifying certain computer science concepts that can be used directly in solving

real world problems.

Artiﬁcial intelligence

Main article: Artiﬁcial intelligence

This branch of computer science aims to or is required to synthesise goal-orientated processes such as problemsolving, decision-making, environmental adaptation, learning and communication found in humans and animals.

From its origins in cybernetics and in the Dartmouth Conference (1956), artiﬁcial intelligence (AI) research has been

necessarily cross-disciplinary, drawing on areas of expertise such as applied mathematics, symbolic logic, semiotics,

electrical engineering, philosophy of mind, neurophysiology, and social intelligence. AI is associated in the popular

mind with robotic development, but the main ﬁeld of practical application has been as an embedded component in

areas of software development, which require computational understanding and modeling such as ﬁnance and economics, data mining and the physical sciences. The starting-point in the late 1940s was Alan Turing's question “Can

computers think?", and the question remains eﬀectively unanswered although the "Turing test" is still used to assess

computer output on the scale of human intelligence. But the automation of evaluative and predictive tasks has been

increasingly successful as a substitute for human monitoring and intervention in domains of computer application

involving complex real-world data.

Computer architecture and engineering

Main articles: Computer architecture and Computer engineering

Computer architecture, or digital computer organization, is the conceptual design and fundamental operational structure of a computer system. It focuses largely on the way by which the central processing unit performs internally and

accesses addresses in memory.[43] The ﬁeld often involves disciplines of computer engineering and electrical engineering, selecting and interconnecting hardware components to create computers that meet functional, performance,

and cost goals.

Computer performance analysis

Main article: Computer performance

Computer performance analysis is the study of work ﬂowing through computers with the general goals of improving

throughput, controlling response time, using resources eﬃciently, eliminating bottlenecks, and predicting performance under anticipated peak loads.[44]

Computer graphics and visualization

Main article: Computer graphics (computer science)

1.3. AREAS OF COMPUTER SCIENCE

9

Computer graphics is the study of digital visual contents, and involves synthese and manipulations of image data.

The study is connected to many other ﬁelds in computer science, including computer vision, image processing, and

computational geometry, and is heavily applied in the ﬁelds of special eﬀects and video games.

Computer security and cryptography

Main articles: Computer security and Cryptography

Computer security is a branch of computer technology, whose objective includes protection of information from

unauthorized access, disruption, or modiﬁcation while maintaining the accessibility and usability of the system for

its intended users. Cryptography is the practice and study of hiding (encryption) and therefore deciphering (decryption) information. Modern cryptography is largely related to computer science, for many encryption and decryption

algorithms are based on their computational complexity.

Computational science

Computational science (or scientiﬁc computing) is the ﬁeld of study concerned with constructing mathematical models

and quantitative analysis techniques and using computers to analyze and solve scientiﬁc problems. In practical use, it

is typically the application of computer simulation and other forms of computation to problems in various scientiﬁc

disciplines.

Computer networks

Main article: Computer network

This branch of computer science aims to manage networks between computers worldwide.

Concurrent, parallel and distributed systems

Main articles: Concurrency (computer science) and Distributed computing

Concurrency is a property of systems in which several computations are executing simultaneously, and potentially

interacting with each other. A number of mathematical models have been developed for general concurrent computation including Petri nets, process calculi and the Parallel Random Access Machine model. A distributed system

extends the idea of concurrency onto multiple computers connected through a network. Computers within the same

distributed system have their own private memory, and information is often exchanged among themselves to achieve

a common goal.

Databases

Main article: Database

A database is intended to organize, store, and retrieve large amounts of data easily. Digital databases are managed

using database management systems to store, create, maintain, and search data, through database models and query

languages.

Health informatics

Main article: Health informatics

Health Informatics in computer science deals with computational techniques for solving problems in health care.

10

CHAPTER 1. COMPUTER SCIENCE

Information science

Main article: Information science

Software engineering

Main article: Software engineering

Software engineering is the study of designing, implementing, and modifying software in order to ensure it is of

high quality, aﬀordable, maintainable, and fast to build. It is a systematic approach to software design, involving the

application of engineering practices to software. Software engineering deals with the organizing and analyzing of

software— it doesn't just deal with the creation or manufacture of new software, but its internal maintenance and

arrangement. Both computer applications software engineers and computer systems software engineers are projected

to be among the fastest growing occupations from 2008 and 2018.

See also: Computer programming

1.4 The great insights of computer science

The philosopher of computing Bill Rapaport noted three Great Insights of Computer Science[45]

• Leibniz's, Boole's, Alan Turing's, Shannon's, & Morse's insight: There are only two objects that a computer

has to deal with in order to represent “anything”

All the information about any computable problem can be represented using only 0 and 1 (or any other

bistable pair that can ﬂip-ﬂop between two easily distinguishable states, such as “on"/"oﬀ”, “magnetized/demagnetized”, “high-voltage/low-voltage”, etc.).

See also: Digital physics

• Alan Turing's insight: There are only ﬁve actions that a computer has to perform in order to do “anything”

Every algorithm can be expressed in a language for a computer consisting of only ﬁve basic instructions:

* move left one location

* move right one location

* read symbol at current location

* print 0 at current location

* print 1 at current location

See also: Turing machine

• Böhm and Jacopini's insight: There are only three ways of combining these actions (into more complex ones)

that are needed in order for a computer to do “anything”

Only three rules are needed to combine any set of basic instructions into more complex ones:

sequence:

ﬁrst do this; then do that

selection :

1.5. ACADEMIA

11

IF such-and-such is the case,

THEN do this

ELSE do that

repetition:

WHILE such-and-such is the case DO this

Note that the three rules of Boehm’s and Jacopini’s insight can be further simpliﬁed with the use of goto (which

means it is more elementary than structured programming).

See also: Elementary function arithmetic § Friedman’s grand conjecture

1.5 Academia

1.5.1

Conferences

Further information: List of computer science conferences

Conferences are strategic events of the Academic Research in computer science. During those conferences, researchers from the public and private sectors present their recent work and meet. Proceedings of these conferences

are an important part of the computer science literature.

1.5.2

Journals

Further information: Category:Computer science journals

1.6 Education

Academic curricula in computer science include the following areas of study:

1. Structured and Object-oriented programming[46]

2.

3. Data structures[47]

4.

5. Analysis of Algorithms[48]

6.

7. Formal languages[49] and compiler construction[50]

8.

9. Computer Graphics Algorithms[51]

10.

11. Numerical Methods,[52] Optimization and Statistics[53]

12.

13. Artiﬁcial Intelligence[54] and Machine Learning[55]

12

CHAPTER 1. COMPUTER SCIENCE

14.

Some universities teach computer science as a theoretical study of computation and algorithmic reasoning. These

programs often feature the theory of computation, analysis of algorithms, formal methods, concurrency theory,

databases, computer graphics, and systems analysis, among others. They typically also teach computer programming, but treat it as a vessel for the support of other ﬁelds of computer science rather than a central focus of high-level

study. The ACM/IEEE-CS Joint Curriculum Task Force “Computing Curriculum 2005” (and 2008 update)[56] gives

a guideline for university curriculum.

Other colleges and universities, as well as secondary schools and vocational programs that teach computer science,

emphasize the practice of advanced programming rather than the theory of algorithms and computation in their

computer science curricula. Such curricula tend to focus on those skills that are important to workers entering the

software industry. The process aspects of computer programming are often referred to as software engineering.

While computer science professions increasingly drive the U.S. economy, computer science education is absent in

most American K-12 curricula. A report entitled “Running on Empty: The Failure to Teach K-12 Computer Science

in the Digital Age” was released in October 2010 by Association for Computing Machinery (ACM) and Computer

Science Teachers Association (CSTA), and revealed that only 14 states have adopted signiﬁcant education standards

for high school computer science. The report also found that only nine states count high school computer science

courses as a core academic subject in their graduation requirements. In tandem with “Running on Empty”, a new

non-partisan advocacy coalition - Computing in the Core (CinC) - was founded to inﬂuence federal and state policy,

such as the Computer Science Education Act, which calls for grants to states to develop plans for improving computer

science education and supporting computer science teachers.

Within the United States a gender gap in computer science education has been observed as well. Research conducted

by the WGBH Educational Foundation and the Association for Computing Machinery (ACM) revealed that more

than twice as many high school boys considered computer science to be a “very good” or “good” college major

than high school girls.[57] In addition, the high school Advanced Placement (AP) exam for computer science has

displayed a disparity in gender. Compared to other AP subjects it has the lowest number of female participants,

with a composition of about 15 percent women.[58] This gender gap in computer science is further witnessed at the

college level, where 31 percent of undergraduate computer science degrees are earned by women and only 8 percent

of computer science faculty consists of women.[59] According to an article published by the Epistemic Games Group

in August 2012, the number of women graduates in the computer science ﬁeld has declined to 13 percent.[60]

A 2014 Mother Jones article, “We Can Code It”, advocates for adding computer literacy and coding to the K-12

curriculum in the United States, and notes that computer science is not incorporated into the requirements for the

Common Core State Standards Initiative.[61] In fact, there has been a trend in the direction of removing advanced

placement tests and classes in American schools.[62][63]

1.7 See also

Main article: Outline of computer science

• Academic genealogy of computer scientists

• Informatics (academic ﬁeld)

• List of academic computer science departments

• List of computer science conferences

• List of computer scientists

• List of publications in computer science

• List of pioneers in computer science

• Technology transfer in computer science

• List of software engineering topics

1.8. NOTES

13

• List of unsolved problems in computer science

• Turing Award

• Women in computing

Computer science – Wikipedia book

1.8 Notes

[1] In 1851

[2] “The introduction of punched cards into the new engine was important not only as a more convenient form of control than

the drums, or because programs could now be of unlimited extent, and could be stored and repeated without the danger of

introducing errors in setting the machine by hand; it was important also because it served to crystallize Babbage’s feeling

that he had invented something really new, something much more than a sophisticated calculating machine.” Bruce Collier,

1970

[3] See the entry "Computer science" on Wikiquote for the history of this quotation.

1.9 References

[1] “What is Computer Science?" (PDF). Boston University Department of Computer Science. Spring 2003. Retrieved December 12, 2014.

[2] “WordNet Search - 3.1”. Wordnetweb.princeton.edu. Retrieved 2012-05-14.

[3] “Blaise Pascal”. School of Mathematics and Statistics University of St Andrews, Scotland.

[4] “A Brief History of Computing”.

[5] “Science Museum - Introduction to Babbage”. Archived from the original on 2006-09-08. Retrieved 2006-09-24.

[6] Anthony Hyman (1982). Charles Babbage, pioneer of the computer.

[7] “A Selection and Adaptation From Ada’s Notes found in Ada, The Enchantress of Numbers,” by Betty Alexandra Toole

Ed.D. Strawberry Press, Mill Valley, CA”. Retrieved 2006-05-04.

[8] “In this sense Aiken needed IBM, whose technology included the use of punched cards, the accumulation of numerical

data, and the transfer of numerical data from one register to another”, Bernard Cohen, p.44 (2000)

[9] Brian Randell, p. 187, 1975

[10] The Association for Computing Machinery (ACM) was founded in 1947.

[11] Denning, P.J. (2000). “Computer Science: The Discipline” (PDF). Encyclopedia of Computer Science. Archived from the

original (PDF) on 2006-05-25.

[12] “Some EDSAC statistics”. Cl.cam.ac.uk. Retrieved 2011-11-19.

[13] “Computer science pioneer Samuel D. Conte dies at 85”. Purdue Computer Science. July 1, 2002. Retrieved December

12, 2014.

[14] Levy, Steven (1984). Hackers: Heroes of the Computer Revolution. Doubleday. ISBN 0-385-19195-2.

[15] Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. Taylor and Francis / CRC Press.

[16] “IBM 704 Electronic Data Processing System - CHM Revolution”. Computerhistory.org. Retrieved 2013-07-07.

[17] “IBM 709: a powerful new data processing system” (PDF). Computer History Museum. Retrieved December 12, 2014.

[18] David Kahn, The Codebreakers, 1967, ISBN 0-684-83130-9.

[19] http://www.cis.cornell.edu/Dean/Presentations/Slides/bgu.pdf[]

[20] Constable, R. L. (March 2000). “Computer Science: Achievements and Challenges circa 2000” (PDF).

14

CHAPTER 1. COMPUTER SCIENCE

[21] Abelson, H.; G.J. Sussman with J. Sussman (1996). Structure and Interpretation of Computer Programs (2nd ed.). MIT

Press. ISBN 0-262-01153-0. The computer revolution is a revolution in the way we think and in the way we express what

we think. The essence of this change is the emergence of what might best be called procedural epistemology — the study

of the structure of knowledge from an imperative point of view, as opposed to the more declarative point of view taken by

classical mathematical subjects.

[22] “Black box traders are on the march”. The Telegraph. August 26, 2006.

[23] “The Impact of High Frequency Trading on an Electronic Market”. Papers.ssrn.com. doi:10.2139/ssrn.1686004. Retrieved

2012-05-14.

[24] Wegner, P. (October 13–15, 1976). Proceedings of the 2nd international Conference on Software Engineering. San Francisco, California, United States: IEEE Computer Society Press, Los Alamitos, CA.

[25] Denning, P. J.; Comer, D. E.; Gries, D.; Mulder, M. C.; Tucker, A.; Turner, A. J.; Young, P. R. (Jan 1989). “Computing

as a discipline”. Communications of the ACM 32: 9–23. doi:10.1145/63238.63239.

[26] Eden, A. H. (2007). “Three Paradigms of Computer Science” (PDF). Minds and Machines 17 (2): 135–167. doi:10.1007/s11023007-9060-8.

[27] Louis Fine (1959). “The Role of the University in Computers, Data Processing, and Related Fields”. Communications of

the ACM 2 (9): 7–14. doi:10.1145/368424.368427.

[28] “Stanford University Oral History”. Stanford University. Retrieved May 30, 2013.

[29] Donald Knuth (1972). “George Forsythe and the Development of Computer Science”. Comms. ACM.

[30] Matti Tedre (2006). “The Development of Computer Science: A Sociocultural Perspective” (PDF). p. 260. Retrieved

December 12, 2014.

[31] Peter Naur (1966). “The science of datalogy”. Communications of the ACM 9 (7): 485. doi:10.1145/365719.366510.

[32] “Communications of the ACM”. Communications of the ACM 1 (4): 6.

[33] Communications of the ACM 2(1):p.4

[34] IEEE Computer 28(12):p.136

[35] P. Mounier-Kuhn, L'Informatique en France, de la seconde guerre mondiale au Plan Calcul. L'émergence d'une science,

Paris, PUPS, 2010, ch. 3 & 4.

[36] Tedre, M. (2011). “Computing as a Science: A Survey of Competing Viewpoints”. Minds and Machines 21 (3): 361–387.

doi:10.1007/s11023-011-9240-4.

[37] Parnas, D. L. (1998). “Software engineering programmes are not computer science programmes”. Annals of Software

Engineering 6: 19–37. doi:10.1023/A:1018949113292., p. 19: “Rather than treat software engineering as a subﬁeld of

computer science, I treat it as an element of the set, Civil Engineering, Mechanical Engineering, Chemical Engineering,

Electrical Engineering, [...]"

[38] Computing Sciences Accreditation Board (May 28, 1997). “Computer Science as a Profession”. Archived from the original

on 2008-06-17. Retrieved 2010-05-23.

[39] Committee on the Fundamentals of Computer Science: Challenges and Opportunities, National Research Council (2004).

Computer Science: Reﬂections on the Field, Reﬂections from the Field. National Academies Press. ISBN 978-0-309-093019.

[40] “CSAB Leading Computer Education”. CSAB. 2011-08-03. Retrieved 2011-11-19.

[41] Clay Mathematics Institute P=NP

[42] P. Collins, Graham (October 14, 2002). “Claude E. Shannon: Founder of Information Theory”. Scientiﬁc American.

Retrieved December 12, 2014.

[43] A. Thisted, Ronald (April 7, 1997). “Computer Architecture” (PDF). The University of Chicago.

[44] Wescott, Bob (2013). The Every Computer Performance Book, Chapter 3: Useful laws. CreateSpace. ISBN 1482657759.

[45] “What Is Computation?". buﬀalo.edu.

[46] Booch, Grady (1997). Object-Oriented Analysis and Design with Applications. Addison-Wesley.

1.10. FURTHER READING

15

[47] Peter Brass. (2008) Advanced Data Structures, Cambridge University Press

[48] Cormen, Thomas H.; Leiserson, Charles E.; Rivest, Ronald L. & Stein, Cliﬀord. (2001) Introduction to Algorithms, MIT

Press and McGraw-Hill.

[49] Hopcroft, John E. and Jeﬀrey D. Ullman, (1979) Introduction to Automata Theory, Languages, and Computation

[50] Aho, Alfred V., Sethi, Ravi, and Ullman, Jeﬀrey D. (1988). Compilers — Principles, Techniques, and Tools. AddisonWesley.

[51] Shirley, Peter. (2009) Fundamentals of Computer Graphics - 3rd edition

[52] Press, William H., Saul A. Teukolsky, William T. Vetterling, Brian P. Flannery. (2007) Numerical Recipes 3rd Edition:

The Art of Scientiﬁc Computing

[53] Baron, Michael. (2006) Probability and Statistics for Computer Scientists

[54] Russell, Stuart. (2009) Artiﬁcial Intelligence: A Modern Approach (3rd Edition)

[55] Mitchell, Tom. (1997) Machine Learning.

[56] “ACM Curricula Recommendations”. Retrieved 2012-11-18.

[57] “New Image for Computing Report on Market Research” (PDF). WGBH Educational Foundation and the Association for

Computing Machinery (ACM). April 2009. Retrieved December 12, 2014.

[58] Gilbert, Alorie. “Newsmaker: Computer science’s gender gap”. CNET News.

[59] Dovzan, Nicole. “Examining the Gender Gap in Technology”. University of Michigan.

[60] “Encouraging the next generation of women in computing”. Microsoft Research Connections Team. Retrieved September

3, 2013.

[61] Raja, Tasneem (August 2014). “Is Coding the New Literacy?". Mother Jones. Retrieved 2014-06-21.

[62] http://arstechnica.com/business/2014/12/to-address-techs-diversity-woes-start-with-the-vanishing-comp-sci-classroom/

Casey Johnston. Ars Technica. Dec 4 2014.

[63] http://apcentral.collegeboard.com/apc/members/exam/exam_information/1999.html

“Computer Software Engineer”. U.S. Bureau of Labor Statistics. U.S. Bureau of Labor Statistics, n.d. Web. February

5, 2013.

1.10 Further reading

Overview

• Tucker, Allen B. (2004). Computer Science Handbook (2nd ed.). Chapman and Hall/CRC. ISBN 1-58488360-X.

• “Within more than 70 chapters, every one new or signiﬁcantly revised, one can ﬁnd any kind of information and references about computer science one can imagine. [...] all in all, there is absolute nothing

about Computer Science that can not be found in the 2.5 kilogram-encyclopaedia with its 110 survey

articles [...].” (Christoph Meinel, Zentralblatt MATH)

• van Leeuwen, Jan (1994). Handbook of Theoretical Computer Science. The MIT Press. ISBN 0-262-72020-5.

• "[...] this set is the most unique and possibly the most useful to the [theoretical computer science] community, in support both of teaching and research [...]. The books can be used by anyone wanting simply

to gain an understanding of one of these areas, or by someone desiring to be in research in a topic, or by

instructors wishing to ﬁnd timely information on a subject they are teaching outside their major areas of

expertise.” (Rocky Ross, SIGACT News)

• Ralston, Anthony; Reilly, Edwin D.; Hemmendinger, David (2000). Encyclopedia of Computer Science (4th

ed.). Grove’s Dictionaries. ISBN 1-56159-248-X.

16

CHAPTER 1. COMPUTER SCIENCE

• “Since 1976, this has been the deﬁnitive reference work on computer, computing, and computer science. [...] Alphabetically arranged and classiﬁed into broad subject areas, the entries cover hardware,

computer systems, information and data, software, the mathematics of computing, theory of computation, methodologies, applications, and computing milieu. The editors have done a commendable job of

blending historical perspective and practical reference information. The encyclopedia remains essential

for most public and academic library reference collections.” (Joe Accardin, Northeastern Illinois Univ.,

Chicago)

• Edwin D. Reilly (2003). Milestones in Computer Science and Information Technology. Greenwood Publishing

Group. ISBN 978-1-57356-521-9.

Selected papers

• Knuth, Donald E. (1996). Selected Papers on Computer Science. CSLI Publications, Cambridge University

Press.

• Collier, Bruce. The little engine that could've: The calculating machines of Charles Babbage. Garland Publishing Inc. ISBN 0-8240-0043-9.

• Cohen, Bernard (2000). Howard Aiken, Portrait of a computer pioneer. The MIT press. ISBN 978-0-26253179-5.

• Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. CRC Press, Taylor & Francis.

• Randell, Brian (1973). The origins of Digital computers, Selected Papers. Springer-Verlag. ISBN 3-540-06169X.

• “Covering a period from 1966 to 1993, its interest lies not only in the content of each of these papers

— still timely today — but also in their being put together so that ideas expressed at diﬀerent times

complement each other nicely.” (N. Bernard, Zentralblatt MATH)

Articles

• Peter J. Denning. Is computer science science?, Communications of the ACM, April 2005.

• Peter J. Denning, Great principles in computing curricula, Technical Symposium on Computer Science Education, 2004.

• Research evaluation for computer science, Informatics Europe report. Shorter journal version: Bertrand

Meyer, Christine Choppy, Jan van Leeuwen and Jorgen Staunstrup, Research evaluation for computer science,

in Communications of the ACM, vol. 52, no. 4, pp. 31–34, April 2009.

Curriculum and classiﬁcation

• Association for Computing Machinery. 1998 ACM Computing Classiﬁcation System. 1998.

• Joint Task Force of Association for Computing Machinery (ACM), Association for Information Systems (AIS)

and IEEE Computer Society (IEEE-CS). Computing Curricula 2005: The Overview Report. September 30,

2005.

• Norman Gibbs, Allen Tucker. “A model curriculum for a liberal arts degree in computer science”. Communications of the ACM, Volume 29 Issue 3, March 1986.

1.11 External links

• Computer science at DMOZ

• Scholarly Societies in Computer Science

• Best Papers Awards in Computer Science since 1996

1.11. EXTERNAL LINKS

17

• Photographs of computer scientists by Bertrand Meyer

• EECS.berkeley.edu

Bibliography and academic search engines

• CiteSeerx (article): search engine, digital library and repository for scientiﬁc and academic papers with a focus

on computer and information science.

• DBLP Computer Science Bibliography (article): computer science bibliography website hosted at Universität

Trier, in Germany.

• The Collection of Computer Science Bibliographies (article)

Professional organizations

• Association for Computing Machinery

• IEEE Computer Society

• Informatics Europe

• AAAI

• AAAS Computer Science

Misc

• Computer Science - Stack Exchange: a community-run question-and-answer site for computer science

• What is computer science

• Is computer science science?

Chapter 2

Discrete mathematics

For the mathematics journal, see Discrete Mathematics (journal).

Discrete mathematics is the study of mathematical structures that are fundamentally discrete rather than continuous.

6

5

4

1

2

3

Graphs like this are among the objects studied by discrete mathematics, for their interesting mathematical properties, their usefulness

as models of real-world problems, and their importance in developing computer algorithms.

In contrast to real numbers that have the property of varying “smoothly”, the objects studied in discrete mathematics

– such as integers, graphs, and statements in logic[1] – do not vary smoothly in this way, but have distinct, separated

values.[2] Discrete mathematics therefore excludes topics in “continuous mathematics” such as calculus and analysis.

Discrete objects can often be enumerated by integers. More formally, discrete mathematics has been characterized

as the branch of mathematics dealing with countable sets[3] (sets that have the same cardinality as subsets of the

natural numbers, including rational numbers but not real numbers). However, there is no exact deﬁnition of the term

“discrete mathematics.”[4] Indeed, discrete mathematics is described less by what is included than by what is excluded:

continuously varying quantities and related notions.

The set of objects studied in discrete mathematics can be ﬁnite or inﬁnite. The term ﬁnite mathematics is sometimes

applied to parts of the ﬁeld of discrete mathematics that deals with ﬁnite sets, particularly those areas relevant to

business.

18

2.1. GRAND CHALLENGES, PAST AND PRESENT

19

Research in discrete mathematics increased in the latter half of the twentieth century partly due to the development

of digital computers which operate in discrete steps and store data in discrete bits. Concepts and notations from

discrete mathematics are useful in studying and describing objects and problems in branches of computer science,

such as computer algorithms, programming languages, cryptography, automated theorem proving, and software development. Conversely, computer implementations are signiﬁcant in applying ideas from discrete mathematics to

real-world problems, such as in operations research.

Although the main objects of study in discrete mathematics are discrete objects, analytic methods from continuous

mathematics are often employed as well.

In the university curricula, “Discrete Mathematics” appeared in the 1980s, initially as a computer science support

course; its contents was somewhat haphazard at the time. The curriculum has thereafter developed in conjunction to

eﬀorts by ACM and MAA into a course that is basically intended to develop mathematical maturity in freshmen; as

such it is nowadays a prerequisite for mathematics majors in some universities as well.[5][6] Some high-school-level

discrete mathematics textbooks have appeared as well.[7] At this level, discrete mathematics it is sometimes seen a

preparatory course, not unlike precalculus in this respect.[8]

The Fulkerson Prize is awarded for outstanding papers in discrete mathematics.

2.1 Grand challenges, past and present

The history of discrete mathematics has involved a number of challenging problems which have focused attention

within areas of the ﬁeld. In graph theory, much research was motivated by attempts to prove the four color theorem,

ﬁrst stated in 1852, but not proved until 1976 (by Kenneth Appel and Wolfgang Haken, using substantial computer

assistance).[9]

In logic, the second problem on David Hilbert's list of open problems presented in 1900 was to prove that the axioms

of arithmetic are consistent. Gödel’s second incompleteness theorem, proved in 1931, showed that this was not

possible – at least not within arithmetic itself. Hilbert’s tenth problem was to determine whether a given polynomial

Diophantine equation with integer coeﬃcients has an integer solution. In 1970, Yuri Matiyasevich proved that this

could not be done.

The need to break German codes in World War II led to advances in cryptography and theoretical computer science,

with the ﬁrst programmable digital electronic computer being developed at England’s Bletchley Park with the guidance of Alan Turing and his seminal work, On Computable Numbers.[10] At the same time, military requirements

motivated advances in operations research. The Cold War meant that cryptography remained important, with fundamental advances such as public-key cryptography being developed in the following decades. Operations research

remained important as a tool in business and project management, with the critical path method being developed

in the 1950s. The telecommunication industry has also motivated advances in discrete mathematics, particularly

in graph theory and information theory. Formal veriﬁcation of statements in logic has been necessary for software

development of safety-critical systems, and advances in automated theorem proving have been driven by this need.

Computational geometry has been an important part of the computer graphics incorporated into modern video games

and computer-aided design tools.

Several ﬁelds of discrete mathematics, particularly theoretical computer science, graph theory, and combinatorics,

are important in addressing the challenging bioinformatics problems associated with understanding the tree of life.[11]

Currently, one of the most famous open problems in theoretical computer science is the P = NP problem, which

involves the relationship between the complexity classes P and NP. The Clay Mathematics Institute has oﬀered a $1

million USD prize for the ﬁrst correct proof, along with prizes for six other mathematical problems.[12]

2.2 Topics in discrete mathematics

2.2.1

Theoretical computer science

Main article: Theoretical computer science

Theoretical computer science includes areas of discrete mathematics relevant to computing. It draws heavily on

graph theory and mathematical logic. Included within theoretical computer science is the study of algorithms for

computing mathematical results. Computability studies what can be computed in principle, and has close ties to logic,

20

CHAPTER 2. DISCRETE MATHEMATICS

Much research in graph theory was motivated by attempts to prove that all maps, like this one, could be colored using only four colors

so that no areas of the same color touched. Kenneth Appel and Wolfgang Haken proved this in 1976.[9]

while complexity studies the time taken by computations. Automata theory and formal language theory are closely

related to computability. Petri nets and process algebras are used to model computer systems, and methods from

discrete mathematics are used in analyzing VLSI electronic circuits. Computational geometry applies algorithms

to geometrical problems, while computer image analysis applies them to representations of images. Theoretical

computer science also includes the study of various continuous computational topics.

2.2. TOPICS IN DISCRETE MATHEMATICS

21

Complexity studies the time taken by algorithms, such as this sorting routine.

2.2.2

Information theory

Main article: Information theory

Information theory involves the quantiﬁcation of information. Closely related is coding theory which is used to design

eﬃcient and reliable data transmission and storage methods. Information theory also includes continuous topics such

as: analog signals, analog coding, analog encryption.

2.2.3

Logic

Main article: Mathematical logic

Logic is the study of the principles of valid reasoning and inference, as well as of consistency, soundness, and

completeness. For example, in most systems of logic (but not in intuitionistic logic) Peirce’s law (((P→Q)→P)→P)

is a theorem. For classical logic, it can be easily veriﬁed with a truth table. The study of mathematical proof is particularly important in logic, and has applications to automated theorem proving and formal veriﬁcation of software.

Logical formulas are discrete structures, as are proofs, which form ﬁnite trees[13] or, more generally, directed acyclic

graph structures[14][15] (with each inference step combining one or more premise branches to give a single conclusion).

The truth values of logical formulas usually form a ﬁnite set, generally restricted to two values: true and false, but

logic can also be continuous-valued, e.g., fuzzy logic. Concepts such as inﬁnite proof trees or inﬁnite derivation trees

have also been studied,[16] e.g. inﬁnitary logic.

2.2.4

Set theory

Main article: Set theory

22

CHAPTER 2. DISCRETE MATHEMATICS

101

110

110

110

111

110

110

110

110

0111

1001

1011

1001

0000

0101

0100

1001

0001

Wikipedia

The ASCII codes for the word “Wikipedia”, given here in binary, provide a way of representing the word in information theory, as

well as for information-processing algorithms.

Set theory is the branch of mathematics that studies sets, which are collections of objects, such as {blue, white, red}

or the (inﬁnite) set of all prime numbers. Partially ordered sets and sets with other relations have applications in

several areas.

In discrete mathematics, countable sets (including ﬁnite sets) are the main focus. The beginning of set theory as a

branch of mathematics is usually marked by Georg Cantor's work distinguishing between diﬀerent kinds of inﬁnite

set, motivated by the study of trigonometric series, and further development of the theory of inﬁnite sets is outside

the scope of discrete mathematics. Indeed, contemporary work in descriptive set theory makes extensive use of

traditional continuous mathematics.

2.2. TOPICS IN DISCRETE MATHEMATICS

2.2.5

23

Combinatorics

Main article: Combinatorics

Combinatorics studies the way in which discrete structures can be combined or arranged. Enumerative combinatorics

concentrates on counting the number of certain combinatorial objects - e.g. the twelvefold way provides a uniﬁed

framework for counting permutations, combinations and partitions. Analytic combinatorics concerns the enumeration (i.e., determining the number) of combinatorial structures using tools from complex analysis and probability

theory. In contrast with enumerative combinatorics which uses explicit combinatorial formulae and generating functions to describe the results, analytic combinatorics aims at obtaining asymptotic formulae. Design theory is a study

of combinatorial designs, which are collections of subsets with certain intersection properties. Partition theory studies

various enumeration and asymptotic problems related to integer partitions, and is closely related to q-series, special

functions and orthogonal polynomials. Originally a part of number theory and analysis, partition theory is now considered a part of combinatorics or an independent ﬁeld. Order theory is the study of partially ordered sets, both ﬁnite

and inﬁnite.

2.2.6

Graph theory

Main article: Graph theory

Graph theory, the study of graphs and networks, is often considered part of combinatorics, but has grown large enough

Graph theory has close links to group theory. This truncated tetrahedron graph is related to the alternating group A4 .

and distinct enough, with its own kind of problems, to be regarded as a subject in its own right.[17] Graphs are one of

the prime objects of study in discrete mathematics. They are among the most ubiquitous models of both natural and

human-made structures. They can model many types of relations and process dynamics in physical, biological and

social systems. In computer science, they can represent networks of communication, data organization, computational

devices, the ﬂow of computation, etc. In mathematics, they are useful in geometry and certain parts of topology, e.g.

knot theory. Algebraic graph theory has close links with group theory. There are also continuous graphs, however

24

CHAPTER 2. DISCRETE MATHEMATICS

for the most part research in graph theory falls within the domain of discrete mathematics.

2.2.7

Probability

Main article: Discrete probability theory

Discrete probability theory deals with events that occur in countable sample spaces. For example, count observations

such as the numbers of birds in ﬂocks comprise only natural number values {0, 1, 2, ...}. On the other hand, continuous

observations such as the weights of birds comprise real number values and would typically be modeled by a continuous

probability distribution such as the normal. Discrete probability distributions can be used to approximate continuous

ones and vice versa. For highly constrained situations such as throwing dice or experiments with decks of cards,

calculating the probability of events is basically enumerative combinatorics.

2.2.8

Number theory

The Ulam spiral of numbers, with black pixels showing prime numbers. This diagram hints at patterns in the distribution of prime

numbers.

Main article: Number theory

2.2. TOPICS IN DISCRETE MATHEMATICS

25

Number theory is concerned with the properties of numbers in general, particularly integers. It has applications to

cryptography, cryptanalysis, and cryptology, particularly with regard to modular arithmetic, diophantine equations,

linear and quadratic congruences, prime numbers and primality testing. Other discrete aspects of number theory

include geometry of numbers. In analytic number theory, techniques from continuous mathematics are also used.

Topics that go beyond discrete objects include transcendental numbers, diophantine approximation, p-adic analysis

and function ﬁelds.

2.2.9

Algebra

Main article: Abstract algebra

Algebraic structures occur as both discrete examples and continuous examples. Discrete algebras include: boolean

algebra used in logic gates and programming; relational algebra used in databases; discrete and ﬁnite versions of

groups, rings and ﬁelds are important in algebraic coding theory; discrete semigroups and monoids appear in the

theory of formal languages.

2.2.10

Calculus of ﬁnite diﬀerences, discrete calculus or discrete analysis

Main article: ﬁnite diﬀerence

A function deﬁned on an interval of the integers is usually called a sequence. A sequence could be a ﬁnite sequence

from a data source or an inﬁnite sequence from a discrete dynamical system. Such a discrete function could be deﬁned

explicitly by a list (if its domain is ﬁnite), or by a formula for its general term, or it could be given implicitly by a

recurrence relation or diﬀerence equation. Diﬀerence equations are similar to a diﬀerential equations, but replace

diﬀerentiation by taking the diﬀerence between adjacent terms; they can be used to approximate diﬀerential equations

or (more often) studied in their own right. Many questions and methods concerning diﬀerential equations have

counterparts for diﬀerence equations. For instance where there are integral transforms in harmonic analysis for

studying continuous functions or analog signals, there are discrete transforms for discrete functions or digital signals.

As well as the discrete metric there are more general discrete or ﬁnite metric spaces and ﬁnite topological spaces.

2.2.11

Geometry

Main articles: discrete geometry and computational geometry

Discrete geometry and combinatorial geometry are about combinatorial properties of discrete collections of geometrical objects. A long-standing topic in discrete geometry is tiling of the plane. Computational geometry applies

algorithms to geometrical problems.

2.2.12

Topology

Although topology is the ﬁeld of mathematics that formalizes and generalizes the intuitive notion of “continuous deformation” of objects, it gives rise to many discrete topics; this can be attributed in part to the focus on topological invariants, which themselves usually take discrete values. See combinatorial topology, topological graph theory, topological

combinatorics, computational topology, discrete topological space, ﬁnite topological space, topology (chemistry).

2.2.13

Operations research

Main article: Operations research

Operations research provides techniques for solving practical problems in business and other ﬁelds — problems such

as allocating resources to maximize proﬁt, or scheduling project activities to minimize risk. Operations research

techniques include linear programming and other areas of optimization, queuing theory, scheduling theory, network

26

CHAPTER 2. DISCRETE MATHEMATICS

Computational geometry applies computer algorithms to representations of geometrical objects.

theory. Operations research also includes continuous topics such as continuous-time Markov process, continuoustime martingales, process optimization, and continuous and hybrid control theory.

2.2.14

Game theory, decision theory, utility theory, social choice theory

Decision theory is concerned with identifying the values, uncertainties and other issues relevant in a given decision,

its rationality, and the resulting optimal decision.

Utility theory is about measures of the relative economic satisfaction from, or desirability of, consumption of various

goods and services.

Social choice theory is about voting. A more puzzle-based approach to voting is ballot theory.

Game theory deals with situations where success depends on the choices of others, which makes choosing the best

course of action more complex. There are even continuous games, see diﬀerential game. Topics include auction

theory and fair division.

2.2. TOPICS IN DISCRETE MATHEMATICS

27

PERT charts like this provide a business management technique based on graph theory.

2.2.15

Discretization

Main article: Discretization

Discretization concerns the process of transferring continuous models and equations into discrete counterparts, often

for the purposes of making calculations easier by using approximations. Numerical analysis provides an important

example.

2.2.16

Discrete analogues of continuous mathematics

There are many concepts in continuous mathematics which have discrete versions, such as discrete calculus, discrete

probability distributions, discrete Fourier transforms, discrete geometry, discrete logarithms, discrete diﬀerential

geometry, discrete exterior calculus, discrete Morse theory, diﬀerence equations, discrete dynamical systems, and

discrete vector measures.

In applied mathematics, discrete modelling is the discrete analogue of continuous modelling. In discrete modelling,

discrete formulae are ﬁt to data. A common method in this form of modelling is to use recurrence relation.

In algebraic geometry, the concept of a curve can be extended to discrete geometries by taking the spectra of

polynomial rings over ﬁnite ﬁelds to be models of the aﬃne spaces over that ﬁeld, and letting subvarieties or spectra of

other rings provide the curves that lie in that space. Although the space in which the curves appear has a ﬁnite number

of points, the curves are not so much sets of points as analogues of curves in continuous settings. For example, every

point of the form V (x − c) ⊂ Spec K[x] = A1 for K a ﬁeld can be studied either as Spec K[x]/(x − c) ∼

= Spec K

, a point, or as the spectrum Spec K[x](x−c) of the local ring at (x-c), a point together with a neighborhood around

it. Algebraic varieties also have a well-deﬁned notion of tangent space called the Zariski tangent space, making many

features of calculus applicable even in ﬁnite settings.

2.2.17

Hybrid discrete and continuous mathematics

The time scale calculus is a uniﬁcation of the theory of diﬀerence equations with that of diﬀerential equations, which

has applications to ﬁelds requiring simultaneous modelling of discrete and continuous data. Another way of modeling

such a situation is the notion of hybrid dynamical system.

28

CHAPTER 2. DISCRETE MATHEMATICS

2.3 See also

• Outline of discrete mathematics

• CyberChase, a show that teaches Discrete Mathematics to children

2.4 References

[1] Richard Johnsonbaugh, Discrete Mathematics, Prentice Hall, 2008.

[2] Weisstein, Eric W., “Discrete mathematics”, MathWorld.

[3] Biggs, Norman L. (2002), Discrete mathematics, Oxford Science Publications (2nd ed.), New York: The Clarendon Press

Oxford University Press, p. 89, ISBN 9780198507178, MR 1078626, Discrete Mathematics is the branch of Mathematics

in which we deal with questions involving ﬁnite or countably inﬁnite sets.

[4] Brian Hopkins, Resources for Teaching Discrete Mathematics, Mathematical Association of America, 2008.

[5] Ken Levasseur; Al Doerr. Applied Discrete Structures. p. 8.

[6] Albert Geoﬀrey Howson, ed. (1988). Mathematics as a Service Subject. Cambridge University Press. pp. 77–78. ISBN

978-0-521-35395-3.

[7] Joseph G. Rosenstein. Discrete Mathematics in the Schools. American Mathematical Soc. p. 323. ISBN 978-0-82188578-9.

[8] http://ucsmp.uchicago.edu/secondary/curriculum/precalculus-discrete/

[9] Wilson, Robin (2002). Four Colors Suﬃce. London: Penguin Books. ISBN 978-0-691-11533-7.

[10] Hodges, Andrew. Alan Turing: the enigma. Random House, 1992.

[11] Trevor R. Hodkinson; John A. N. Parnell (2007). Reconstruction the Tree of Life: Taxonomy And Systematics of Large And

Species Rich Taxa. CRC PressINC. p. 97. ISBN 978-0-8493-9579-6.

[12] “Millennium Prize Problems”. 2000-05-24. Retrieved 2008-01-12.

[13] A. S. Troelstra; H. Schwichtenberg (2000-07-27). Basic Proof Theory. Cambridge University Press. p. 186. ISBN

978-0-521-77911-1.

[14] Samuel R. Buss (1998). Handbook of Proof Theory. Elsevier. p. 13. ISBN 978-0-444-89840-1.

[15] Franz Baader; Gerhard Brewka; Thomas Eiter (2001-10-16). KI 2001: Advances in Artiﬁcial Intelligence: Joint German/Austrian Conference on AI, Vienna, Austria, September 19-21, 2001. Proceedings. Springer. p. 325. ISBN 978-3540-42612-7.

[16] Brotherston, J.; Bornat, R.; Calcagno, C. (January 2008). “Cyclic proofs of program termination in separation logic”. ACM

SIGPLAN Notices 43 (1). CiteSeerX: 10.1.1.111.1105.

[17] Graphs on Surfaces, Bojan Mohar and Carsten Thomassen, Johns Hopkins University press, 2001

2.5 Further reading

• Norman L. Biggs (2002-12-19). Discrete Mathematics. Oxford University Press. ISBN 978-0-19-850717-8.

• John Dwyer (2010). An Introduction to Discrete Mathematics for Business & Computing. ISBN 978-1-90793400-1.

• Susanna S. Epp (2010-08-04). Discrete Mathematics With Applications. Thomson Brooks/Cole. ISBN 978-0495-39132-6.

• Ronald Graham, Donald E. Knuth, Oren Patashnik, Concrete Mathematics.

• Ralph P. Grimaldi (2004). Discrete and Combinatorial Mathematics: An Applied Introduction. Addison Wesley.

ISBN 978-0-201-72634-3.

2.6. EXTERNAL LINKS

29

• Donald E. Knuth (2011-03-03). The Art of Computer Programming, Volumes 1-4a Boxed Set. Addison-Wesley

Professional. ISBN 978-0-321-75104-1.

• Jiří Matoušek; Jaroslav Nešetřil (1998). Discrete Mathematics. Oxford University Press. ISBN 978-0-19850208-1.

• Obrenic, Bojana (2003-01-29). Practice Problems in Discrete Mathematics. Prentice Hall. ISBN 978-0-13045803-2.

• Kenneth H. Rosen; John G. Michaels (2000). Hand Book of Discrete and Combinatorial Mathematics. CRC

PressI Llc. ISBN 978-0-8493-0149-0.

• Kenneth H. Rosen (2007). Discrete Mathematics: And Its Applications. McGraw-Hill College. ISBN 978-007-288008-3.

• Andrew Simpson (2002). Discrete Mathematics by Example. McGraw-Hill Incorporated. ISBN 978-0-07709840-7.

• Veerarajan, T.(2007), Discrete mathematics with graph theory and combinatorics, Tata Mcgraw Hill

2.6 External links

• Discrete mathematics at the utk.edu Mathematics Archives, providiing links to syllabi, tutorials, programs, etc.

• Iowa Central: Electrical Technologies Program Discrete mathematics for Electrical engineering.

Chapter 3

Glossary of graph theory

Graph theory is a growing area in mathematical research, and has a large specialized vocabulary. Some authors use

the same word with diﬀerent meanings. Some authors use diﬀerent words to mean the same thing. This page attempts

to describe the majority of current usage.

3.1 Basics

A graph G consists of two types of elements, namely vertices and edges. Every edge has two endpoints in the set of

vertices, and is said to connect or join the two endpoints. An edge can thus be deﬁned as a set of two vertices (or an

ordered pair, in the case of a directed graph - see Section Direction). The two endpoints of an edge are also said to

be adjacent to each other.

Alternative models of graphs exist; e.g., a graph may be thought of as a Boolean binary function over the set of

vertices or as a square (0,1)-matrix.

A vertex is simply drawn as a node or a dot. The vertex set of G is usually denoted by V(G), or V when there is no

danger of confusion. The order of a graph is the number of its vertices, i.e. |V(G)|.

An edge (a set of two elements) is drawn as a line connecting two vertices, called endpoints or end vertices or

endvertices. An edge with endvertices x and y is denoted by xy (without any symbol in between). The edge set of

G is usually denoted by E(G), or E when there is no danger of confusion. An edge xy is called incident to a vertex

when this vertex is one of the endpoints x or y.

The size of a graph is the number of its edges, i.e. |E(G)|.[1]

A loop is an edge whose endpoints are the same vertex. A link has two distinct endvertices. An edge is multiple if

there is another edge with the same endvertices; otherwise it is simple. The multiplicity of an edge is the number

of multiple edges sharing the same end vertices; the multiplicity of a graph, the maximum multiplicity of its edges.

A graph is a simple graph if it has no multiple edges or loops, a multigraph if it has multiple edges, but no loops,

and a multigraph or pseudograph if it contains both multiple edges and loops (the literature is highly inconsistent).

When stated without any qualiﬁcation, a graph is usually assumed to be simple, except in the literature of category

theory, where it refers to a quiver.

Graphs whose edges or vertices have names or labels are known as labeled, those without as unlabeled. Graphs with

labeled vertices only are vertex-labeled, those with labeled edges only are edge-labeled. The diﬀerence between a

labeled and an unlabeled graph is that the latter has no speciﬁc set of vertices or edges; it is regarded as another way

to look upon an isomorphism type of graphs. (Thus, this usage distinguishes between graphs with identiﬁable vertex

or edge sets on the one hand, and isomorphism types or classes of graphs on the other.)

(Graph labeling usually refers to the assignment of labels (usually natural numbers, usually distinct) to the edges and

vertices of a graph, subject to certain rules depending on the situation. This should not be confused with a graph’s

merely having distinct labels or names on the vertices.)

A hyperedge is an edge that is allowed to take on any number of vertices, possibly more than 2. A graph that allows

any hyperedge is called a hypergraph. A simple graph can be considered a special case of the hypergraph, namely

the 2-uniform hypergraph. However, when stated without any qualiﬁcation, an edge is always assumed to consist of

30

3.1. BASICS

31

In this pseudograph the blue edges are loops and the red edges are multiple edges of multiplicity 2 and 3. The multiplicity of the

graph is 3.

at most 2 vertices, and a graph is never confused with a hypergraph.

A non-edge (or anti-edge) is an edge that is not present in the graph. More formally, for two vertices u and v , {u, v}

is a non-edge in a graph G whenever {u, v} is not an edge in G . This means that there is either no edge between the

two vertices or (for directed graphs) at most one of (u, v) and (v, u) from v is an arc in G.

Occasionally the term cotriangle or anti-triangle is used for a set of three vertices none of which are connected.

¯ of a graph G is a graph with the same vertex set as G but with an edge set such that xy is an edge

The complement G

¯ if and only if xy is not an edge in G.

in G

An edgeless graph or empty graph or null graph is a graph with zero or more vertices, but no edges. The empty

graph or null graph may also be the graph with no vertices and no edges. If it is a graph with no edges and any

number n of vertices, it may be called the null graph on n vertices. (There is no consistency at all in the literature.)

A graph is inﬁnite if it has inﬁnitely many vertices or edges or both; otherwise the graph is ﬁnite. An inﬁnite graph

where every vertex has ﬁnite degree is called locally ﬁnite. When stated without any qualiﬁcation, a graph is usually

assumed to be ﬁnite. See also continuous graph.

Two graphs G and H are said to be isomorphic, denoted by G ~ H, if there is a one-to-one correspondence, called

an isomorphism, between the vertices of the graph such that two vertices are adjacent in G if and only if their

32

CHAPTER 3. GLOSSARY OF GRAPH THEORY

6

4

3

5

1

2

A labeled simple graph with vertex set V = {1, 2, 3, 4, 5, 6} and edge set E = {{1,2}, {1,5}, {2,3}, {2,5}, {3,4}, {4,5}, {4,6}}.

corresponding vertices are adjacent in H. Likewise, a graph G is said to be homomorphic to a graph H if there

is a mapping, called a homomorphism, from V(G) to V(H) such that if two vertices are adjacent in G then their

corresponding vertices are adjacent in H.

3.1.1

Subgraphs

A subgraph, H, of a graph, G, is a graph whose vertices are a subset of the vertex set of G, and whose edges are a

subset of the edge set of G. In reverse, a supergraph of a graph G is a graph of which G is a subgraph. A graph, G,

contains a graph, H, if H is a subgraph of, or is isomorphic to G.

A subgraph, H, spans a graph, G, and is a spanning subgraph, or factor of G, if it has the same vertex set as G.

A subgraph, H, of a graph, G, is said to be induced (or full) if, for every pair of vertices x and y of H, xy is an edge

of H if and only if xy is an edge of G. In other words, H is an induced subgraph of G if it has exactly the edges that

appear in G over the same vertex set. If the vertex set of H is the subset S of V(G), then H can be written as G[S]

and is said to be induced by S.

A graph, G, is minimal with some property, P, provided that G has property P and no proper subgraph of G has

property P. In this deﬁnition, the term subgraph is usually understood to mean induced subgraph. The notion of

maximality is deﬁned dually: G is maximal with P provided that P(G) and G has no proper supergraph H such that

P(H).

A graph that does not contain H as an induced subgraph is said to be H-free, and more generally if F is a family of

graphs then the graphs that do not contain any induced subgraph isomorphic to a member of F are called F -free.[2]

For example the triangle-free graphs are the graphs that do not have a triangle graph as an induced subgraph. Many

important classes of graphs can be deﬁned by sets of forbidden subgraphs, the graphs that are not in the class and are

minimal with respect to subgraphs, induced subgraphs, or graph minors.

A universal graph in a class K of graphs is a simple graph in which every element in K can be embedded as a

subgraph.

3.1. BASICS

3.1.2

33

Walks

A walk is a sequence of vertices and edges, where each edge’s endpoints are the preceding and following vertices in

the sequence. A walk is closed if its ﬁrst and last vertices are the same, and open if they are diﬀerent.

The length l of a walk is the number of edges that it uses. For an open walk, l = n–1, where n is the number of

vertices visited (a vertex is counted each time it is visited). For a closed walk, l = n (the start/end vertex is listed

twice, but is not counted twice). In the example labeled simple graph, (1, 2, 5, 1, 2, 3) is an open walk with length 5,

and (4, 5, 2, 1, 5, 4) is a closed walk of length 5.

A trail is a walk in which all the edges are distinct. A closed trail has been called a tour or circuit, but these are not

universal, and the latter is often reserved for a regular subgraph of degree two.

A directed tour. This is not a simple cycle, since the blue vertices are used twice.

Traditionally, a path referred to what is now usually known as an open walk. Nowadays, when stated without any

qualiﬁcation, a path is usually understood to be simple, meaning that no vertices (and thus no edges) are repeated.

(The term chain has also been used to refer to a walk in which all vertices and edges are distinct.) In the example

labeled simple graph, (5, 2, 1) is a path of length 2. The closed equivalent to this type of walk, a walk that starts

and ends at the same vertex but otherwise has no repeated vertices or edges, is called a cycle. Like path, this term

traditionally referred to any closed walk, but now is usually understood to be simple by deﬁnition. In the example

labeled simple graph, (1, 5, 2, 1) is a cycle of length 3. (A cycle, unlike a path, is not allowed to have length 0.) Paths

and cycles of n vertices are often denoted by Pn and Cn, respectively. (Some authors use the length instead of the

number of vertices, however.)

C 1 is a loop, C 2 is a digon (a pair of parallel undirected edges in a multigraph, or a pair of antiparallel edges in a

34

CHAPTER 3. GLOSSARY OF GRAPH THEORY

directed graph), and C 3 is called a triangle.

A cycle that has odd length is an odd cycle; otherwise it is an even cycle. One theorem is that a graph is bipartite if

and only if it contains no odd cycles. (See complete bipartite graph.)

A graph is acyclic if it contains no cycles; unicyclic if it contains exactly one cycle; and pancyclic if it contains cycles

of every possible length (from 3 to the order of the graph).

A wheel graph is a graph with n vertices (n ≥ 4), formed by connecting a single vertex to all vertices of C -₁.

The girth of a graph is the length of a shortest (simple) cycle in the graph; and the circumference, the length of a

longest (simple) cycle. The girth and circumference of an acyclic graph are deﬁned to be inﬁnity ∞.

A path or cycle is Hamiltonian (or spanning) if it uses all vertices exactly once. A graph that contains a Hamiltonian path is traceable; and one that contains a Hamiltonian path for any given pair of (distinct) end vertices is a

Hamiltonian connected graph. A graph that contains a Hamiltonian cycle is a Hamiltonian graph.

A trail or circuit (or cycle) is Eulerian if it uses all edges precisely once. A graph that contains an Eulerian trail is

traversable. A graph that contains an Eulerian circuit is an Eulerian graph.

Two paths are internally disjoint (some people call it independent) if they do not have any vertex in common, except

the ﬁrst and last ones.

A theta graph is the union of three internally disjoint (simple) paths that have the same two distinct end vertices.[3]

A theta0 graph has seven vertices and eight edges that can be drawn as the perimeter and one diameter of a regular

hexagon. (The seventh vertex splits the diameter into two edges.) The smallest, excluding multigraphs, topological

minor of a theta0 graph consists of a square plus one of its diagonals.

3.1.3

Trees

A tree is a connected acyclic simple graph. For directed graphs, each vertex has at most one incoming edge. A vertex

of degree 1 is called a leaf, or pendant vertex. An edge incident to a leaf is a leaf edge, or pendant edge. (Some

people deﬁne a leaf edge as a leaf and then deﬁne a leaf vertex on top of it. These two sets of deﬁnitions are often

used interchangeably.) A non-leaf vertex is an internal vertex. Sometimes, one vertex of the tree is distinguished,

and called the root; in this case, the tree is called rooted. Rooted trees are often treated as directed acyclic graphs

with the edges pointing away from the root.

A subtree of the tree T is a connected subgraph of T.

A forest is an acyclic simple graph. For directed graphs, each vertex has at most one incoming edge. (That is, a tree

with the connectivity requirement removed; a graph containing multiple disconnected trees.)

A subforest of the forest F is a subgraph of F.

A spanning tree is a spanning subgraph that is a tree. Every graph has a spanning forest. But only a connected graph

has a spanning tree.

A special kind of tree called a star is K₁,k. An induced star with 3 edges is a claw.

A caterpillar is a tree in which all non-leaf nodes form a single path.

A k-ary tree is a rooted tree in which every internal vertex has no more than k children. A 1-ary tree is just a path.

A 2-ary tree is also called a binary tree.

3.1.4

Cliques

The complete graph Kn of order n is a simple graph with n vertices in which every vertex is adjacent to every other.

The pentagon-shaped graph to the right is complete. The complete graph on n vertices is often denoted by Kn. It has

n(n−1)/2 edges (corresponding to all possible choices of pairs of vertices).

A clique in a graph is a set of pairwise adjacent vertices. Since any subgraph induced by a clique is a complete

subgraph, the two terms and their notations are usually used interchangeably. A k-clique is a clique of order k. In the

example labeled simple graph above, vertices 1, 2 and 5 form a 3-clique, or a triangle. A maximal clique is a clique

that is not a subset of any other clique (some authors reserve the term clique for maximal cliques).

The clique number ω(G) of a graph G is the order of a largest clique in G.

3.1. BASICS

35

2

1

3

4

5

6

A labeled tree with 6 vertices and 5 edges. Nodes 1, 2, 3, and 6 are leaves, while 4 and 5 are internal vertices.

3.1.5

Strongly connected component

A related but weaker concept is that of a strongly connected component. Informally, a strongly connected component

of a directed graph is a subgraph where all nodes in the subgraph are reachable by all other nodes in the subgraph.

Reachability between nodes is established by the existence of a path between the nodes.

A directed graph can be decomposed into strongly connected components by running the depth-ﬁrst search (DFS)

algorithm twice: ﬁrst, on the graph itself and next on the transpose graph in decreasing order of the ﬁnishing times

of the ﬁrst DFS. Given a directed graph G, the transpose GT is the graph G with all the edge directions reversed.

36

CHAPTER 3. GLOSSARY OF GRAPH THEORY

K5 , a complete graph. If a subgraph looks like this, the vertices in that subgraph form a clique of size 5.

3.1.6

Hypercubes

A hypercube graph Qn is a regular graph with 2n vertices, 2n−1 n edges, and n edges touching each vertex. It can be

obtained as the one-dimensional skeleton of the geometric hypercube.

3.1.7

Knots

A knot in a directed graph is a collection of vertices and edges with the property that every vertex in the knot has

outgoing edges, and all outgoing edges from vertices in the knot terminate at other vertices in the knot. Thus it is

impossible to leave the knot while following the directions of the edges.

3.1.8

Minors

A minor G2 = (V2 , E2 ) of G1 = (V1 , E1 ) is an injection from V2 to V1 such that every edge in E2 corresponds to

a path (disjoint from all other such paths) in G1 such that every vertex in V1 is in one or more paths, or is part of

the injection from V2 to V1 . This can alternatively be phrased in terms of contractions, which are operations which

collapse a path and all vertices on it into a single edge (see Minor (graph theory)).

3.2. ADJACENCY AND DEGREE

3.1.9

37

Embedding

An embedding G2 = (V2 , E2 ) of G1 = (V1 , E1 ) is an injection from V2 to V1 such that every edge in E2 corresponds

to a path in G1 .[4]

3.2 Adjacency and degree

In graph theory, degree, especially that of a vertex, is usually a measure of immediate adjacency.

An edge connects two vertices; these two vertices are said to be incident to that edge, or, equivalently, that edge

incident to those two vertices. All degree-related concepts have to do with adjacency or incidence.

The degree, or valency, dG(v) of a vertex v in a graph G is the number of edges incident to v, with loops being

counted twice. A vertex of degree 0 is an isolated vertex. A vertex of degree 1 is a leaf. In the example labeled

simple graph, vertices 1 and 3 have a degree of 2, vertices 2, 4 and 5 have a degree of 3, and vertex 6 has a degree of

1. If E is ﬁnite, then the total sum of vertex degrees is equal to twice the number of edges.

The total degree of a graph is the sum of the degrees of all its vertices. Thus, for a graph without loops, it is equal to

the number of incidences between vertices and edges. The handshaking lemma states that the total degree is always

equal to two times the number of edges, loops included. This means that for a simple graph with 3 vertices with each

vertex having a degree of two (i.e. a triangle) the total degree would be six (e.g. 3 x 2 = 6).

A degree sequence is a list of degrees of a graph in non-increasing order (e.g. d1 ≥ d2 ≥ … ≥ dn). A sequence of

non-increasing integers is realizable if it is a degree sequence of some graph.

Two vertices u and v are called adjacent if an edge exists between them. We denote this by u ~ v or u ↓ v. In the

above graph, vertices 1 and 2 are adjacent, but vertices 2 and 4 are not. The set of neighbors of v, that is, vertices

adjacent to v not including v itself, forms an induced subgraph called the (open) neighborhood of v and denoted

NG(v). When v is also included, it is called a closed neighborhood and denoted by NG[v]. When stated without

any qualiﬁcation, a neighborhood is assumed to be open. The subscript G is usually dropped when there is no danger

of confusion; the same neighborhood notation may also be used to refer to sets of adjacent vertices rather than the

corresponding induced subgraphs. In the example labeled simple graph, vertex 1 has two neighbors: vertices 2 and

5. For a simple graph, the number of neighbors that a vertex has coincides with its degree.

A dominating set of a graph is a vertex subset whose closed neighborhood includes all vertices of the graph. A vertex

v dominates another vertex u if there is an edge from v to u. A vertex subset V dominates another vertex subset

U if every vertex in U is adjacent to some vertex in V. The minimum size of a dominating set is the domination

number γ(G).

In computers, a ﬁnite, directed or undirected graph (with n vertices, say) is often represented by its adjacency matrix:

an n-by-n matrix whose entry in row i and column j gives the number of edges from the i-th to the j-th vertex.

Spectral graph theory studies relationships between the properties of a graph and its adjacency matrix or other

matrices associated with the graph.

The maximum degree Δ(G) of a graph G is the largest degree over all vertices; the minimum degree δ(G), the

smallest.

A graph in which every vertex has the same degree is regular. It is k-regular if every vertex has degree k. A 0regular graph is an independent set. A 1-regular graph is a matching. A 2-regular graph is a vertex disjoint union of

cycles. A 3-regular graph is said to be cubic, or trivalent.

A k-factor is a k-regular spanning subgraph. A 1-factor is a perfect matching. A partition of edges of a graph into

k-factors is called a k-factorization. A k-factorable graph is a graph that admits a k-factorization.

A graph is biregular if it has unequal maximum and minimum degrees and every vertex has one of those two degrees.

A strongly regular graph is a regular graph such that any adjacent vertices have the same number of common

neighbors as other adjacent pairs and that any nonadjacent vertices have the same number of common neighbors as

other nonadjacent pairs.

38

3.2.1

CHAPTER 3. GLOSSARY OF GRAPH THEORY

Independence

In graph theory, the word independent usually carries the connotation of pairwise disjoint or mutually nonadjacent.

In this sense, independence is a form of immediate nonadjacency. An isolated vertex is a vertex not incident to any

edges. An independent set, or coclique, or stable set, is a set of vertices of which no pair is adjacent. Since the graph

induced by any independent set is an empty graph, the two terms are usually used interchangeably. In the example

labeled simple graph at the top of this page, vertices 1, 3, and 6 form an independent set; and 2 and 4 form another

one.

Two subgraphs are edge disjoint if they have no edges in common. Similarly, two subgraphs are vertex disjoint if

they have no vertices (and thus, also no edges) in common. Unless speciﬁed otherwise, a set of disjoint subgraphs

are assumed to be pairwise vertex disjoint.

The independence number α(G) of a graph G is the size of the largest independent set of G.

A graph can be decomposed into independent sets in the sense that the entire vertex set of the graph can be partitioned

into pairwise disjoint independent subsets. Such independent subsets are called partite sets, or simply parts.

A graph that can be decomposed into two partite sets bipartite; three sets, tripartite; k sets, k-partite; and an

unknown number of sets, multipartite. An 1-partite graph is the same as an independent set, or an empty graph. A

2-partite graph is the same as a bipartite graph. A graph that can be decomposed into k partite sets is also said to be

k-colourable.

A complete multipartite graph is a graph in which vertices are adjacent if and only if they belong to diﬀerent partite

sets. A complete bipartite graph is also referred to as a biclique; if its partite sets contain n and m vertices, respectively,

then the graph is denoted Kn,m.

A k-partite graph is semiregular if each of its partite sets has a uniform degree; equipartite if each partite set has

the same size; and balanced k-partite if each partite set diﬀers in size by at most 1 with any other.

The matching number α′ (G) of a graph G is the size of a largest matching, or pairwise vertex disjoint edges, of G.

A spanning matching, also called a perfect matching is a matching that covers all vertices of a graph.

3.3 Complexity

Complexity of a graph denotes the quantity of information that a graph contained, and can be measured in several

ways. For example, by counting the number of its spanning trees, or the value of a certain formula involving the

number of vertices, edges, and proper paths in a graph. [5]

3.4 Connectivity

Connectivity extends the concept of adjacency and is essentially a form (and measure) of concatenated adjacency.

If it is possible to establish a path from any vertex to any other vertex of a graph, the graph is said to be connected;

otherwise, the graph is disconnected. A graph is totally disconnected if there is no path connecting any pair of

vertices. This is just another name to describe an empty graph or independent set.

A cut vertex, or articulation point, is a vertex whose removal disconnects the remaining subgraph. A cut set, or

vertex cut or separating set, is a set of vertices whose removal disconnects the remaining subgraph. A bridge is an

analogous edge (see below).

If it is always possible to establish a path from any vertex to every other even after removing any k - 1 vertices,

then the graph is said to be k-vertex-connected or k-connected. Note that a graph is k-connected if and only if it

contains k internally disjoint paths between any two vertices. The example labeled simple graph above is connected

(and therefore 1-connected), but not 2-connected. The vertex connectivity or connectivity κ(G) of a graph G is the

minimum number of vertices that need to be removed to disconnect G. The complete graph Kn has connectivity n 1 for n > 1; and a disconnected graph has connectivity 0.

In network theory, a giant component is a connected subgraph that contains a majority of the entire graph’s nodes.

A bridge, or cut edge or isthmus, is an edge whose removal disconnects a graph. (For example, all the edges in a tree

are bridges.) A cut vertex is an analogous vertex (see above). A disconnecting set is a set of edges whose removal

3.5. DISTANCE

39

increases the number of components. An edge cut is the set of all edges which have one vertex in some proper vertex

subset S and the other vertex in V(G)\S. Edges of K 3 form a disconnecting set but not an edge cut. Any two edges

of K 3 form a minimal disconnecting set as well as an edge cut. An edge cut is necessarily a disconnecting set; and a

minimal disconnecting set of a nonempty graph is necessarily an edge cut. A bond is a minimal (but not necessarily

minimum), nonempty set of edges whose removal disconnects a graph.

A graph is k-edge-connected if any subgraph formed by removing any k - 1 edges is still connected. The edge

connectivity κ'(G) of a graph G is the minimum number of edges needed to disconnect G. One well-known result is

that κ(G) ≤ κ'(G) ≤ δ(G).

A component is a maximally connected subgraph. A block is either a maximally 2-connected subgraph, a bridge

(together with its vertices), or an isolated vertex. A biconnected component is a 2-connected component.

An articulation point (also known as a separating vertex) of a graph is a vertex whose removal from the graph

increases its number of connected components. A biconnected component can be deﬁned as a subgraph induced by

a maximal set of nodes that has no separating vertex.

3.5 Distance

The distance dG(u, v) between two (not necessary distinct) vertices u and v in a graph G is the length of a shortest

path (also called a graph geodesic) between them. The subscript G is usually dropped when there is no danger of

confusion. When u and v are identical, their distance is 0. When u and v are unreachable from each other, their

distance is deﬁned to be inﬁnity ∞.

The eccentricity εG(v) of a vertex v in a graph G is the maximum distance from v to any other vertex. The diameter

diam(G) of a graph G is the maximum eccentricity over all vertices in a graph; and the radius rad(G), the minimum.

When there are two components in G, diam(G) and rad(G) deﬁned to be inﬁnity ∞. Trivially, diam(G) ≤ 2 rad(G).

Vertices with maximum eccentricity are called peripheral vertices. Vertices of minimum eccentricity form the

center. A tree has at most two center vertices.

The Wiener index of a vertex v in a graph G, denoted by WG(v) is the sum of distances between v and all others.

The Wiener index of a graph G, denoted by W(G), is the sum of distances over all pairs of vertices. An undirected

graph’s Wiener polynomial is deﬁned to be Σ qd(u,v) over all unordered pairs of vertices u and v. Wiener index and

Wiener polynomial are of particular interest to mathematical chemists.

The k-th power Gk of a graph G is a supergraph formed by adding an edge between all pairs of vertices of G with

distance at most k. A second power of a graph is also called a square.

A k-spanner is a spanning subgraph, S, in which every two vertices are at most k times as far apart on S than on G.

The number k is the dilation. k-spanner is used for studying geometric network optimization.

3.6 Genus

A crossing is a pair of intersecting edges. A graph is embeddable on a surface if its vertices and edges can be

arranged on it without any crossing. The genus of a graph is the lowest genus of any surface on which the graph can

embed.

A planar graph is one which can be drawn on the (Euclidean) plane without any crossing; and a plane graph, one

which is drawn in such fashion. In other words, a planar graph is a graph of genus 0. The example labeled simple

graph is planar; the complete graph on n vertices, for n> 4, is not planar. Also, a tree is necessarily a planar graph.

When a graph is drawn without any crossing, any cycle that surrounds a region without any edges reaching from the

cycle into the region forms a face. Two faces on a plane graph are adjacent if they share a common edge. A dual,

or planar dual when the context needs to be clariﬁed, G* of a plane graph G is a graph whose vertices represent the

faces, including any outerface, of G and are adjacent in G* if and only if their corresponding faces are adjacent in G.

The dual of a planar graph is always a planar pseudograph (e.g. consider the dual of a triangle). In the familiar case

of a 3-connected simple planar graph G (isomorphic to a convex polyhedron P), the dual G* is also a 3-connected

simple planar graph (and isomorphic to the dual polyhedron P * ).

Furthermore, since we can establish a sense of “inside” and “outside” on a plane, we can identify an “outermost”

region that contains the entire graph if the graph does not cover the entire plane. Such outermost region is called

40

CHAPTER 3. GLOSSARY OF GRAPH THEORY

an outer face. An outerplanar graph is one which can be drawn in the planar fashion such that its vertices are all

adjacent to the outer face; and an outerplane graph, one which is drawn in such fashion.

The minimum number of crossings that must appear when a graph is drawn on a plane is called the crossing number.

The minimum number of planar graphs needed to cover a graph is the thickness of the graph.

3.7 Weighted graphs and networks

A weighted graph associates a label (weight) with every edge in the graph. Weights are usually real numbers. They

may be restricted to rational numbers or integers. Certain algorithms require further restrictions on weights; for

instance, Dijkstra’s algorithm works properly only for positive weights. The weight of a path or the weight of a

tree in a weighted graph is the sum of the weights of the selected edges. Sometimes a non-edge (a vertex pair with

no connecting edge) is indicated by labeling it with a special weight representing inﬁnity. Sometimes the word cost

is used instead of weight. When stated without any qualiﬁcation, a graph is always assumed to be unweighted. In

some writing on graph theory the term network is a synonym for a weighted graph. A network may be directed or

undirected, it may contain special vertices (nodes), such as source or sink. The classical network problems include:

• minimum cost spanning tree,

• shortest paths,

• maximal ﬂow (and the max-ﬂow min-cut theorem)

3.8 Direction

Main article: Digraph (mathematics)

A directed arc, or directed edge, is an ordered pair of endvertices that can be represented graphically as an arrow

drawn between the endvertices. In such an ordered pair the ﬁrst vertex is called the initial vertex or tail; the second

one is called the terminal vertex or head (because it appears at the arrow head). An undirected edge disregards

any sense of direction and treats both endvertices interchangeably. A loop in a digraph, however, keeps a sense of

direction and treats both head and tail identically. A set of arcs are multiple, or parallel, if they share the same head

and the same tail. A pair of arcs are anti-parallel if one’s head/tail is the other’s tail/head. A digraph, or directed

graph, or oriented graph, is analogous to an undirected graph except that it contains only arcs. A mixed graph may

contain both directed and undirected edges; it generalizes both directed and undirected graphs. When stated without

any qualiﬁcation, a graph is almost always assumed to be undirected.

A digraph is called simple if it has no loops and at most one arc between any pair of vertices. When stated without

any qualiﬁcation, a digraph is usually assumed to be simple. A quiver is a directed graph which is speciﬁcally allowed,

but not required, to have loops and more than one arc between any pair of vertices.

In a digraph Γ, we distinguish the out degree dΓ+ (v), the number of edges leaving a vertex v, and the in degree

dΓ− (v), the number of edges entering a vertex v. If the graph is oriented, the degree dΓ(v) of a vertex v is equal to the

sum of its out- and in- degrees. When the context is clear, the subscript Γ can be dropped. Maximum and minimum

out degrees are denoted by Δ+ (Γ) and δ+ (Γ); and maximum and minimum in degrees, Δ− (Γ) and δ− (Γ).

An out-neighborhood, or successor set, N + Γ(v) of a vertex v is the set of heads of arcs going from v. Likewise, an

in-neighborhood, or predecessor set, N − Γ(v) of a vertex v is the set of tails of arcs going into v.

A source is a vertex with 0 in-degree; and a sink, 0 out-degree.

A vertex v dominates another vertex u if there is an arc from v to u. A vertex subset S is out-dominating if every

vertex not in S is dominated by some vertex in S; and in-dominating if every vertex in S is dominated by some vertex

not in S.

A kernel in a (possibly directed) graph G is an independent set S such that every vertex in V(G) \ S dominates

some vertex in S. In undirected graphs, kernels are maximal independent sets.[6] A digraph is kernel perfect if every

induced sub-digraph has a kernel.[7]

An Eulerian digraph is a digraph with equal in- and out-degrees at every vertex.

3.9. COLOURING

41

The zweieck of an undirected edge e = (u, v) is the pair of diedges (u, v) and (v, u) which form the simple dicircuit.

An orientation is an assignment of directions to the edges of an undirected or partially directed graph. When

stated without any qualiﬁcation, it is usually assumed that all undirected edges are replaced by a directed one in an

orientation. Also, the underlying graph is usually assumed to be undirected and simple.

A tournament is a digraph in which each pair of vertices is connected by exactly one arc. In other words, it is an

oriented complete graph.

A directed path, or just a path when the context is clear, is an oriented simple path such that all arcs go the same

direction, meaning all internal vertices have in- and out-degrees 1. A vertex v is reachable from another vertex u if

there is a directed path that starts from u and ends at v. Note that in general the condition that u is reachable from v

does not imply that v is also reachable from u.

If v is reachable from u, then u is a predecessor of v and v is a successor of u. If there is an arc from u to v, then u

is a direct predecessor of v, and v is a direct successor of u.

A digraph is strongly connected if every vertex is reachable from every other following the directions of the arcs.

On the contrary, a digraph is weakly connected if its underlying undirected graph is connected. A weakly connected

graph can be thought of as a digraph in which every vertex is “reachable” from every other but not necessarily following

the directions of the arcs. A strong orientation is an orientation that produces a strongly connected digraph.

A directed cycle, or just a cycle when the context is clear, is an oriented simple cycle such that all arcs go the same

direction, meaning all vertices have in- and out-degrees 1. A digraph is acyclic if it does not contain any directed

cycle. A ﬁnite, acyclic digraph with no isolated vertices necessarily contains at least one source and at least one sink.

An arborescence, or out-tree or branching, is an oriented tree in which all vertices are reachable from a single vertex.

Likewise, an in-tree is an oriented tree in which a single vertex is reachable from every other one.

3.8.1

Directed acyclic graphs

Main article: directed acyclic graph

The partial order structure of directed acyclic graphs (or DAGs) gives them their own terminology.

If there is a directed edge from u to v, then we say u is a parent of v and v is a child of u. If there is a directed path

from u to v, we say u is an ancestor of v and v is a descendant of u.

The moral graph of a DAG is the undirected graph created by adding an (undirected) edge between all parents of

the same node (sometimes called marrying), and then replacing all directed edges by undirected edges. A DAG is

perfect if, for each node, the set of parents is complete (i.e. no new edges need to be added when forming the moral

graph).

3.9 Colouring

Main article: Graph colouring

Vertices in graphs can be given colours to identify or label them. Although they may actually be rendered in diagrams

in diﬀerent colours, working mathematicians generally pencil in numbers or letters (usually numbers) to represent the

colours.

Given a graph G (V,E) a k-colouring of G is a map ϕ : V → {1, ..., k} with the property that (u, v) ∈ E ⇒ ϕ(u) ≠

ϕ(v) - in other words, every vertex is assigned a colour with the condition that adjacent vertices cannot be assigned

the same colour.

The chromatic number χ(G) is the smallest k for which G has a k-colouring.

Given a graph and a colouring, the colour classes of the graph are the sets of vertices given the same colour.

A graph is called k-critical if its chromatic number is k but all of its proper subgraphs have chromatic number less

than k. An odd cycle is 3-critical, and the complete graph on k vertices is k-critical.

42

CHAPTER 3. GLOSSARY OF GRAPH THEORY

This graph is an example of a 4-critical graph. Its chromatic number is 4 but all of its proper subgraphs have a chromatic number

less than 4. This graph is also planar

3.10 Various

A graph invariant is a property of a graph G, usually a number or a polynomial, that depends only on the isomorphism

class of G. Examples are the order, genus, chromatic number, and chromatic polynomial of a graph.

3.11 See also

• Graph (mathematics)

• List of graph theory topics

3.12 References

[1] Harris, John M. (2000). Combinatorics and Graph Theory. New York: Springer-Verlag. p. 5. ISBN 0-387-98736-3.

[2] Brandstädt, Andreas; Le, Van Bang; Spinrad, Jeremy (1999), “Chapter 7: Forbidden Subgraph”, Graph Classes: A Survey,

SIAM Monographs on Discrete Mathematics and Applications, pp. 105–121, ISBN 0-89871-432-X.

[3] Mitchem, John (1969), “Hypo-properties in graphs”, The Many Facets of Graph Theory (Proc. Conf., Western Mich. Univ.,

Kalamazoo, Mich., 1968), Springer, pp. 223–230, doi:10.1007/BFb0060121, MR 0253932; Bondy, J. A. (1972), “The

“graph theory” of the Greek alphabet”, Graph theory and applications (Proc. Conf., Western Michigan Univ., Kalamazoo,

Mich., 1972; dedicated to the memory of J. W. T. Youngs), Lecture Notes in Mathematics 303, Springer, pp. 43–54,

doi:10.1007/BFb0067356, MR 0335362.

[4] Rosenberg, Arnold L. and Heath, Lenwood S. (2001). Graph separators with applications. (1st edition ed.). Kluwer. ISBN

978-0-306-46464-5.

3.12. REFERENCES

43

[5] Neel, David L. (2006), “The linear complexity of a graph”, The electronic journal of combinatorics

[6] Bondy, J.A., Murty, U.S.R., Graph Theory, p. 298

[7] Béla Bollobás, Modern Graph theory, p. 298

• Bollobás, Béla (1998). Modern Graph Theory. Graduate Texts in Mathematics 184. New York: SpringerVerlag. ISBN 0-387-98488-7. Zbl 0902.05016.. [Packed with advanced topics followed by a historical

overview at the end of each chapter.]

• Brandstädt, Andreas; Le, Van Bang; Spinrad, Jeremy P. (1999). Graph classes: a survey. SIAM Monographs

on Discrete Mathematics. and Applications 3. Philadelphia, PA: Society for Industrial and Applied Mathematics. ISBN 978-0-898714-32-6. Zbl 0919.05001.

• Diestel, Reinhard (2010). Graph Theory. Graduate Texts in Mathematics 173 (4th ed.). Springer-Verlag.

ISBN 978-3-642-14278-9. Zbl 1204.05001. [Standard textbook, most basic material and some deeper results,

exercises of various diﬃculty and notes at the end of each chapter; known for being quasi error-free.]

• West, Douglas B. (2001). Introduction to Graph Theory (2ed). Upper Saddle River: Prentice Hall. ISBN

0-13-014400-2. [Tons of illustrations, references, and exercises. The most complete introductory guide to the

subject.]

• Weisstein, Eric W., “Graph”, MathWorld.

• Zaslavsky, Thomas. Glossary of signed and gain graphs and allied areas. Electronic Journal of Combinatorics,

Dynamic Surveys in Combinatorics, # DS 8. http://www.combinatorics.org/Surveys/

Chapter 4

Graph (mathematics)

This article is about sets of vertices connected by edges. For graphs of mathematical functions, see Graph of a

function. For other uses, see Graph (disambiguation).

In mathematics, and more speciﬁcally in graph theory, a graph is a representation of a set of objects where some

6

5

4

1

2

3

A drawing of a labeled graph on 6 vertices and 7 edges.

pairs of objects are connected by links. The interconnected objects are represented by mathematical abstractions

called vertices, and the links that connect some pairs of vertices are called edges.[1] Typically, a graph is depicted in

diagrammatic form as a set of dots for the vertices, joined by lines or curves for the edges. Graphs are one of the

objects of study in discrete mathematics.

The edges may be directed or undirected. For example, if the vertices represent people at a party, and there is an

edge between two people if they shake hands, then this is an undirected graph, because if person A shook hands with

person B, then person B also shook hands with person A. In contrast, if there is an edge from person A to person B

when person A knows of person B, then this graph is directed, because knowledge of someone is not necessarily a

symmetric relation (that is, one person knowing another person does not necessarily imply the reverse; for example,

many fans may know of a celebrity, but the celebrity is unlikely to know of all their fans). This latter type of graph

is called a directed graph and the edges are called directed edges or arcs.

44

4.1. DEFINITIONS

45

Vertices are also called nodes or points, and edges are also called arcs or lines. Graphs are the basic subject studied

by graph theory. The word “graph” was ﬁrst used in this sense by J.J. Sylvester in 1878.[2][3]

4.1 Deﬁnitions

Deﬁnitions in graph theory vary. The following are some of the more basic ways of deﬁning graphs and related

mathematical structures.

4.1.1

Graph

In the most common sense of the term,[4] a graph is an ordered pair G = (V, E) comprising a set V of vertices or

nodes together with a set E of edges or links, which are 2-element subsets of V (i.e., an edge is related with two

vertices, and the relation is represented as an unordered pair of the vertices with respect to the particular edge). To

avoid ambiguity, this type of graph may be described precisely as undirected and simple.

Other senses of graph stem from diﬀerent conceptions of the edge set. In one more generalized notion,[5] E is a set

together with a relation of incidence that associates with each edge two vertices. In another generalized notion, E is

a multiset of unordered pairs of (not necessarily distinct) vertices. Many authors call this type of object a multigraph

or pseudograph.

All of these variants and others are described more fully below.

The vertices belonging to an edge are called the ends, endpoints, or end vertices of the edge. A vertex may exist in

a graph and not belong to an edge.

V and E are usually taken to be ﬁnite, and many of the well-known results are not true (or are rather diﬀerent) for

inﬁnite graphs because many of the arguments fail in the inﬁnite case. Moreover, V is often assumed to be nonempty, but E is allowed to be the empty set. The order of a graph is |V | (the number of vertices). A graph’s size

is |E| , the number of edges. The degree of a vertex is the number of edges that connect to it, where an edge that

connects to the vertex at both ends (a loop) is counted twice.

For an edge {u, v}, graph theorists usually use the somewhat shorter notation uv.

4.1.2

Adjacency relation

The edges E of an undirected graph G induce a symmetric binary relation ~ on V that is called the adjacency relation

of G. Speciﬁcally, for each edge {u, v} the vertices u and v are said to be adjacent to one another, which is denoted

u ~ v.

4.2 Types of graphs

4.2.1

Distinction in terms of the main deﬁnition

As stated above, in diﬀerent contexts it may be useful to reﬁne the term graph with diﬀerent degrees of generality.

Whenever it is necessary to draw a strict distinction, the following terms are used. Most commonly, in modern texts

in graph theory, unless stated otherwise, graph means “undirected simple ﬁnite graph” (see the deﬁnitions below).

A directed graph.

46

CHAPTER 4. GRAPH (MATHEMATICS)

A simple undirected graph with three vertices and three edges. Each vertex has degree two, so this is also a regular

graph.

Undirected graph

An undirected graph is one in which edges have no orientation. The edge (a, b) is identical to the edge (b, a), i.e., they

are not ordered pairs, but sets {u, v} (or 2-multisets) of vertices. The maximum number of edges in an undirected

graph without a self-loop is n(n - 1)/2.

Directed graph

Main article: Directed graph

A directed graph or digraph is an ordered pair D = (V, A) with

• V a set whose elements are called vertices or nodes, and

• A a set of ordered pairs of vertices, called arcs, directed edges, or arrows.

An arc a = (x, y) is considered to be directed from x to y; y is called the head and x is called the tail of the arc; y is

said to be a direct successor of x, and x is said to be a direct predecessor of y. If a path leads from x to y, then y is

said to be a successor of x and reachable from x, and x is said to be a predecessor of y. The arc (y, x) is called the

arc (x, y) inverted.

A directed graph D is called symmetric if, for every arc in D, the corresponding inverted arc also belongs to D. A

symmetric loopless directed graph D = (V, A) is equivalent to a simple undirected graph G = (V, E), where the pairs

of inverse arcs in A correspond 1-to-1 with the edges in E; thus the edges in G number |E| = |A|/2, or half the number

of arcs in D.

An oriented graph is a directed graph in which at most one of (x, y) and (y, x) may be arcs.

Mixed graph

Main article: Mixed graph

A mixed graph G is a graph in which some edges may be directed and some may be undirected. It is written as an

ordered triple G = (V, E, A) with V, E, and A deﬁned as above. Directed and undirected graphs are special cases.

Multigraph

A loop is an edge (directed or undirected) which starts and ends on the same vertex; these may be permitted or not

permitted according to the application. In this context, an edge with two diﬀerent ends is called a link.

The term "multigraph" is generally understood to mean that multiple edges (and sometimes loops) are allowed. Where

graphs are deﬁned so as to allow loops and multiple edges, a multigraph is often deﬁned to mean a graph without

loops,[6] however, where graphs are deﬁned so as to disallow loops and multiple edges, the term is often deﬁned to

mean a “graph” which can have both multiple edges and loops,[7] although many use the term "pseudograph" for this

meaning.[8]

4.2. TYPES OF GRAPHS

47

Quiver

A quiver or “multidigraph” is a directed graph which may have more than one arrow from a given source to a given

target. A quiver may also have directed loops in it.

Simple graph

As opposed to a multigraph, a simple graph is an undirected graph that has no loops (edges connected at both ends

to the same vertex) and no more than one edge between any two diﬀerent vertices. In a simple graph the edges of the

graph form a set (rather than a multiset) and each edge is a pair of distinct vertices. In a simple graph with n vertices,

the degree of every vertex is at most n-1.

Weighted graph

A graph is a weighted graph if a number (weight) is assigned to each edge.[9] Such weights might represent, for

example, costs, lengths or capacities, etc. depending on the problem at hand. Some authors call such a graph a

network.[10] Weighted correlation networks can be deﬁned by soft-thresholding the pairwise correlations among

variables (e.g. gene measurements).

Half-edges, loose edges

In certain situations it can be helpful to allow edges with only one end, called half-edges, or no ends (loose edges);

see for example signed graphs and biased graphs.

4.2.2

Important graph classes

Regular graph

Main article: Regular graph

A regular graph is a graph where each vertex has the same number of neighbours, i.e., every vertex has the same

degree or valency. A regular graph with vertices of degree k is called a k‑regular graph or regular graph of degree k.

Complete graph

Main article: Complete graph

Complete graphs have the feature that each pair of vertices has an edge connecting them.

Finite and inﬁnite graphs

A ﬁnite graph is a graph G = (V, E) such that V and E are ﬁnite sets. An inﬁnite graph is one with an inﬁnite set of

vertices or edges or both.

Most commonly in graph theory it is implied that the graphs discussed are ﬁnite. If the graphs are inﬁnite, that is

usually speciﬁcally stated.

Graph classes in terms of connectivity

Main article: Connectivity (graph theory)

48

CHAPTER 4. GRAPH (MATHEMATICS)

A complete graph with 5 vertices. Each vertex has an edge to every other vertex.

In an undirected graph G, two vertices u and v are called connected if G contains a path from u to v. Otherwise,

they are called disconnected. A graph is called connected if every pair of distinct vertices in the graph is connected;

otherwise, it is called disconnected.

A graph is called k-vertex-connected or k-edge-connected if no set of k-1 vertices (respectively, edges) exists that,

when removed, disconnects the graph. A k-vertex-connected graph is often called simply k-connected.

A directed graph is called weakly connected if replacing all of its directed edges with undirected edges produces

a connected (undirected) graph. It is strongly connected or strong if it contains a directed path from u to v and a

directed path from v to u for every pair of vertices u, v.

Category of all graphs

The category of all graphs is the slice category Set ↓ D where D : Set → Set is the functor taking a set s to s × s .

4.3 Properties of graphs

See also: Glossary of graph theory and Graph property

4.4. EXAMPLES

49

Two edges of a graph are called adjacent if they share a common vertex. Two arrows of a directed graph are called

consecutive if the head of the ﬁrst one is at the tail of the second one. Similarly, two vertices are called adjacent if

they share a common edge (consecutive if they are at the tail and at the head of an arrow), in which case the common

edge is said to join the two vertices. An edge and a vertex on that edge are called incident.

The graph with only one vertex and no edges is called the trivial graph. A graph with only vertices and no edges is

known as an edgeless graph. The graph with no vertices and no edges is sometimes called the null graph or empty

graph, but the terminology is not consistent and not all mathematicians allow this object.

In a weighted graph or digraph, each edge is associated with some value, variously called its cost, weight, length or

other term depending on the application; such graphs arise in many contexts, for example in optimal routing problems

such as the traveling salesman problem.

Normally, the vertices of a graph, by their nature as elements of a set, are distinguishable. This kind of graph may be

called vertex-labeled. However, for many questions it is better to treat vertices as indistinguishable; then the graph

may be called unlabeled. (Of course, the vertices may be still distinguishable by the properties of the graph itself,

e.g., by the numbers of incident edges). The same remarks apply to edges, so graphs with labeled edges are called

edge-labeled graphs. Graphs with labels attached to edges or vertices are more generally designated as labeled.

Consequently, graphs in which vertices are indistinguishable and edges are indistinguishable are called unlabeled.

(Note that in the literature the term labeled may apply to other kinds of labeling, besides that which serves only to

distinguish diﬀerent vertices or edges.)

4.4 Examples

6

4

3

5

1

2

A graph with six nodes.

• The diagram at right is a graphic representation of the following graph:

V = {1, 2, 3, 4, 5, 6}

E = {{1, 2}, {1, 5}, {2, 3}, {2, 5}, {3, 4}, {4, 5}, {4, 6}}.

• In category theory a small category can be represented by a directed multigraph in which the objects of the

50

CHAPTER 4. GRAPH (MATHEMATICS)

category represented as vertices and the morphisms as directed edges. Then, the functors between categories

induce some, but not necessarily all, of the digraph morphisms of the graph.

• In computer science, directed graphs are used to represent knowledge (e.g., Conceptual graph), ﬁnite state

machines, and many other discrete structures.

• A binary relation R on a set X deﬁnes a directed graph. An element x of X is a direct predecessor of an element

y of X iﬀ xRy.

• A directed edge can model information networks such as Twitter, with one user following another [11]

4.5 Important graphs

Basic examples are:

• In a complete graph, each pair of vertices is joined by an edge; that is, the graph contains all possible edges.

• In a bipartite graph, the vertex set can be partitioned into two sets, W and X, so that no two vertices in W are

adjacent and no two vertices in X are adjacent. Alternatively, it is a graph with a chromatic number of 2.

• In a complete bipartite graph, the vertex set is the union of two disjoint sets, W and X, so that every vertex in

W is adjacent to every vertex in X but there are no edges within W or X.

• In a linear graph or path graph of length n, the vertices can be listed in order, v0 , v1 , ..., v , so that the edges

are vᵢ₋₁vᵢ for each i = 1, 2, ..., n. If a linear graph occurs as a subgraph of another graph, it is a path in that

graph.

• In a cycle graph of length n ≥ 3, vertices can be named v1 , ..., v so that the edges are vᵢ₋₁vi for each i = 2,...,n

in addition to v v1 . Cycle graphs can be characterized as connected 2-regular graphs. If a cycle graph occurs

as a subgraph of another graph, it is a cycle or circuit in that graph.

• A planar graph is a graph whose vertices and edges can be drawn in a plane such that no two of the edges

intersect (i.e., embedded in a plane).

• A tree is a connected graph with no cycles.

• A forest is a graph with no cycles (i.e. the disjoint union of one or more trees).

More advanced kinds of graphs are:

• The Petersen graph and its generalizations

• Perfect graphs

• Cographs

• Chordal graphs

• Other graphs with large automorphism groups: vertex-transitive, arc-transitive, and distance-transitive graphs.

• Strongly regular graphs and their generalization distance-regular graphs.

4.6 Operations on graphs

Main article: Operations on graphs

There are several operations that produce new graphs from old ones, which might be classiﬁed into the following

categories:

4.7. GENERALIZATIONS

51

• Elementary operations, sometimes called “editing operations” on graphs, which create a new graph from the

original one by a simple, local change, such as addition or deletion of a vertex or an edge, merging and splitting

of vertices, etc.

• Graph rewrite operations replacing the occurrence of some pattern graph within the host graph by an instance

of the corresponding replacement graph.

• Unary operations, which create a signiﬁcantly new graph from the old one. Examples:

• Line graph

• Dual graph

• Complement graph

• Binary operations, which create new graph from two initial graphs. Examples:

• Disjoint union of graphs

• Cartesian product of graphs

• Tensor product of graphs

• Strong product of graphs

• Lexicographic product of graphs

4.7 Generalizations

In a hypergraph, an edge can join more than two vertices.

An undirected graph can be seen as a simplicial complex consisting of 1-simplices (the edges) and 0-simplices (the

vertices). As such, complexes are generalizations of graphs since they allow for higher-dimensional simplices.

Every graph gives rise to a matroid.

In model theory, a graph is just a structure. But in that case, there is no limitation on the number of edges: it can be

any cardinal number, see continuous graph.

In computational biology, power graph analysis introduces power graphs as an alternative representation of undirected

graphs.

In geographic information systems, geometric networks are closely modeled after graphs, and borrow many concepts

from graph theory to perform spatial analysis on road networks or utility grids.

4.8 See also

• Conceptual graph

• Dual graph

• Glossary of graph theory

• Graph (data structure)

• Graph database

• Graph drawing

• Graph theory

• Hypergraph

• List of graph theory topics

• List of publications in graph theory

• Network theory

52

CHAPTER 4. GRAPH (MATHEMATICS)

4.9 Notes

[1] Trudeau, Richard J. (1993). Introduction to Graph Theory (Corrected, enlarged republication. ed.). New York: Dover

Pub. p. 19. ISBN 978-0-486-67870-2. Retrieved 8 August 2012. A graph is an object consisting of two sets called its

vertex set and its edge set.

[2] See:

• J. J. Sylvester (February 7, 1878) “Chemistry and algebra,” Nature, 17 : 284. From page 284: “Every invariant and

covariant thus becomes expressible by a graph precisely identical with a Kekuléan diagram or chemicograph.”

• J. J. Sylvester (1878) “On an application of the new atomic theory to the graphical representation of the invariants

and covariants of binary quantics, — with three appendices,” American Journal of Mathematics, Pure and Applied,

1 (1) : 64-90. The term “graph” ﬁrst appears in this paper on page 65.

[3] Gross, Jonathan L.; Yellen, Jay (2004). Handbook of graph theory. CRC Press. p. 35. ISBN 978-1-58488-090-5.

[4] See, for instance, Iyanaga and Kawada, 69 J, p. 234 or Biggs, p. 4.

[5] See, for instance, Graham et al., p. 5.

[6] For example, see Balakrishnan, p. 1, Gross (2003), p. 4, and Zwillinger, p. 220.

[7] For example, see. Bollobás, p. 7 and Diestel, p. 25.

[8] Gross (1998), p. 3, Gross (2003), p. 205, Harary, p.10, and Zwillinger, p. 220.

[9] Fletcher, Peter; Hoyle, Hughes; Patty, C. Wayne (1991). Foundations of Discrete Mathematics (International student ed.

ed.). Boston: PWS-KENT Pub. Co. p. 463. ISBN 0-53492-373-9. A weighted graph is a graph in which a number

w(e), called its weight, is assigned to each edge e.

[10] Strang, Gilbert (2005), Linear Algebra and Its Applications (4th ed.), Brooks Cole, ISBN 0-03-010567-6

[11] Pankaj Gupta, Ashish Goel, Jimmy Lin, Aneesh Sharma, Dong Wang, and Reza Bosagh Zadeh WTF: The who-to-follow

system at Twitter, Proceedings of the 22nd international conference on World Wide Web

4.10 References

• Balakrishnan, V. K. (1997-02-01). Graph Theory (1st ed.). McGraw-Hill. ISBN 0-07-005489-4.

• Berge, Claude (1958). Théorie des graphes et ses applications (in French). Dunod, Paris: Collection Universitaire de Mathématiques, II. pp. viii+277. Translation: -. Dover, New York: Wiley. 2001 [1962].

• Biggs, Norman (1993). Algebraic Graph Theory (2nd ed.). Cambridge University Press. ISBN 0-521-45897-8.

• Bollobás, Béla (2002-08-12). Modern Graph Theory (1st ed.). Springer. ISBN 0-387-98488-7.

• Bang-Jensen, J.; Gutin, G. (2000). Digraphs: Theory, Algorithms and Applications. Springer.

• Diestel, Reinhard (2005). Graph Theory (3rd ed.). Berlin, New York: Springer-Verlag. ISBN 978-3-54026183-4..

• Graham, R.L., Grötschel, M., and Lovász, L, ed. (1995). Handbook of Combinatorics. MIT Press. ISBN

0-262-07169-X.

• Gross, Jonathan L.; Yellen, Jay (1998-12-30). Graph Theory and Its Applications. CRC Press. ISBN 0-84933982-0.

• Gross, Jonathan L., & Yellen, Jay, ed. (2003-12-29). Handbook of Graph Theory. CRC. ISBN 1-58488-0902.

• Harary, Frank (January 1995). Graph Theory. Addison Wesley Publishing Company. ISBN 0-201-41033-8.

• Iyanaga, Shôkichi; Kawada, Yukiyosi (1977). Encyclopedic Dictionary of Mathematics. MIT Press. ISBN

0-262-09016-3.

• Zwillinger, Daniel (2002-11-27). CRC Standard Mathematical Tables and Formulae (31st ed.). Chapman &

Hall/CRC. ISBN 1-58488-291-3.

4.11. FURTHER READING

53

4.11 Further reading

• Trudeau, Richard J. (1993). Introduction to Graph Theory (Corrected, enlarged republication. ed.). New York:

Dover Publications. ISBN 978-0-486-67870-2. Retrieved 8 August 2012.

4.12 External links

• Weisstein, Eric W., “Graph”, MathWorld.

Chapter 5

Graph theory

This article is about sets of vertices connected by edges. For graphs of mathematical functions, see Graph of a

function. For other uses, see Graph (disambiguation).

In mathematics and computer science, graph theory is the study of graphs, which are mathematical structures used

6

5

4

1

2

3

A drawing of a graph

to model pairwise relations between objects. A “graph” in this context is made up of "vertices" or “nodes” and lines

called edges that connect them. A graph may be undirected, meaning that there is no distinction between the two

vertices associated with each edge, or its edges may be directed from one vertex to another; see graph (mathematics)

for more detailed deﬁnitions and for other variations in the types of graph that are commonly considered. Graphs are

one of the prime objects of study in discrete mathematics.

Refer to the glossary of graph theory for basic deﬁnitions in graph theory.

54

5.1. DEFINITIONS

55

5.1 Deﬁnitions

Deﬁnitions in graph theory vary. The following are some of the more basic ways of deﬁning graphs and related

mathematical structures.

5.1.1

Graph

In the most common sense of the term,[1] a graph is an ordered pair G = (V, E) comprising a set V of vertices or

nodes together with a set E of edges or lines, which are 2-element subsets of V (i.e., an edge is related with two

vertices, and the relation is represented as an unordered pair of the vertices with respect to the particular edge). To

avoid ambiguity, this type of graph may be described precisely as undirected and simple.

Other senses of graph stem from diﬀerent conceptions of the edge set. In one more generalized notion,[2] V is a set

together with a relation of incidence that associates with each edge two vertices. In another generalized notion, E is

a multiset of unordered pairs of (not necessarily distinct) vertices. Many authors call this type of object a multigraph

or pseudograph.

All of these variants and others are described more fully below.

The vertices belonging to an edge are called the ends, endpoints, or end vertices of the edge. A vertex may exist in

a graph and not belong to an edge.

V and E are usually taken to be ﬁnite, and many of the well-known results are not true (or are rather diﬀerent) for

inﬁnite graphs because many of the arguments fail in the inﬁnite case. The order of a graph is |V | (the number of

vertices). A graph’s size is |E| , the number of edges. The degree or valency of a vertex is the number of edges that

connect to it, where an edge that connects to the vertex at both ends (a loop) is counted twice.

For an edge {u, v} , graph theorists usually use the somewhat shorter notation uv .

5.2 Applications

Graphs can be used to model many types of relations and processes in physical, biological,[4] social and information

systems. Many practical problems can be represented by graphs.

In computer science, graphs are used to represent networks of communication, data organization, computational

devices, the ﬂow of computation, etc. For instance, the link structure of a website can be represented by a directed

graph, in which the vertices represent web pages and directed edges represent links from one page to another. A

similar approach can be taken to problems in travel, biology, computer chip design, and many other ﬁelds. The

development of algorithms to handle graphs is therefore of major interest in computer science. The transformation

of graphs is often formalized and represented by graph rewrite systems. Complementary to graph transformation

systems focusing on rule-based in-memory manipulation of graphs are graph databases geared towards transactionsafe, persistent storing and querying of graph-structured data.

Graph-theoretic methods, in various forms, have proven particularly useful in linguistics, since natural language often

lends itself well to discrete structure. Traditionally, syntax and compositional semantics follow tree-based structures,

whose expressive power lies in the principle of compositionality, modeled in a hierarchical graph. More contemporary

approaches such as head-driven phrase structure grammar model the syntax of natural language using typed feature

structures, which are directed acyclic graphs. Within lexical semantics, especially as applied to computers, modeling

word meaning is easier when a given word is understood in terms of related words; semantic networks are therefore

important in computational linguistics. Still other methods in phonology (e.g. optimality theory, which uses lattice

graphs) and morphology (e.g. ﬁnite-state morphology, using ﬁnite-state transducers) are common in the analysis of

language as a graph. Indeed, the usefulness of this area of mathematics to linguistics has borne organizations such as

TextGraphs, as well as various 'Net' projects, such as WordNet, VerbNet, and others.

Graph theory is also used to study molecules in chemistry and physics. In condensed matter physics, the threedimensional structure of complicated simulated atomic structures can be studied quantitatively by gathering statistics

on graph-theoretic properties related to the topology of the atoms. In chemistry a graph makes a natural model for a

molecule, where vertices represent atoms and edges bonds. This approach is especially used in computer processing of

molecular structures, ranging from chemical editors to database searching. In statistical physics, graphs can represent

56

CHAPTER 5. GRAPH THEORY

hu

pt

ca

es

no

pl

fr

ro

sv

fi

ko

it

he

en

de

da

ja

bg

nl

ar

tr

uk

ru

zh

id

cs

fa

The network graph formed by Wikipedia editors (edges) contributing to diﬀerent Wikipedia language versions (nodes) during one

month in summer 2013.[3]

local connections between interacting parts of a system, as well as the dynamics of a physical process on such systems.

Graphs are also used to represent the micro-scale channels of porous media, in which the vertices represent the pores

and the edges represent the smaller channels connecting the pores.

Graph theory is also widely used in sociology as a way, for example, to measure actors’ prestige or to explore rumor

spreading, notably through the use of social network analysis software. Under the umbrella of social networks are

many diﬀerent types of graphs:[5] Acquaintanceship and friendship graphs describe whether people know each other.

Inﬂuence graphs model whether certain people can inﬂuence the behavior of others. Finally, collaboration graphs

model whether two people work together in a particular way, such as acting in a movie together.

Likewise, graph theory is useful in biology and conservation eﬀorts where a vertex can represent regions where

certain species exist (or habitats) and the edges represent migration paths, or movement between the regions. This

information is important when looking at breeding patterns or tracking the spread of disease, parasites or how changes

to the movement can aﬀect other species.

In mathematics, graphs are useful in geometry and certain parts of topology such as knot theory. Algebraic graph

theory has close links with group theory.

A graph structure can be extended by assigning a weight to each edge of the graph. Graphs with weights, or weighted

graphs, are used to represent structures in which pairwise connections have some numerical values. For example if a

graph represents a road network, the weights could represent the length of each road.

5.3. HISTORY

57

5.3 History

The Königsberg Bridge problem

The paper written by Leonhard Euler on the Seven Bridges of Königsberg and published in 1736 is regarded as the ﬁrst

paper in the history of graph theory.[6] This paper, as well as the one written by Vandermonde on the knight problem,

carried on with the analysis situs initiated by Leibniz. Euler’s formula relating the number of edges, vertices, and faces

of a convex polyhedron was studied and generalized by Cauchy[7] and L'Huillier,[8] and is at the origin of topology.

More than one century after Euler’s paper on the bridges of Königsberg and while Listing introduced topology, Cayley

was led by the study of particular analytical forms arising from diﬀerential calculus to study a particular class of graphs,

the trees.[9] This study had many implications in theoretical chemistry. The involved techniques mainly concerned the

enumeration of graphs having particular properties. Enumerative graph theory then rose from the results of Cayley

and the fundamental results published by Pólya between 1935 and 1937 and the generalization of these by De Bruijn

in 1959. Cayley linked his results on trees with the contemporary studies of chemical composition.[10] The fusion

of the ideas coming from mathematics with those coming from chemistry is at the origin of a part of the standard

terminology of graph theory.

In particular, the term “graph” was introduced by Sylvester in a paper published in 1878 in Nature, where he draws

an analogy between “quantic invariants” and “co-variants” of algebra and molecular diagrams:[11]

"[...] Every invariant and co-variant thus becomes expressible by a graph precisely identical with a

Kekuléan diagram or chemicograph. [...] I give a rule for the geometrical multiplication of graphs, i.e.

for constructing a graph to the product of in- or co-variants whose separate graphs are given. [...]" (italics

as in the original).

The ﬁrst textbook on graph theory was written by Dénes Kőnig, and published in 1936.[12] Another book by Frank

Harary, published in 1969, was “considered the world over to be the deﬁnitive textbook on the subject”,[13] and

enabled mathematicians, chemists, electrical engineers and social scientists to talk to each other. Harary donated all

of the royalties to fund the Pólya Prize.[14]

58

CHAPTER 5. GRAPH THEORY

One of the most famous and stimulating problems in graph theory is the four color problem: “Is it true that any

map drawn in the plane may have its regions colored with four colors, in such a way that any two regions having a

common border have diﬀerent colors?" This problem was ﬁrst posed by Francis Guthrie in 1852 and its ﬁrst written

record is in a letter of De Morgan addressed to Hamilton the same year. Many incorrect proofs have been proposed,

including those by Cayley, Kempe, and others. The study and the generalization of this problem by Tait, Heawood,

Ramsey and Hadwiger led to the study of the colorings of the graphs embedded on surfaces with arbitrary genus.

Tait’s reformulation generated a new class of problems, the factorization problems, particularly studied by Petersen

and Kőnig. The works of Ramsey on colorations and more specially the results obtained by Turán in 1941 was at the

origin of another branch of graph theory, extremal graph theory.

The four color problem remained unsolved for more than a century. In 1969 Heinrich Heesch published a method for

solving the problem using computers.[15] A computer-aided proof produced in 1976 by Kenneth Appel and Wolfgang

Haken makes fundamental use of the notion of “discharging” developed by Heesch.[16][17] The proof involved checking the properties of 1,936 conﬁgurations by computer, and was not fully accepted at the time due to its complexity.

A simpler proof considering only 633 conﬁgurations was given twenty years later by Robertson, Seymour, Sanders

and Thomas.[18]

The autonomous development of topology from 1860 and 1930 fertilized graph theory back through the works of

Jordan, Kuratowski and Whitney. Another important factor of common development of graph theory and topology

came from the use of the techniques of modern algebra. The ﬁrst example of such a use comes from the work of the

physicist Gustav Kirchhoﬀ, who published in 1845 his Kirchhoﬀ’s circuit laws for calculating the voltage and current

in electric circuits.

The introduction of probabilistic methods in graph theory, especially in the study of Erdős and Rényi of the asymptotic

probability of graph connectivity, gave rise to yet another branch, known as random graph theory, which has been a

fruitful source of graph-theoretic results.

5.4 Graph drawing

Main article: Graph drawing

Graphs are represented visually by drawing a dot or circle for every vertex, and drawing an arc between two vertices

if they are connected by an edge. If the graph is directed, the direction is indicated by drawing an arrow.

A graph drawing should not be confused with the graph itself (the abstract, non-visual structure) as there are several

ways to structure the graph drawing. All that matters is which vertices are connected to which others by how many

edges and not the exact layout. In practice it is often diﬃcult to decide if two drawings represent the same graph.

Depending on the problem domain some layouts may be better suited and easier to understand than others.

The pioneering work of W. T. Tutte was very inﬂuential in the subject of graph drawing. Among other achievements,

he introduced the use of linear algebraic methods to obtain graph drawings.

Graph drawing also can be said to encompass problems that deal with the crossing number and its various generalizations. The crossing number of a graph is the minimum number of intersections between edges that a drawing of

the graph in the plane must contain. For a planar graph, the crossing number is zero by deﬁnition.

Drawings on surfaces other than the plane are also studied.

5.5 Graph-theoretic data structures

Main article: Graph (abstract data type)

There are diﬀerent ways to store graphs in a computer system. The data structure used depends on both the graph

structure and the algorithm used for manipulating the graph. Theoretically one can distinguish between list and

matrix structures but in concrete applications the best structure is often a combination of both. List structures are

often preferred for sparse graphs as they have smaller memory requirements. Matrix structures on the other hand

provide faster access for some applications but can consume huge amounts of memory.

List structures include the incidence list, an array of pairs of vertices, and the adjacency list, which separately lists

5.6. PROBLEMS IN GRAPH THEORY

59

the neighbors of each vertex: Much like the incidence list, each vertex has a list of which vertices it is adjacent to.

Matrix structures include the incidence matrix, a matrix of 0’s and 1’s whose rows represent vertices and whose

columns represent edges, and the adjacency matrix, in which both the rows and columns are indexed by vertices. In

both cases a 1 indicates two adjacent objects and a 0 indicates two non-adjacent objects. The Laplacian matrix is a

modiﬁed form of the adjacency matrix that incorporates information about the degrees of the vertices, and is useful

in some calculations such as Kirchhoﬀ’s theorem on the number of spanning trees of a graph. The distance matrix,

like the adjacency matrix, has both its rows and columns indexed by vertices, but rather than containing a 0 or a 1 in

each cell it contains the length of a shortest path between two vertices.

5.6 Problems in graph theory

5.6.1

Enumeration

There is a large literature on graphical enumeration: the problem of counting graphs meeting speciﬁed conditions.

Some of this work is found in Harary and Palmer (1973).

5.6.2

Subgraphs, induced subgraphs, and minors

A common problem, called the subgraph isomorphism problem, is ﬁnding a ﬁxed graph as a subgraph in a given

graph. One reason to be interested in such a question is that many graph properties are hereditary for subgraphs,

which means that a graph has the property if and only if all subgraphs have it too. Unfortunately, ﬁnding maximal

subgraphs of a certain kind is often an NP-complete problem.

• Finding the largest complete subgraph is called the clique problem (NP-complete).

A similar problem is ﬁnding induced subgraphs in a given graph. Again, some important graph properties are hereditary with respect to induced subgraphs, which means that a graph has a property if and only if all induced subgraphs

also have it. Finding maximal induced subgraphs of a certain kind is also often NP-complete. For example,

• Finding the largest edgeless induced subgraph, or independent set, called the independent set problem (NPcomplete).

Still another such problem, the minor containment problem, is to ﬁnd a ﬁxed graph as a minor of a given graph. A

minor or subcontraction of a graph is any graph obtained by taking a subgraph and contracting some (or no) edges.

Many graph properties are hereditary for minors, which means that a graph has a property if and only if all minors

have it too. A famous example:

• A graph is planar if it contains as a minor neither the complete bipartite graph K3,3 (See the Three-cottage

problem) nor the complete graph K5 .

Another class of problems has to do with the extent to which various species and generalizations of graphs are determined by their point-deleted subgraphs, for example:

• The reconstruction conjecture.

5.6.3

Graph coloring

Many problems have to do with various ways of coloring graphs, for example:

• The four-color theorem

• The strong perfect graph theorem

• The Erdős–Faber–Lovász conjecture(unsolved)

60

CHAPTER 5. GRAPH THEORY

• The total coloring conjecture, also called Behzad's conjecture) (unsolved)

• The list coloring conjecture (unsolved)

• The Hadwiger conjecture (graph theory) (unsolved)

5.6.4

Subsumption and uniﬁcation

Constraint modeling theories concern families of directed graphs related by a partial order. In these applications,

graphs are ordered by speciﬁcity, meaning that more constrained graphs—which are more speciﬁc and thus contain

a greater amount of information—are subsumed by those that are more general. Operations between graphs include

evaluating the direction of a subsumption relationship between two graphs, if any, and computing graph uniﬁcation.

The uniﬁcation of two argument graphs is deﬁned as the most general graph (or the computation thereof) that is

consistent with (i.e. contains all of the information in) the inputs, if such a graph exists; eﬃcient uniﬁcation algorithms

are known.

For constraint frameworks which are strictly compositional, graph uniﬁcation is the suﬃcient satisﬁability and combination function. Well-known applications include automatic theorem proving and modeling the elaboration of

linguistic structure.

5.6.5

Route problems

• Hamiltonian path and cycle problems

• Minimum spanning tree

• Route inspection problem (also called the “Chinese Postman Problem”)

• Seven Bridges of Königsberg

• Shortest path problem

• Steiner tree

• Three-cottage problem

• Traveling salesman problem (NP-hard)

5.6.6

Network ﬂow

There are numerous problems arising especially from applications that have to do with various notions of ﬂows in

networks, for example:

• Max ﬂow min cut theorem

5.6.7

Visibility problems

• Museum guard problem

5.6.8

Covering problems

Covering problems in graphs are speciﬁc instances of subgraph-ﬁnding problems, and they tend to be closely related

to the clique problem or the independent set problem.

• Set cover problem

• Vertex cover problem

5.7. SEE ALSO

5.6.9

61

Decomposition problems

Decomposition, deﬁned as partitioning the edge set of a graph (with as many vertices as necessary accompanying the

edges of each part of the partition), has a wide variety of question. Often, it is required to decompose a graph into

subgraphs isomorphic to a ﬁxed graph; for instance, decomposing a complete graph into Hamiltonian cycles. Other

problems specify a family of graphs into which a given graph should be decomposed, for instance, a family of cycles,

or decomposing a complete graph Kn into n − 1 speciﬁed trees having, respectively, 1, 2, 3, ..., n − 1 edges.

Some speciﬁc decomposition problems that have been studied include:

• Arboricity, a decomposition into as few forests as possible

• Cycle double cover, a decomposition into a collection of cycles covering each edge exactly twice

• Edge coloring, a decomposition into as few matchings as possible

• Graph factorization, a decomposition of a regular graph into regular subgraphs of given degrees

5.6.10

Graph classes

Many problems involve characterizing the members of various classes of graphs. Some examples of such questions

are below:

• Enumerating the members of a class

• Characterizing a class in terms of forbidden substructures

• Ascertaining relationships among classes (e.g., does one property of graphs imply another)

• Finding eﬃcient algorithms to decide membership in a class

• Finding representations for members of a class.

5.7 See also

• Gallery of named graphs

• Glossary of graph theory

• List of graph theory topics

• Publications in graph theory

5.7.1

Related topics

• Algebraic graph theory

• Citation graph

• Conceptual graph

• Data structure

• Disjoint-set data structure

• Dual-phase evolution

• Entitative graph

• Existential graph

• Graph algebras

62

CHAPTER 5. GRAPH THEORY

• Graph automorphism

• Graph coloring

• Graph database

• Graph data structure

• Graph drawing

• Graph equation

• Graph rewriting

• Graph sandwich problem

• Graph property

• Intersection graph

• Logical graph

• Loop

• Network theory

• Null graph

• Pebble motion problems

• Percolation

• Perfect graph

• Quantum graph

• Random regular graphs

• Semantic networks

• Spectral graph theory

• Strongly regular graphs

• Symmetric graphs

• Transitive reduction

• Tree data structure

5.7.2

Algorithms

• Bellman–Ford algorithm

• Dijkstra’s algorithm

• Ford–Fulkerson algorithm

• Kruskal’s algorithm

• Nearest neighbour algorithm

• Prim’s algorithm

• Depth-ﬁrst search

• Breadth-ﬁrst search

5.7. SEE ALSO

5.7.3

Subareas

• Algebraic graph theory

• Geometric graph theory

• Extremal graph theory

• Probabilistic graph theory

• Topological graph theory

5.7.4

Related areas of mathematics

• Combinatorics

• Group theory

• Knot theory

• Ramsey theory

5.7.5

Generalizations

• Hypergraph

• Abstract simplicial complex

5.7.6

Prominent graph theorists

• Alon, Noga

• Berge, Claude

• Bollobás, Béla

• Bondy, Adrian John

• Brightwell, Graham

• Chudnovsky, Maria

• Chung, Fan

• Dirac, Gabriel Andrew

• Erdős, Paul

• Euler, Leonhard

• Faudree, Ralph

• Golumbic, Martin

• Graham, Ronald

• Harary, Frank

• Heawood, Percy John

• Kotzig, Anton

• Kőnig, Dénes

• Lovász, László

63

64

CHAPTER 5. GRAPH THEORY

• Murty, U. S. R.

• Nešetřil, Jaroslav

• Rényi, Alfréd

• Ringel, Gerhard

• Robertson, Neil

• Seymour, Paul

• Szemerédi, Endre

• Thomas, Robin

• Thomassen, Carsten

• Turán, Pál

• Tutte, W. T.

• Whitney, Hassler

5.8 Notes

[1] See, for instance, Iyanaga and Kawada, 69 J, p. 234 or Biggs, p. 4.

[2] See, for instance, Graham et al., p. 5.

[3] Hale, Scott A. (2013). “Multilinguals and Wikipedia Editing”. arXiv:1312.0976 [cs.CY].

[4] Mashaghi, A. et al. (2004). “Investigation of a protein complex network”. European Physical Journal B 41 (1): 113–121.

doi:10.1140/epjb/e2004-00301-0.

[5] Rosen, Kenneth H. Discrete mathematics and its applications (7th ed.). New York: McGraw-Hill. ISBN 978-0-07-3383095.

[6] Biggs, N.; Lloyd, E. and Wilson, R. (1986), Graph Theory, 1736-1936, Oxford University Press

[7] Cauchy, A.L. (1813), “Recherche sur les polyèdres - premier mémoire”, Journal de l'École Polytechnique, 9 (Cahier 16):

66–86.

[8] L'Huillier, S.-A.-J. (1861), “Mémoire sur la polyèdrométrie”, Annales de Mathématiques 3: 169–189.

[9] Cayley, A. (1857), “On the theory of the analytical forms called trees”, Philosophical Magazine, Series IV 13 (85): 172–

176, doi:10.1017/CBO9780511703690.046.

[10] Cayley, A. (1875), “Ueber die Analytischen Figuren, welche in der Mathematik Bäume genannt werden und ihre Anwendung auf die Theorie chemischer Verbindungen”, Berichte der deutschen Chemischen Gesellschaft 8 (2): 1056–1059,

doi:10.1002/cber.18750080252.

[11] Joseph Sylvester, John (1878). “Chemistry and Algebra”. Nature 17: 284. doi:10.1038/017284a0.

[12] Tutte, W.T. (2001), Graph Theory, Cambridge University Press, p. 30, ISBN 978-0-521-79489-3.

[13] Gardner, Martin (1992), Fractal Music, Hypercards, and more...Mathematical Recreations from Scientiﬁc American, W. H.

Freeman and Company, p. 203.

[14] Society for Industrial and Applied Mathematics (2002), “The George Polya Prize”, Looking Back, Looking Ahead: A SIAM

History (PDF), p. 26.

[15] Heinrich Heesch: Untersuchungen zum Vierfarbenproblem. Mannheim: Bibliographisches Institut 1969.

[16] Appel, K. and Haken, W. (1977), “Every planar map is four colorable. Part I. Discharging”, Illinois J. Math. 21: 429–490.

[17] Appel, K. and Haken, W. (1977), “Every planar map is four colorable. Part II. Reducibility”, Illinois J. Math. 21: 491–567.

[18] Robertson, N.; Sanders, D.; Seymour, P. and Thomas, R. (1997), “The four color theorem”, Journal of Combinatorial

Theory Series B 70: 2–44, doi:10.1006/jctb.1997.1750.

5.9. REFERENCES

65

5.9 References

• Berge, Claude (1958), Théorie des graphes et ses applications, Collection Universitaire de Mathématiques II,

Paris: Dunod. English edition, Wiley 1961; Methuen & Co, New York 1962; Russian, Moscow 1961; Spanish,

Mexico 1962; Roumanian, Bucharest 1969; Chinese, Shanghai 1963; Second printing of the 1962 ﬁrst English

edition, Dover, New York 2001.

• Biggs, N.; Lloyd, E.; Wilson, R. (1986), Graph Theory, 1736–1936, Oxford University Press.

• Bondy, J.A.; Murty, U.S.R. (2008), Graph Theory, Springer, ISBN 978-1-84628-969-9.

• Bondy, Riordan, O.M (2003), Mathematical results on scale-free random graphs in “Handbook of Graphs and

Networks” (S. Bornholdt and H.G. Schuster (eds)), Wiley VCH, Weinheim, 1st ed..

• Chartrand, Gary (1985), Introductory Graph Theory, Dover, ISBN 0-486-24775-9.

• Gibbons, Alan (1985), Algorithmic Graph Theory, Cambridge University Press.

• Reuven Cohen, Shlomo Havlin (2010), Complex Networks: Structure, Robustness and Function, Cambridge

University Press

• Golumbic, Martin (1980), Algorithmic Graph Theory and Perfect Graphs, Academic Press.

• Harary, Frank (1969), Graph Theory, Reading, MA: Addison-Wesley.

• Harary, Frank; Palmer, Edgar M. (1973), Graphical Enumeration, New York, NY: Academic Press.

• Mahadev, N.V.R.; Peled, Uri N. (1995), Threshold Graphs and Related Topics, North-Holland.

• Mark Newman (2010), Networks: An Introduction, Oxford University Press.

5.10 External links

• Graph theory with examples

• Hazewinkel, Michiel, ed. (2001), “Graph theory”, Encyclopedia of Mathematics, Springer, ISBN 978-1-55608010-4

• Graph theory tutorial

• A searchable database of small connected graphs

• Image gallery: graphs at the Wayback Machine (archived February 6, 2006)

• Concise, annotated list of graph theory resources for researchers

• rocs — a graph theory IDE

• The Social Life of Routers — non-technical paper discussing graphs of people and computers

• Graph Theory Software — tools to teach and learn graph theory

• Online books, and library resources in your library and in other libraries about graph theory

5.10.1

Online textbooks

• Phase Transitions in Combinatorial Optimization Problems, Section 3: Introduction to Graphs (2006) by Hartmann and Weigt

• Digraphs: Theory Algorithms and Applications 2007 by Jorgen Bang-Jensen and Gregory Gutin

• Graph Theory, by Reinhard Diestel

Chapter 6

Loop (graph theory)

6

3

2

4

5

1

A graph with a loop on vertex 1

66

6.1. DEGREE

67

In graph theory, a loop (also called a self-loop or a “buckle”) is an edge that connects a vertex to itself. A simple

graph contains no loops.

Depending on the context, a graph or a multigraph may be deﬁned so as to either allow or disallow the presence of

loops (often in concert with allowing or disallowing multiple edges between the same vertices):

• Where graphs are deﬁned so as to allow loops and multiple edges, a graph without loops or multiple edges is

often distinguished from other graphs by calling it a “simple graph”.

• Where graphs are deﬁned so as to disallow loops and multiple edges, a graph that does have loops or multiple edges is often distinguished from the graphs that satisfy these constraints by calling it a “multigraph” or

"pseudograph".

6.1 Degree

For an undirected graph, the degree of a vertex is equal to the number of adjacent vertices.

A special case is a loop, which adds two to the degree. This can be understood by letting each connection of the loop

edge count as its own adjacent vertex. In other words, a vertex with a loop “sees” itself as an adjacent vertex from

both ends of the edge thus adding two, not one, to the degree.

For a directed graph, a loop adds one to the in degree and one to the out degree

6.2 Notes

6.3 References

• Balakrishnan, V. K.; Graph Theory, McGraw-Hill; 1 edition (February 1, 1997). ISBN 0-07-005489-4.

• Bollobás, Béla; Modern Graph Theory, Springer; 1st edition (August 12, 2002). ISBN 0-387-98488-7.

• Diestel, Reinhard; Graph Theory, Springer; 2nd edition (February 18, 2000). ISBN 0-387-98976-5.

• Gross, Jonathon L, and Yellen, Jay; Graph Theory and Its Applications, CRC Press (December 30, 1998).

ISBN 0-8493-3982-0.

• Gross, Jonathon L, and Yellen, Jay; (eds); Handbook of Graph Theory. CRC (December 29, 2003). ISBN

1-58488-090-2.

• Zwillinger, Daniel; CRC Standard Mathematical Tables and Formulae, Chapman & Hall/CRC; 31st edition

(November 27, 2002). ISBN 1-58488-291-3.

6.4 External links

• Black, Paul E. “Self loop”. Dictionary of Algorithms and Data Structures. NIST.

6.5 See also

Loops in Graph Theory

• Cycle (graph theory)

• Graph theory

• Glossary of graph theory

68

CHAPTER 6. LOOP (GRAPH THEORY)

Loops in Topology

• Möbius ladder

• Möbius strip

• Strange loop

• Klein bottle

Chapter 7

Mathematics

This article is about the study of topics such as quantity and structure. For other uses, see Mathematics (disambiguation).

“Math” redirects here. For other uses, see Math (disambiguation).

Euclid (holding calipers), Greek mathematician, 3rd century BC, as imagined by Raphael in this detail from The School of Athens.[1]

Mathematics (from Greek μάθημα máthēma, “knowledge, study, learning”) is the study of topics such as quantity

(numbers),[2] structure,[3] space,[2] and change.[4][5][6] There is a range of views among mathematicians and philosophers as to the exact scope and deﬁnition of mathematics.[7][8]

Mathematicians seek out patterns[9][10] and use them to formulate new conjectures. Mathematicians resolve the truth

69

70

CHAPTER 7. MATHEMATICS

or falsity of conjectures by mathematical proof. When mathematical structures are good models of real phenomena,

then mathematical reasoning can provide insight or predictions about nature. Through the use of abstraction and

logic, mathematics developed from counting, calculation, measurement, and the systematic study of the shapes and

motions of physical objects. Practical mathematics has been a human activity for as far back as written records exist.

The research required to solve mathematical problems can take years or even centuries of sustained inquiry.

Rigorous arguments ﬁrst appeared in Greek mathematics, most notably in Euclid's Elements. Since the pioneering

work of Giuseppe Peano (1858–1932), David Hilbert (1862–1943), and others on axiomatic systems in the late 19th

century, it has become customary to view mathematical research as establishing truth by rigorous deduction from

appropriately chosen axioms and deﬁnitions. Mathematics developed at a relatively slow pace until the Renaissance,

when mathematical innovations interacting with new scientiﬁc discoveries led to a rapid increase in the rate of mathematical discovery that has continued to the present day.[11]

Galileo Galilei (1564–1642) said, “The universe cannot be read until we have learned the language and become

familiar with the characters in which it is written. It is written in mathematical language, and the letters are triangles, circles and other geometrical ﬁgures, without which means it is humanly impossible to comprehend a single

word. Without these, one is wandering about in a dark labyrinth.”[12] Carl Friedrich Gauss (1777–1855) referred to

mathematics as “the Queen of the Sciences”.[13] Benjamin Peirce (1809–1880) called mathematics “the science that

draws necessary conclusions”.[14] David Hilbert said of mathematics: “We are not speaking here of arbitrariness in

any sense. Mathematics is not like a game whose tasks are determined by arbitrarily stipulated rules. Rather, it is a

conceptual system possessing internal necessity that can only be so and by no means otherwise.”[15] Albert Einstein

(1879–1955) stated that “as far as the laws of mathematics refer to reality, they are not certain; and as far as they

are certain, they do not refer to reality.”[16] French mathematician Claire Voisin states “There is creative drive in

mathematics, it’s all about movement trying to express itself.” [17]

Mathematics is used throughout the world as an essential tool in many ﬁelds, including natural science, engineering,

medicine, ﬁnance and the social sciences. Applied mathematics, the branch of mathematics concerned with application of mathematical knowledge to other ﬁelds, inspires and makes use of new mathematical discoveries, which has

led to the development of entirely new mathematical disciplines, such as statistics and game theory. Mathematicians

also engage in pure mathematics, or mathematics for its own sake, without having any application in mind. There is

no clear line separating pure and applied mathematics, and practical applications for what began as pure mathematics

are often discovered.[18]

7.1 History

7.1.1

Evolution

Main article: History of mathematics

The evolution of mathematics can be seen as an ever-increasing series of abstractions. The ﬁrst abstraction, which

is shared by many animals,[19] was probably that of numbers: the realization that a collection of two apples and a

collection of two oranges (for example) have something in common, namely quantity of their members.

As evidenced by tallies found on bone, in addition to recognizing how to count physical objects, prehistoric peoples

may have also recognized how to count abstract quantities, like time – days, seasons, years.[20]

More complex mathematics did not appear until around 3000 BC, when the Babylonians and Egyptians began using

arithmetic, algebra and geometry for taxation and other ﬁnancial calculations, for building and construction, and for

astronomy.[21] The earliest uses of mathematics were in trading, land measurement, painting and weaving patterns

and the recording of time.

In Babylonian mathematics elementary arithmetic (addition, subtraction, multiplication and division) ﬁrst appears in

the archaeological record. Numeracy pre-dated writing and numeral systems have been many and diverse, with the

ﬁrst known written numerals created by Egyptians in Middle Kingdom texts such as the Rhind Mathematical Papyrus.

Between 600 and 300 BC the Ancient Greeks began a systematic study of mathematics in its own right with Greek

mathematics.[22]

Mathematics has since been greatly extended, and there has been a fruitful interaction between mathematics and

science, to the beneﬁt of both. Mathematical discoveries continue to be made today. According to Mikhail B.

Sevryuk, in the January 2006 issue of the Bulletin of the American Mathematical Society, “The number of papers and

7.1. HISTORY

71

Greek mathematician Pythagoras (c. 570 – c. 495 BC), commonly credited with discovering the Pythagorean theorem

books included in the Mathematical Reviews database since 1940 (the ﬁrst year of operation of MR) is now more

than 1.9 million, and more than 75 thousand items are added to the database each year. The overwhelming majority

of works in this ocean contain new mathematical theorems and their proofs.”[23]

72

CHAPTER 7. MATHEMATICS

0

1

2

3

4

5

6

7

8

9

10

11 12 13 14

15 16 17 18 19

Mayan numerals

7.1.2

Etymology

The word mathematics comes from the Greek μάθημα (máthēma), which, in the ancient Greek language, means “that

which is learnt”,[24] “what one gets to know”, hence also “study” and “science”, and in modern Greek just “lesson”. The

word máthēma is derived from μανθάνω (manthano), while the modern Greek equivalent is μαθαίνω (mathaino),

both of which mean “to learn”. In Greece, the word for “mathematics” came to have the narrower and more technical

meaning “mathematical study” even in Classical times.[25] Its adjective is μαθηματικός (mathēmatikós), meaning

“related to learning” or “studious”, which likewise further came to mean “mathematical”. In particular, μαθηματικὴ

τέχνη (mathēmatikḗ tékhnē), Latin: ars mathematica, meant “the mathematical art”.

In Latin, and in English until around 1700, the term mathematics more commonly meant “astrology” (or sometimes

“astronomy”) rather than “mathematics"; the meaning gradually changed to its present one from about 1500 to 1800.

This has resulted in several mistranslations: a particularly notorious one is Saint Augustine's warning that Christians should beware of mathematici meaning astrologers, which is sometimes mistranslated as a condemnation of

7.2. DEFINITIONS OF MATHEMATICS

73

mathematicians.[26]

The apparent plural form in English, like the French plural form les mathématiques (and the less commonly used

singular derivative la mathématique), goes back to the Latin neuter plural mathematica (Cicero), based on the Greek

plural τα μαθηματικά (ta mathēmatiká), used by Aristotle (384–322 BC), and meaning roughly “all things mathematical"; although it is plausible that English borrowed only the adjective mathematic(al) and formed the noun

mathematics anew, after the pattern of physics and metaphysics, which were inherited from the Greek.[27] In English,

the noun mathematics takes singular verb forms. It is often shortened to maths or, in English-speaking North America,

math.[28]

7.2 Deﬁnitions of mathematics

Main article: Deﬁnitions of mathematics

Aristotle deﬁned mathematics as “the science of quantity”, and this deﬁnition prevailed until the 18th century.[29]

Starting in the 19th century, when the study of mathematics increased in rigor and began to address abstract topics

such as group theory and projective geometry, which have no clear-cut relation to quantity and measurement, mathematicians and philosophers began to propose a variety of new deﬁnitions.[30] Some of these deﬁnitions emphasize the

deductive character of much of mathematics, some emphasize its abstractness, some emphasize certain topics within

mathematics. Today, no consensus on the deﬁnition of mathematics prevails, even among professionals.[7] There

is not even consensus on whether mathematics is an art or a science.[8] A great many professional mathematicians

take no interest in a deﬁnition of mathematics, or consider it undeﬁnable.[7] Some just say, “Mathematics is what

mathematicians do.”[7]

Three leading types of deﬁnition of mathematics are called logicist, intuitionist, and formalist, each reﬂecting a

diﬀerent philosophical school of thought.[31] All have severe problems, none has widespread acceptance, and no

reconciliation seems possible.[31]

An early deﬁnition of mathematics in terms of logic was Benjamin Peirce's “the science that draws necessary conclusions” (1870).[32] In the Principia Mathematica, Bertrand Russell and Alfred North Whitehead advanced the philosophical program known as logicism, and attempted to prove that all mathematical concepts, statements, and principles can be deﬁned and proven entirely in terms of symbolic logic. A logicist deﬁnition of mathematics is Russell’s

“All Mathematics is Symbolic Logic” (1903).[33]

Intuitionist deﬁnitions, developing from the philosophy of mathematician L.E.J. Brouwer, identify mathematics with

certain mental phenomena. An example of an intuitionist deﬁnition is “Mathematics is the mental activity which

consists in carrying out constructs one after the other.”[31] A peculiarity of intuitionism is that it rejects some mathematical ideas considered valid according to other deﬁnitions. In particular, while other philosophies of mathematics

allow objects that can be proven to exist even though they cannot be constructed, intuitionism allows only mathematical objects that one can actually construct.

Formalist deﬁnitions identify mathematics with its symbols and the rules for operating on them. Haskell Curry deﬁned

mathematics simply as “the science of formal systems”.[34] A formal system is a set of symbols, or tokens, and some

rules telling how the tokens may be combined into formulas. In formal systems, the word axiom has a special meaning,

diﬀerent from the ordinary meaning of “a self-evident truth”. In formal systems, an axiom is a combination of tokens

that is included in a given formal system without needing to be derived using the rules of the system.

7.2.1

Mathematics as science

Gauss referred to mathematics as “the Queen of the Sciences”.[13] In the original Latin Regina Scientiarum, as well

as in German Königin der Wissenschaften, the word corresponding to science means a “ﬁeld of knowledge”, and

this was the original meaning of “science” in English, also; mathematics is in this sense a ﬁeld of knowledge. The

specialization restricting the meaning of “science” to natural science follows the rise of Baconian science, which

contrasted “natural science” to scholasticism, the Aristotelean method of inquiring from ﬁrst principles. The role

of empirical experimentation and observation is negligible in mathematics, compared to natural sciences such as

psychology, biology, or physics. Albert Einstein stated that “as far as the laws of mathematics refer to reality, they

are not certain; and as far as they are certain, they do not refer to reality.”[16] More recently, Marcus du Sautoy has

called mathematics “the Queen of Science ... the main driving force behind scientiﬁc discovery”.[35]

Many philosophers believe that mathematics is not experimentally falsiﬁable, and thus not a science according to the

74

CHAPTER 7. MATHEMATICS

Leonardo Fibonacci, the Italian mathematician who established the Hindu–Arabic numeral system to the Western World

deﬁnition of Karl Popper.[36] However, in the 1930s Gödel’s incompleteness theorems convinced many mathematicians that mathematics cannot be reduced to logic alone, and Karl Popper concluded that “most mathematical theories

are, like those of physics and biology, hypothetico-deductive: pure mathematics therefore turns out to be much closer

to the natural sciences whose hypotheses are conjectures, than it seemed even recently.”[37] Other thinkers, notably

Imre Lakatos, have applied a version of falsiﬁcationism to mathematics itself.

An alternative view is that certain scientiﬁc ﬁelds (such as theoretical physics) are mathematics with axioms that are

7.2. DEFINITIONS OF MATHEMATICS

75

Carl Friedrich Gauss, known as the prince of mathematicians

intended to correspond to reality. The theoretical physicist J.M. Ziman proposed that science is public knowledge,

and thus includes mathematics.[38] Mathematics shares much in common with many ﬁelds in the physical sciences,

notably the exploration of the logical consequences of assumptions. Intuition and experimentation also play a role in

the formulation of conjectures in both mathematics and the (other) sciences. Experimental mathematics continues to

grow in importance within mathematics, and computation and simulation are playing an increasing role in both the

sciences and mathematics.

The opinions of mathematicians on this matter are varied. Many mathematicians feel that to call their area a science

is to downplay the importance of its aesthetic side, and its history in the traditional seven liberal arts; others feel that

76

CHAPTER 7. MATHEMATICS

to ignore its connection to the sciences is to turn a blind eye to the fact that the interface between mathematics and

its applications in science and engineering has driven much development in mathematics. One way this diﬀerence of

viewpoint plays out is in the philosophical debate as to whether mathematics is created (as in art) or discovered (as

in science). It is common to see universities divided into sections that include a division of Science and Mathematics,

indicating that the ﬁelds are seen as being allied but that they do not coincide. In practice, mathematicians are typically

grouped with scientists at the gross level but separated at ﬁner levels. This is one of many issues considered in the

philosophy of mathematics.

7.3 Inspiration, pure and applied mathematics, and aesthetics

Main article: Mathematical beauty

Isaac Newton (left) and Gottfried Wilhelm Leibniz (right), developers of inﬁnitesimal calculus

Mathematics arises from many diﬀerent kinds of problems. At ﬁrst these were found in commerce, land measurement,

architecture and later astronomy; today, all sciences suggest problems studied by mathematicians, and many problems

arise within mathematics itself. For example, the physicist Richard Feynman invented the path integral formulation

of quantum mechanics using a combination of mathematical reasoning and physical insight, and today’s string theory,

a still-developing scientiﬁc theory which attempts to unify the four fundamental forces of nature, continues to inspire

new mathematics.[39]

Some mathematics is relevant only in the area that inspired it, and is applied to solve further problems in that area.

But often mathematics inspired by one area proves useful in many areas, and joins the general stock of mathematical

concepts. A distinction is often made between pure mathematics and applied mathematics. However pure mathematics topics often turn out to have applications, e.g. number theory in cryptography. This remarkable fact, that even the

“purest” mathematics often turns out to have practical applications, is what Eugene Wigner has called "the unreason-

7.4. NOTATION, LANGUAGE, AND RIGOR

77

able eﬀectiveness of mathematics".[40] As in most areas of study, the explosion of knowledge in the scientiﬁc age has

led to specialization: there are now hundreds of specialized areas in mathematics and the latest Mathematics Subject

Classiﬁcation runs to 46 pages.[41] Several areas of applied mathematics have merged with related traditions outside

of mathematics and become disciplines in their own right, including statistics, operations research, and computer

science.

For those who are mathematically inclined, there is often a deﬁnite aesthetic aspect to much of mathematics. Many

mathematicians talk about the elegance of mathematics, its intrinsic aesthetics and inner beauty. Simplicity and

generality are valued. There is beauty in a simple and elegant proof, such as Euclid's proof that there are inﬁnitely

many prime numbers, and in an elegant numerical method that speeds calculation, such as the fast Fourier transform.

G.H. Hardy in A Mathematician’s Apology expressed the belief that these aesthetic considerations are, in themselves,

suﬃcient to justify the study of pure mathematics. He identiﬁed criteria such as signiﬁcance, unexpectedness, inevitability, and economy as factors that contribute to a mathematical aesthetic.[42] Mathematicians often strive to ﬁnd

proofs that are particularly elegant, proofs from “The Book” of God according to Paul Erdős.[43][44] The popularity

of recreational mathematics is another sign of the pleasure many ﬁnd in solving mathematical questions.

7.4 Notation, language, and rigor

Main article: Mathematical notation

Most of the mathematical notation in use today was not invented until the 16th century.[45] Before that, mathematics was written out in words, a painstaking process that limited mathematical discovery.[46] Euler (1707–1783)

was responsible for many of the notations in use today. Modern notation makes mathematics much easier for the

professional, but beginners often ﬁnd it daunting. It is extremely compressed: a few symbols contain a great deal

of information. Like musical notation, modern mathematical notation has a strict syntax (which to a limited extent

varies from author to author and from discipline to discipline) and encodes information that would be diﬃcult to

write in any other way.

Mathematical language can be diﬃcult to understand for beginners. Words such as or and only have more precise

meanings than in everyday speech. Moreover, words such as open and ﬁeld have been given specialized mathematical

meanings. Technical terms such as homeomorphism and integrable have precise meanings in mathematics. Additionally, shorthand phrases such as iﬀ for "if and only if" belong to mathematical jargon. There is a reason for special

notation and technical vocabulary: mathematics requires more precision than everyday speech. Mathematicians refer

to this precision of language and logic as “rigor”.

Mathematical proof is fundamentally a matter of rigor. Mathematicians want their theorems to follow from axioms

by means of systematic reasoning. This is to avoid mistaken "theorems", based on fallible intuitions, of which many

instances have occurred in the history of the subject.[47] The level of rigor expected in mathematics has varied over

time: the Greeks expected detailed arguments, but at the time of Isaac Newton the methods employed were less

rigorous. Problems inherent in the deﬁnitions used by Newton would lead to a resurgence of careful analysis and

formal proof in the 19th century. Misunderstanding the rigor is a cause for some of the common misconceptions

of mathematics. Today, mathematicians continue to argue among themselves about computer-assisted proofs. Since

large computations are hard to verify, such proofs may not be suﬃciently rigorous.[48]

Axioms in traditional thought were “self-evident truths”, but that conception is problematic.[49] At a formal level,

an axiom is just a string of symbols, which has an intrinsic meaning only in the context of all derivable formulas

of an axiomatic system. It was the goal of Hilbert’s program to put all of mathematics on a ﬁrm axiomatic basis,

but according to Gödel’s incompleteness theorem every (suﬃciently powerful) axiomatic system has undecidable

formulas; and so a ﬁnal axiomatization of mathematics is impossible. Nonetheless mathematics is often imagined to

be (as far as its formal content) nothing but set theory in some axiomatization, in the sense that every mathematical

statement or proof could be cast into formulas within set theory.[50]

7.5 Fields of mathematics

See also: Areas of mathematics and Glossary of areas of mathematics

Mathematics can, broadly speaking, be subdivided into the study of quantity, structure, space, and change (i.e.

arithmetic, algebra, geometry, and analysis). In addition to these main concerns, there are also subdivisions dedicated

to exploring links from the heart of mathematics to other ﬁelds: to logic, to set theory (foundations), to the empirical

mathematics of the various sciences (applied mathematics), and more recently to the rigorous study of uncertainty.

78

CHAPTER 7. MATHEMATICS

Leonhard Euler, who created and popularized much of the mathematical notation used today

7.5.1

Foundations and philosophy

In order to clarify the foundations of mathematics, the ﬁelds of mathematical logic and set theory were developed.

Mathematical logic includes the mathematical study of logic and the applications of formal logic to other areas of

mathematics; set theory is the branch of mathematics that studies sets or collections of objects. Category theory,

which deals in an abstract way with mathematical structures and relationships between them, is still in development.

The phrase “crisis of foundations” describes the search for a rigorous foundation for mathematics that took place from

approximately 1900 to 1930.[51] Some disagreement about the foundations of mathematics continues to the present

day. The crisis of foundations was stimulated by a number of controversies at the time, including the controversy

over Cantor’s set theory and the Brouwer–Hilbert controversy.

7.5. FIELDS OF MATHEMATICS

79

An abacus, a simple calculating tool used since ancient times

Mathematical logic is concerned with setting mathematics within a rigorous axiomatic framework, and studying the

implications of such a framework. As such, it is home to Gödel’s incompleteness theorems which (informally) imply

that any eﬀective formal system that contains basic arithmetic, if sound (meaning that all theorems that can be proven

are true), is necessarily incomplete (meaning that there are true theorems which cannot be proved in that system).

Whatever ﬁnite collection of number-theoretical axioms is taken as a foundation, Gödel showed how to construct a

formal statement that is a true number-theoretical fact, but which does not follow from those axioms. Therefore, no

formal system is a complete axiomatization of full number theory. Modern logic is divided into recursion theory,

model theory, and proof theory, and is closely linked to theoretical computer science, as well as to category theory.

Theoretical computer science includes computability theory, computational complexity theory, and information theory. Computability theory examines the limitations of various theoretical models of the computer, including the most

well-known model – the Turing machine. Complexity theory is the study of tractability by computer; some problems,

although theoretically solvable by computer, are so expensive in terms of time or space that solving them is likely to

remain practically unfeasible, even with the rapid advancement of computer hardware. A famous problem is the "P =

NP?" problem, one of the Millennium Prize Problems.[52] Finally, information theory is concerned with the amount

of data that can be stored on a given medium, and hence deals with concepts such as compression and entropy.

7.5.2

Pure mathematics

Quantity

The study of quantity starts with numbers, ﬁrst the familiar natural numbers and integers (“whole numbers”) and

arithmetical operations on them, which are characterized in arithmetic. The deeper properties of integers are studied

in number theory, from which come such popular results as Fermat’s Last Theorem. The twin prime conjecture and

Goldbach’s conjecture are two unsolved problems in number theory.

As the number system is further developed, the integers are recognized as a subset of the rational numbers ("fractions").

These, in turn, are contained within the real numbers, which are used to represent continuous quantities. Real numbers are generalized to complex numbers. These are the ﬁrst steps of a hierarchy of numbers that goes on to include

quaternions and octonions. Consideration of the natural numbers also leads to the transﬁnite numbers, which formalize the concept of "inﬁnity". Another area of study is size, which leads to the cardinal numbers and then to another

conception of inﬁnity: the aleph numbers, which allow meaningful comparison of the size of inﬁnitely large sets.

80

CHAPTER 7. MATHEMATICS

Structure

Many mathematical objects, such as sets of numbers and functions, exhibit internal structure as a consequence of

operations or relations that are deﬁned on the set. Mathematics then studies properties of those sets that can be

expressed in terms of that structure; for instance number theory studies properties of the set of integers that can be

expressed in terms of arithmetic operations. Moreover, it frequently happens that diﬀerent such structured sets (or

structures) exhibit similar properties, which makes it possible, by a further step of abstraction, to state axioms for a

class of structures, and then study at once the whole class of structures satisfying these axioms. Thus one can study

groups, rings, ﬁelds and other abstract systems; together such studies (for structures deﬁned by algebraic operations)

constitute the domain of abstract algebra.

By its great generality, abstract algebra can often be applied to seemingly unrelated problems; for instance a number of ancient problems concerning compass and straightedge constructions were ﬁnally solved using Galois theory,

which involves ﬁeld theory and group theory. Another example of an algebraic theory is linear algebra, which is

the general study of vector spaces, whose elements called vectors have both quantity and direction, and can be used

to model (relations between) points in space. This is one example of the phenomenon that the originally unrelated

areas of geometry and algebra have very strong interactions in modern mathematics. Combinatorics studies ways of

enumerating the number of objects that ﬁt a given structure.

Space

The study of space originates with geometry – in particular, Euclidean geometry. Trigonometry is the branch of

mathematics that deals with relationships between the sides and the angles of triangles and with the trigonometric

functions; it combines space and numbers, and encompasses the well-known Pythagorean theorem. The modern

study of space generalizes these ideas to include higher-dimensional geometry, non-Euclidean geometries (which

play a central role in general relativity) and topology. Quantity and space both play a role in analytic geometry,

diﬀerential geometry, and algebraic geometry. Convex and discrete geometry were developed to solve problems in

number theory and functional analysis but now are pursued with an eye on applications in optimization and computer

science. Within diﬀerential geometry are the concepts of ﬁber bundles and calculus on manifolds, in particular,

vector and tensor calculus. Within algebraic geometry is the description of geometric objects as solution sets of

polynomial equations, combining the concepts of quantity and space, and also the study of topological groups, which

combine structure and space. Lie groups are used to study space, structure, and change. Topology in all its many

ramiﬁcations may have been the greatest growth area in 20th-century mathematics; it includes point-set topology,

set-theoretic topology, algebraic topology and diﬀerential topology. In particular, instances of modern day topology

are metrizability theory, axiomatic set theory, homotopy theory, and Morse theory. Topology also includes the now

solved Poincaré conjecture, and the still unsolved areas of the Hodge conjecture. Other results in geometry and

topology, including the four color theorem and Kepler conjecture, have been proved only with the help of computers.

Change

Understanding and describing change is a common theme in the natural sciences, and calculus was developed as a

powerful tool to investigate it. Functions arise here, as a central concept describing a changing quantity. The rigorous

study of real numbers and functions of a real variable is known as real analysis, with complex analysis the equivalent

ﬁeld for the complex numbers. Functional analysis focuses attention on (typically inﬁnite-dimensional) spaces of

functions. One of many applications of functional analysis is quantum mechanics. Many problems lead naturally

to relationships between a quantity and its rate of change, and these are studied as diﬀerential equations. Many

phenomena in nature can be described by dynamical systems; chaos theory makes precise the ways in which many of

these systems exhibit unpredictable yet still deterministic behavior.

7.6. MATHEMATICAL AWARDS

7.5.3

81

Applied mathematics

Applied mathematics concerns itself with mathematical methods that are typically used in science, engineering, business, and industry. Thus, “applied mathematics” is a mathematical science with specialized knowledge. The term

applied mathematics also describes the professional specialty in which mathematicians work on practical problems;

as a profession focused on practical problems, applied mathematics focuses on the “formulation, study, and use of

mathematical models” in science, engineering, and other areas of mathematical practice.

In the past, practical applications have motivated the development of mathematical theories, which then became the

subject of study in pure mathematics, where mathematics is developed primarily for its own sake. Thus, the activity

of applied mathematics is vitally connected with research in pure mathematics.

Statistics and other decision sciences

Applied mathematics has signiﬁcant overlap with the discipline of statistics, whose theory is formulated mathematically, especially with probability theory. Statisticians (working as part of a research project) “create data that makes

sense” with random sampling and with randomized experiments;[53] the design of a statistical sample or experiment

speciﬁes the analysis of the data (before the data be available). When reconsidering data from experiments and

samples or when analyzing data from observational studies, statisticians “make sense of the data” using the art of

modelling and the theory of inference – with model selection and estimation; the estimated models and consequential

predictions should be tested on new data.[54]

Statistical theory studies decision problems such as minimizing the risk (expected loss) of a statistical action, such as

using a procedure in, for example, parameter estimation, hypothesis testing, and selecting the best. In these traditional

areas of mathematical statistics, a statistical-decision problem is formulated by minimizing an objective function, like

expected loss or cost, under speciﬁc constraints: For example, designing a survey often involves minimizing the

cost of estimating a population mean with a given level of conﬁdence.[55] Because of its use of optimization, the

mathematical theory of statistics shares concerns with other decision sciences, such as operations research, control

theory, and mathematical economics.[56]

Computational mathematics

Computational mathematics proposes and studies methods for solving mathematical problems that are typically too

large for human numerical capacity. Numerical analysis studies methods for problems in analysis using functional

analysis and approximation theory; numerical analysis includes the study of approximation and discretization broadly

with special concern for rounding errors. Numerical analysis and, more broadly, scientiﬁc computing also study nonanalytic topics of mathematical science, especially algorithmic matrix and graph theory. Other areas of computational

mathematics include computer algebra and symbolic computation.

7.6 Mathematical awards

Arguably the most prestigious award in mathematics is the Fields Medal,[57][58] established in 1936 and now awarded

every four years. The Fields Medal is often considered a mathematical equivalent to the Nobel Prize.

The Wolf Prize in Mathematics, instituted in 1978, recognizes lifetime achievement, and another major international

award, the Abel Prize, was introduced in 2003. The Chern Medal was introduced in 2010 to recognize lifetime

achievement. These accolades are awarded in recognition of a particular body of work, which may be innovational,

or provide a solution to an outstanding problem in an established ﬁeld.

A famous list of 23 open problems, called "Hilbert’s problems", was compiled in 1900 by German mathematician

David Hilbert. This list achieved great celebrity among mathematicians, and at least nine of the problems have now

been solved. A new list of seven important problems, titled the "Millennium Prize Problems", was published in 2000.

A solution to each of these problems carries a $1 million reward, and only one (the Riemann hypothesis) is duplicated

in Hilbert’s problems.

82

CHAPTER 7. MATHEMATICS

7.7 See also

Main article: Lists of mathematics topics

• Mathematics and art

• Mathematics education

• Relationship between mathematics and physics

• STEM ﬁelds

7.8 Notes

[1] No likeness or description of Euclid’s physical appearance made during his lifetime survived antiquity. Therefore, Euclid’s

depiction in works of art depends on the artist’s imagination (see Euclid).

[2] “mathematics, n.". Oxford English Dictionary. Oxford University Press. 2012. Retrieved June 16, 2012. The science

of space, number, quantity, and arrangement, whose methods involve logical reasoning and usually the use of symbolic

notation, and which includes geometry, arithmetic, algebra, and analysis.

[3] Kneebone, G.T. (1963). Mathematical Logic and the Foundations of Mathematics: An Introductory Survey. Dover. pp. 4.

ISBN 0-486-41712-3. Mathematics ... is simply the study of abstract structures, or formal patterns of connectedness.

[4] LaTorre, Donald R., John W. Kenelly, Iris B. Reed, Laurel R. Carpenter, and Cynthia R Harris (2011). Calculus Concepts:

An Informal Approach to the Mathematics of Change. Cengage Learning. pp. 2. ISBN 1-4390-4957-2. Calculus is the

study of change—how things change, and how quickly they change.

[5] Ramana (2007). Applied Mathematics. Tata McGraw–Hill Education. p. 2.10. ISBN 0-07-066753-5. The mathematical

study of change, motion, growth or decay is calculus.

[6] Ziegler, Günter M. (2011). “What Is Mathematics?". An Invitation to Mathematics: From Competitions to Research.

Springer. pp. 7. ISBN 3-642-19532-6.

[7] Mura, Roberta (Dec 1993). “Images of Mathematics Held by University Teachers of Mathematical Sciences”. Educational

Studies in Mathematics 25 (4): 375–385.

[8] Tobies, Renate and Helmut Neunzert (2012). Iris Runge: A Life at the Crossroads of Mathematics, Science, and Industry.

Springer. pp. 9. ISBN 3-0348-0229-3. It is ﬁrst necessary to ask what is meant by mathematics in general. Illustrious

scholars have debated this matter until they were blue in the face, and yet no consensus has been reached about whether

mathematics is a natural science, a branch of the humanities, or an art form.

[9] Steen, L.A. (April 29, 1988). The Science of Patterns Science, 240: 611–616. And summarized at Association for Supervision and Curriculum Development, www.ascd.org.

[10] Devlin, Keith, Mathematics: The Science of Patterns: The Search for Order in Life, Mind and the Universe (Scientiﬁc

American Paperback Library) 1996, ISBN 978-0-7167-5047-5

[11] Eves

[12] Marcus du Sautoy, A Brief History of Mathematics: 1. Newton and Leibniz, BBC Radio 4, September 27, 2010.

[13] Waltershausen

[14] Peirce, p. 97.

[15] Hilbert, D. (1919–20), Natur und Mathematisches Erkennen: Vorlesungen, gehalten 1919–1920 in Göttingen. Nach der

Ausarbeitung von Paul Bernays (Edited and with an English introduction by David E. Rowe), Basel, Birkhäuser (1992).

[16] Einstein, p. 28. The quote is Einstein’s answer to the question: “how can it be that mathematics, being after all a product

of human thought which is independent of experience, is so admirably appropriate to the objects of reality?" He, too, is

concerned with The Unreasonable Eﬀectiveness of Mathematics in the Natural Sciences.

[17] “Claire Voisin, Artist of the Abstract”. .cnrs.fr. Retrieved October 13, 2013.

7.8. NOTES

83

[18] Peterson

[19] Dehaene, Stanislas; Dehaene-Lambertz, Ghislaine; Cohen, Laurent (Aug 1998). “Abstract representations of numbers in

the animal and human brain”. Trends in Neuroscience 21 (8): 355–361. doi:10.1016/S0166-2236(98)01263-6. PMID

9720604.

[20] See, for example, Raymond L. Wilder, Evolution of Mathematical Concepts; an Elementary Study, passim

[21] Kline 1990, Chapter 1.

[22] "A History of Greek Mathematics: From Thales to Euclid". Thomas Little Heath (1981). ISBN 0-486-24073-8

[23] Sevryuk

[24] “mathematic”. Online Etymology Dictionary.

[25] Both senses can be found in Plato. μαθηματική. Liddell, Henry George; Scott, Robert; A Greek–English Lexicon at the

Perseus Project

[26] Cipra, Barry (1982). “St. Augustine v. The Mathematicians”. osu.edu. Ohio State University Mathematics department.

Retrieved July 14, 2014.

[27] The Oxford Dictionary of English Etymology, Oxford English Dictionary, sub “mathematics”, “mathematic”, “mathematics”

[28] “maths, n." and “math, n.3". Oxford English Dictionary, on-line version (2012).

[29] James Franklin, “Aristotelian Realism” in Philosophy of Mathematics”, ed. A.D. Irvine, p. 104. Elsevier (2009).

[30] Cajori, Florian (1893). A History of Mathematics. American Mathematical Society (1991 reprint). pp. 285–6. ISBN

0-8218-2102-4.

[31] Snapper, Ernst (September 1979). “The Three Crises in Mathematics: Logicism, Intuitionism, and Formalism”. Mathematics Magazine 52 (4): 207–16. doi:10.2307/2689412. JSTOR 2689412.

[32] Peirce, Benjamin (1882). Linear Associative Algebra. p. 1.

[33] Bertrand Russell, The Principles of Mathematics, p. 5. University Press, Cambridge (1903)

[34] Curry, Haskell (1951). Outlines of a Formalist Philosophy of Mathematics. Elsevier. pp. 56. ISBN 0-444-53368-0.

[35] Marcus du Sautoy, A Brief History of Mathematics: 10. Nicolas Bourbaki, BBC Radio 4, October 1, 2010.

[36] Shasha, Dennis Elliot; Lazere, Cathy A. (1998). Out of Their Minds: The Lives and Discoveries of 15 Great Computer

Scientists. Springer. p. 228.

[37] Popper 1995, p. 56

[38] Ziman

[39] Johnson, Gerald W.; Lapidus, Michel L. (2002). The Feynman Integral and Feynman’s Operational Calculus. Oxford

University Press. ISBN 0-8218-2413-9.

[40] Wigner, Eugene (1960). “The Unreasonable Eﬀectiveness of Mathematics in the Natural Sciences”. Communications on

Pure and Applied Mathematics 13 (1): 1–14. doi:10.1002/cpa.3160130102.

[41] “Mathematics Subject Classiﬁcation 2010” (PDF). Retrieved November 9, 2010.

[42] Hardy, G.H. (1940). A Mathematician’s Apology. Cambridge University Press. ISBN 0-521-42706-1.

[43] Gold, Bonnie; Simons, Rogers A. (2008). Proof and Other Dilemmas: Mathematics and Philosophy. MAA.

[44] Aigner, Martin; Ziegler, Günter M. (2001). Proofs from The Book. Springer. ISBN 3-540-40460-0.

[45] “Earliest Uses of Various Mathematical Symbols”. Retrieved September 14, 2014.

[46] Kline, p. 140, on Diophantus; p. 261, on Vieta.

[47] See false proof for simple examples of what can go wrong in a formal proof.

[48] Ivars Peterson, The Mathematical Tourist, Freeman, 1988, ISBN 0-7167-1953-3. p. 4 “A few complain that the computer

program can't be veriﬁed properly”, (in reference to the Haken–Apple proof of the Four Color Theorem).

84

CHAPTER 7. MATHEMATICS

[49] " The method of “postulating” what we want has many advantages; they are the same as the advantages of theft over honest

toil.” Bertrand Russell (1919), Introduction to Mathematical Philosophy, New York and London, p 71.

[50] Patrick Suppes, Axiomatic Set Theory, Dover, 1972, ISBN 0-486-61630-4. p. 1, “Among the many branches of modern

mathematics set theory occupies a unique place: with a few rare exceptions the entities which are studied and analyzed in

mathematics may be regarded as certain particular sets or classes of objects.”

[51] Luke Howard Hodgkin & Luke Hodgkin, A History of Mathematics, Oxford University Press, 2005.

[52] Clay Mathematics Institute, P=NP, claymath.org

[53] Rao, C.R. (1997) Statistics and Truth: Putting Chance to Work, World Scientiﬁc. ISBN 981-02-3111-3

[54] Like other mathematical sciences such as physics and computer science, statistics is an autonomous discipline rather than

a branch of applied mathematics. Like research physicists and computer scientists, research statisticians are mathematical

scientists. Many statisticians have a degree in mathematics, and some statisticians are also mathematicians.

[55] Rao, C.R. (1981). “Foreword”. In Arthanari, T.S.; Dodge, Yadolah. Mathematical programming in statistics. Wiley Series

in Probability and Mathematical Statistics. New York: Wiley. pp. vii–viii. ISBN 0-471-08073-X. MR 607328.

[56] Whittle (1994, pp. 10–11 and 14–18): Whittle, Peter (1994). “Almost home”. In Kelly, F.P. Probability, statistics and

optimisation: A Tribute to Peter Whittle (previously “A realised path: The Cambridge Statistical Laboratory upto 1993

(revised 2002)" ed.). Chichester: John Wiley. pp. 1–28. ISBN 0-471-94829-2.

[57] "The Fields Medal is now indisputably the best known and most inﬂuential award in mathematics." Monastyrsky

[58] Riehm

7.9 References

• Courant, Richard and H. Robbins, What Is Mathematics? : An Elementary Approach to Ideas and Methods,

Oxford University Press, USA; 2 edition (July 18, 1996). ISBN 0-19-510519-2.

• Einstein, Albert (1923). Sidelights on Relativity: I. Ether and relativity. II. Geometry and experience (translated

by G.B. Jeﬀery, D.Sc., and W. Perrett, Ph.D). E.P. Dutton & Co., New York.

• du Sautoy, Marcus, A Brief History of Mathematics, BBC Radio 4 (2010).

• Eves, Howard, An Introduction to the History of Mathematics, Sixth Edition, Saunders, 1990, ISBN 0-03029558-0.

• Kline, Morris, Mathematical Thought from Ancient to Modern Times, Oxford University Press, USA; Paperback

edition (March 1, 1990). ISBN 0-19-506135-7.

• Monastyrsky, Michael (2001). “Some Trends in Modern Mathematics and the Fields Medal” (PDF). Canadian

Mathematical Society. Retrieved July 28, 2006.

• Oxford English Dictionary, second edition, ed. John Simpson and Edmund Weiner, Clarendon Press, 1989,

ISBN 0-19-861186-2.

• The Oxford Dictionary of English Etymology, 1983 reprint. ISBN 0-19-861112-9.

• Pappas, Theoni, The Joy Of Mathematics, Wide World Publishing; Revised edition (June 1989). ISBN 0933174-65-9.

• Peirce, Benjamin (1881). Peirce, Charles Sanders, ed. “Linear associative algebra”. American Journal of

Mathematics (Corrected, expanded, and annotated revision with an 1875 paper by B. Peirce and annotations by his son, C.S. Peirce, of the 1872 lithograph ed.) (Johns Hopkins University) 4 (1–4): 97–229.

doi:10.2307/2369153. JSTOR 2369153. Corrected, expanded, and annotated revision with an 1875 paper

by B. Peirce and annotations by his son, C. S. Peirce, of the 1872 lithograph ed. Google Eprint and as an

extract, D. Van Nostrand, 1882, Google Eprint..

• Peterson, Ivars, Mathematical Tourist, New and Updated Snapshots of Modern Mathematics, Owl Books, 2001,

ISBN 0-8050-7159-8.

7.10. FURTHER READING

85

• Popper, Karl R. (1995). “On knowledge”. In Search of a Better World: Lectures and Essays from Thirty Years.

Routledge. ISBN 0-415-13548-6.

• Riehm, Carl (August 2002). “The Early History of the Fields Medal” (PDF). Notices of the AMS (AMS) 49

(7): 778–782.

• Sevryuk, Mikhail B. (January 2006). “Book Reviews” (PDF). Bulletin of the American Mathematical Society

43 (1): 101–109. doi:10.1090/S0273-0979-05-01069-4. Retrieved June 24, 2006.

• Waltershausen, Wolfgang Sartorius von (1965) [ﬁrst published 1856]. Gauss zum Gedächtniss. Sändig Reprint

Verlag H. R. Wohlwend. ASIN B0000BN5SQ. ISBN 3-253-01702-8. ASIN 3253017028.

7.10 Further reading

• Benson, Donald C., The Moment of Proof: Mathematical Epiphanies, Oxford University Press, USA; New Ed

edition (December 14, 2000). ISBN 0-19-513919-4.

• Boyer, Carl B., A History of Mathematics, Wiley; 2nd edition, revised by Uta C. Merzbach, (March 6, 1991).

ISBN 0-471-54397-7.—A concise history of mathematics from the Concept of Number to contemporary

Mathematics.

• Davis, Philip J. and Hersh, Reuben, The Mathematical Experience. Mariner Books; Reprint edition (January

14, 1999). ISBN 0-395-92968-7.

• Gullberg, Jan, Mathematics – From the Birth of Numbers. W. W. Norton & Company; 1st edition (October

1997). ISBN 0-393-04002-X.

• Hazewinkel, Michiel (ed.), Encyclopaedia of Mathematics. Kluwer Academic Publishers 2000. – A translated

and expanded version of a Soviet mathematics encyclopedia, in ten (expensive) volumes, the most complete

and authoritative work available. Also in paperback and on CD-ROM, and online.

• Jourdain, Philip E. B., The Nature of Mathematics, in The World of Mathematics, James R. Newman, editor,

Dover Publications, 2003, ISBN 0-486-43268-8.

• Maier, Annaliese, At the Threshold of Exact Science: Selected Writings of Annaliese Maier on Late Medieval

Natural Philosophy, edited by Steven Sargent, Philadelphia: University of Pennsylvania Press, 1982.

7.11 External links

• Mathematics at Encyclopædia Britannica

• Mathematics on In Our Time at the BBC. (listen now)

• Free Mathematics books Free Mathematics books collection.

• Encyclopaedia of Mathematics online encyclopaedia from Springer, Graduate-level reference work with over

8,000 entries, illuminating nearly 50,000 notions in mathematics.

• HyperMath site at Georgia State University

• FreeScience Library The mathematics section of FreeScience library

• Rusin, Dave: The Mathematical Atlas. A guided tour through the various branches of modern mathematics.

(Can also be found at NIU.edu.)

• Polyanin, Andrei: EqWorld: The World of Mathematical Equations. An online resource focusing on algebraic,

ordinary diﬀerential, partial diﬀerential (mathematical physics), integral, and other mathematical equations.

• Cain, George: Online Mathematics Textbooks available free online.

• Tricki, Wiki-style site that is intended to develop into a large store of useful mathematical problem-solving

techniques.

86

CHAPTER 7. MATHEMATICS

• Mathematical Structures, list information about classes of mathematical structures.

• Mathematician Biographies. The MacTutor History of Mathematics archive Extensive history and quotes from

all famous mathematicians.

• Metamath. A site and a language, that formalize mathematics from its foundations.

• Nrich, a prize-winning site for students from age ﬁve from Cambridge University

• Open Problem Garden, a wiki of open problems in mathematics

• Planet Math. An online mathematics encyclopedia under construction, focusing on modern mathematics. Uses

the Attribution-ShareAlike license, allowing article exchange with Wikipedia. Uses TeX markup.

• Some mathematics applets, at MIT

• Weisstein, Eric et al.: MathWorld: World of Mathematics. An online encyclopedia of mathematics.

• Patrick Jones’ Video Tutorials on Mathematics

• Citizendium: Theory (mathematics).

• du Sautoy, Marcus, A Brief History of Mathematics, BBC Radio 4 (2010).

• MathOverﬂow A Q&A site for research-level mathematics

• Math – Khan Academy

• National Museum of Mathematics, located in New York City

Chapter 8

Matrix (mathematics)

For other uses, see Matrix.

“Matrix theory” redirects here. For the physics topic, see Matrix string theory.

In mathematics, a matrix (plural matrices) is a rectangular array[1] —of numbers, symbols, or expressions, arranged

a2,1

a2,2

a2,3

a3,1

a3,2

a3,3

.

.

.

.

.

.

.

.

.

.

.

.

a1,3

.

.

.

a1,2

.

.

.

i

c

h

a

n

g

e

s

a1,1

.

m

rows

j changes

n columns

..

ai,j

Each element of a matrix is often denoted by a variable with two subscripts. For instance, a2,1 represents the element at the second

row and ﬁrst column of a matrix A.

in rows and columns[2][3] —that is treated in certain prescribed ways. One such way is to state the order of the matrix.

For example, the order of the matrix below is a 2x3 matrix because there are two rows and three columns. The

individual items in a matrix are called its elements or entries.[4]

87

88

[

1 9

20 5

CHAPTER 8. MATRIX (MATHEMATICS)

]

−13

.

−6

Provided that they are the same size (have the same number of rows and the same number of columns), two matrices

can be added or subtracted element by element. The rule for matrix multiplication, however, is that two matrices

can be multiplied only when the number of columns in the ﬁrst equals the number of rows in the second. A major

application of matrices is to represent linear transformations, that is, generalizations of linear functions such as f(x) =

4x. For example, the rotation of vectors in three dimensional space is a linear transformation which can be represented

by a rotation matrix R: if v is a column vector (a matrix with only one column) describing the position of a point in

space, the product Rv is a column vector describing the position of that point after a rotation. The product of two

transformation matrices is a matrix that represents the composition of two linear transformations. Another application

of matrices is in the solution of systems of linear equations. If the matrix is square, it is possible to deduce some of its

properties by computing its determinant. For example, a square matrix has an inverse if and only if its determinant

is not zero. Insight into the geometry of a linear transformation is obtainable (along with other information) from the

matrix’s eigenvalues and eigenvectors.

Applications of matrices are found in most scientiﬁc ﬁelds. In every branch of physics, including classical mechanics,

optics, electromagnetism, quantum mechanics, and quantum electrodynamics, they are used to study physical phenomena, such as the motion of rigid bodies. In computer graphics, they are used to project a 3-dimensional image

onto a 2-dimensional screen. In probability theory and statistics, stochastic matrices are used to describe sets of

probabilities; for instance, they are used within the PageRank algorithm that ranks the pages in a Google search.[5]

Matrix calculus generalizes classical analytical notions such as derivatives and exponentials to higher dimensions.

A major branch of numerical analysis is devoted to the development of eﬃcient algorithms for matrix computations,

a subject that is centuries old and is today an expanding area of research. Matrix decomposition methods simplify

computations, both theoretically and practically. Algorithms that are tailored to particular matrix structures, such as

sparse matrices and near-diagonal matrices, expedite computations in ﬁnite element method and other computations.

Inﬁnite matrices occur in planetary theory and in atomic theory. A simple example of an inﬁnite matrix is the matrix

representing the derivative operator, which acts on the Taylor series of a function.

8.1 Deﬁnition

A matrix is a rectangular array of numbers or other mathematical objects, for which operations such as addition and

multiplication are deﬁned.[6] Most commonly, a matrix over a ﬁeld F is a rectangular array of scalars from F.[7][8]

Most of this article focuses on real and complex matrices, i.e., matrices whose elements are real numbers or complex

numbers, respectively. More general types of entries are discussed below. For instance, this is a real matrix:

−1.3 0.6

5.5 .

A = 20.4

9.7 −6.2

The numbers, symbols or expressions in the matrix are called its entries or its elements. The horizontal and vertical

lines of entries in a matrix are called rows and columns, respectively.

8.1.1

Size

The size of a matrix is deﬁned by the number of rows and columns that it contains. A matrix with m rows and n

columns is called an m × n matrix or m-by-n matrix, while m and n are called its dimensions. For example, the matrix

A above is a 3 × 2 matrix.

Matrices which have a single row are called row vectors, and those which have a single column are called column

vectors. A matrix which has the same number of rows and columns is called a square matrix. A matrix with an

inﬁnite number of rows or columns (or both) is called an inﬁnite matrix. In some contexts, such as computer algebra

programs, it is useful to consider a matrix with no rows or no columns, called an empty matrix.

8.2. NOTATION

89

8.2 Notation

Matrices are commonly written in box brackets or an alternative notation uses large parentheses instead of box brackets:

a11

a21

A= .

..

a12

a22

..

.

···

···

..

.

am1

am2

···

a1n

a11 a12

a21 a22

a2n

.. =

..

..

.

.

.

amn

am1 am2

···

···

..

.

a1n

a2n

..

.

···

amn

∈ Rm×n .

The speciﬁcs of symbolic matrix notation varies widely, with some prevailing trends. Matrices are usually symbolized

using upper-case letters (such as A in the examples above), while the corresponding lower-case letters, with two

subscript indices (e.g., a11 , or a₁,₁), represent the entries. In addition to using upper-case letters to symbolize matrices,

many authors use a special typographical style, commonly boldface upright (non-italic), to further distinguish matrices

from other mathematical objects. An alternative notation involves the use of a double-underline with the variable

name, with or without boldface style, (e.g., A ).

The entry in the i-th row and j-th column of a matrix A is sometimes referred to as the i,j, (i,j), or (i,j)th entry of

the matrix, and most commonly denoted as ai,j, or aij. Alternative notations for that entry are A[i,j] or Ai,j. For

example, the (1,3) entry of the following matrix A is 5 (also denoted a13 , a₁,₃, A[1,3] or A1,3):

4 −7 5

0

11 8

A = −2 0

19

1 −3 12

Sometimes, the entries of a matrix can be deﬁned by a formula such as ai,j = f(i, j). For example, each of the entries

of the following matrix A is determined by aij = i − j.

0 −1 −2

A = 1 0 −1

2 1

0

−3

−2

−1

In this case, the matrix itself is sometimes deﬁned by that formula, within square brackets or double parenthesis. For

example, the matrix above is deﬁned as A = [i-j], or A = ((i-j)). If matrix size is m × n, the above-mentioned formula

f(i, j) is valid for any i = 1, ..., m and any j = 1, ..., n. This can be either speciﬁed separately, or using m × n as a

subscript. For instance, the matrix A above is 3 × 4 and can be deﬁned as A = [i − j] (i = 1, 2, 3; j = 1, ..., 4), or A =

[i − j]3×4.

Some programming languages utilize doubly subscripted arrays (or arrays of arrays) to represent an m-×-n matrix.

Some programming languages start the numbering of array indexes at zero, in which case the entries of an m-by-n

matrix are indexed by 0 ≤ i ≤ m − 1 and 0 ≤ j ≤ n − 1.[9] This article follows the more common convention in

mathematical writing where enumeration starts from 1.

The set of all m-by-n matrices is denoted 𝕄(m, n).

8.3 Basic operations

There are a number of basic operations that can be applied to modify matrices, called matrix addition, scalar multiplication, transposition, matrix multiplication, row operations, and submatrix.[11]

8.3.1

Addition, scalar multiplication and transposition

Main articles: Matrix addition, Scalar multiplication and Transpose

90

CHAPTER 8. MATRIX (MATHEMATICS)

Familiar properties of numbers extend to these operations of matrices: for example, addition is commutative, i.e.,

the matrix sum does not depend on the order of the summands: A + B = B + A.[12] The transpose is compatible with

addition and scalar multiplication, as expressed by (cA)T = c(AT ) and (A + B)T = AT + BT . Finally, (AT )T = A.

8.3.2

Matrix multiplication

Main article: Matrix multiplication

Multiplication of two matrices is deﬁned if and only if the number of columns of the left matrix is the same as the

B

b1,1 b1,2 b1,3

b2,1 b2,2 b2,3

a1,1 a1,2

A

a2,1 a2,2

a3,1 a3,2

a4,1 a4,2

Schematic depiction of the matrix product AB of two matrices A and B.

number of rows of the right matrix. If A is an m-by-n matrix and B is an n-by-p matrix, then their matrix product AB

is the m-by-p matrix whose entries are given by dot product of the corresponding row of A and the corresponding

column of B:

[AB]i,j = Ai,1 B1,j + Ai,2 B2,j + · · · + Ai,n Bn,j =

∑n

r=1

Ai,r Br,j ,

where 1 ≤ i ≤ m and 1 ≤ j ≤ p.[13] For example, the underlined entry 2340 in the product is calculated as (2 × 1000)

+ (3 × 100) + (4 × 10) = 2340:

[

] 0 1000

[

2 3 4

3

1 100 =

1 0 0

0

0 10

]

2340

.

1000

8.3. BASIC OPERATIONS

91

Matrix multiplication satisﬁes the rules (AB)C = A(BC) (associativity), and (A+B)C = AC+BC as well as C(A+B)

= CA+CB (left and right distributivity), whenever the size of the matrices is such that the various products are

deﬁned.[14] The product AB may be deﬁned without BA being deﬁned, namely if A and B are m-by-n and n-by-k

matrices, respectively, and m ≠ k. Even if both products are deﬁned, they need not be equal, i.e., generally

AB ≠ BA,

i.e., matrix multiplication is not commutative, in marked contrast to (rational, real, or complex) numbers whose

product is independent of the order of the factors. An example of two matrices not commuting with each other is:

[

1 2

3 4

][

] [

]

0 1

0 1

=

,

0 0

0 3

whereas

[

0 1

0 0

][

] [

]

1 2

3 4

=

.

3 4

0 0

Besides the ordinary matrix multiplication just described, there exist other less frequently used operations on matrices

that can be considered forms of multiplication, such as the Hadamard product and the Kronecker product.[15] They

arise in solving matrix equations such as the Sylvester equation.

8.3.3

Row operations

Main article: Row operations

There are three types of row operations:

1. row addition, that is adding a row to another.

2. row multiplication, that is multiplying all entries of a row by a non-zero constant;

3. row switching, that is interchanging two rows of a matrix;

These operations are used in a number of ways, including solving linear equations and ﬁnding matrix inverses.

8.3.4

Submatrix

A submatrix of a matrix is obtained by deleting any collection of rows and/or columns.[16][17][18] For example, from

the following 3-by-4 matrix, we can construct a 2-by-3 submatrix by removing row 3 and column 2:

1 2

A = 5 6

9 10

[

3 4

1

7 8 →

5

11 12

3

7

]

4

.

8

The minors and cofactors of a matrix are found by computing the determinant of certain submatrices.[18][19]

A principal submatrix is a square submatrix obtained by removing certain rows and columns. The deﬁnition varies

from author to author. According to some authors, a principal submatrix is a submatrix in which the set of row indices

that remain is the same as the set of column indices that remain.[20][21] Other authors deﬁne a principal submatrix to

be one in which the ﬁrst k rows and columns, for some number k, are the ones that remain;[22] this type of submatrix

has also been called a leading principal submatrix.[23]

92

CHAPTER 8. MATRIX (MATHEMATICS)

8.4 Linear equations

Main articles: Linear equation and System of linear equations

Matrices can be used to compactly write and work with multiple linear equations, i.e., systems of linear equations.

For example, if A is an m-by-n matrix, x designates a column vector (i.e., n×1-matrix) of n variables x1 , x2 , ..., xn,

and b is an m×1-column vector, then the matrix equation

Ax = b

is equivalent to the system of linear equations

A₁,₁x1 + A₁,₂x2 + ... + A₁,nxn = b1

...

Am,₁x1 + Am,₂x2 + ... + Am,nxn = bm .[24]

8.5 Linear transformations

Main articles: Linear transformation and Transformation matrix

Matrices and matrix multiplication reveal their essential features when related to linear transformations, also known

as linear maps. A real m-by-n matrix A gives rise to a linear transformation Rn → Rm mapping each vector x in Rn

to the (matrix) product Ax, which is a vector in Rm . Conversely, each linear transformation f: Rn → Rm arises from

a unique m-by-n matrix A: explicitly, the (i, j)-entry of A is the ith coordinate of f(ej), where ej = (0,...,0,1,0,...,0) is

the unit vector with 1 in the j th position and 0 elsewhere. The matrix A is said to represent the linear map f, and A

is called the transformation matrix of f.

For example, the 2×2 matrix

[

]

a c

A=

b d

can be viewed as the transform of the unit square into a parallelogram with vertices at (0, 0), (a, b), (a + c, b + d),

and

pictured at the right is obtained by multiplying A with each of the column vectors

[ ] (c,

[ ]d).[ The

] parallelogram

[ ]

0 1

1

0

,

,

and

in turn. These vectors deﬁne the vertices of the unit square.

0 0

1

1

The following table shows a number of 2-by-2 matrices with the associated linear maps of R2 . The blue original is

mapped to the green grid and shapes. The origin (0,0) is marked with a black point.

Under the 1-to-1 correspondence between matrices and linear maps, matrix multiplication corresponds to composition

of maps:[25] if a k-by-m matrix B represents another linear map g : Rm → Rk , then the composition g ∘ f is represented

by BA since

(g ∘ f)(x) = g(f(x)) = g(Ax) = B(Ax) = (BA)x.

The last equality follows from the above-mentioned associativity of matrix multiplication.

The rank of a matrix A is the maximum number of linearly independent row vectors of the matrix, which is the same

as the maximum number of linearly independent column vectors.[26] Equivalently it is the dimension of the image of

the linear map represented by A.[27] The rank-nullity theorem states that the dimension of the kernel of a matrix plus

the rank equals the number of columns of the matrix.[28]

8.6 Square matrices

Main article: Square matrix

8.6. SQUARE MATRICES

93

(a+c,b+d)

(c,d)

ad−bc

(a,b)

(0,0)

The vectors represented by a 2-by-2 matrix correspond to the sides of a unit square transformed into a parallelogram.

A square matrix is a matrix with the same number of rows and columns. An n-by-n matrix is known as a square

matrix of order n. Any two square matrices of the same order can be added and multiplied. The entries aii form the

main diagonal of a square matrix. They lie on the imaginary line which runs from the top left corner to the bottom

right corner of the matrix.

8.6.1

Main types

94

CHAPTER 8. MATRIX (MATHEMATICS)

Diagonal and triangular matrices

If all entries of A below the main diagonal are zero, A is called an upper triangular matrix. Similarly if all entries of

A above the main diagonal are zero, A is called a lower triangular matrix. If all entries outside the main diagonal are

zero, A is called a diagonal matrix.

Identity matrix

The identity matrix In of size n is the n-by-n matrix in which all the elements on the main diagonal are equal to 1 and

all other elements are equal to 0, e.g.

[

[ ]

1

I1 = 1 , I2 =

0

1 0

]

0 1

0

, · · · , In = . .

1

.. ..

0 0

···

···

..

.

0

0

..

.

···

1

It is a square matrix of order n, and also a special kind of diagonal matrix. It is called an identity matrix because

multiplication with it leaves a matrix unchanged:

AIn = ImA = A for any m-by-n matrix A.

Symmetric or skew-symmetric matrix

A square matrix A that is equal to its transpose, i.e., A = AT , is a symmetric matrix. If instead, A was equal to the

negative of its transpose, i.e., A = −AT , then A is a skew-symmetric matrix. In complex matrices, symmetry is often

replaced by the concept of Hermitian matrices, which satisfy A∗ = A, where the star or asterisk denotes the conjugate

transpose of the matrix, i.e., the transpose of the complex conjugate of A.

By the spectral theorem, real symmetric matrices and complex Hermitian matrices have an eigenbasis; i.e., every

vector is expressible as a linear combination of eigenvectors. In both cases, all eigenvalues are real.[29] This theorem

can be generalized to inﬁnite-dimensional situations related to matrices with inﬁnitely many rows and columns, see

below.

Invertible matrix and its inverse

A square matrix A is called invertible or non-singular if there exists a matrix B such that

AB = BA = In.[30][31]

If B exists, it is unique and is called the inverse matrix of A, denoted A−1 .

Deﬁnite matrix

A symmetric n×n-matrix is called positive-deﬁnite (respectively negative-deﬁnite; indeﬁnite), if for all nonzero vectors

x ∈ Rn the associated quadratic form given by

Q(x) = xT Ax

takes only positive values (respectively only negative values; both some negative and some positive values).[32] If

the quadratic form takes only non-negative (respectively only non-positive) values, the symmetric matrix is called

positive-semideﬁnite (respectively negative-semideﬁnite); hence the matrix is indeﬁnite precisely when it is neither

positive-semideﬁnite nor negative-semideﬁnite.

A symmetric matrix is positive-deﬁnite if and only if all its eigenvalues are positive, i.e., the matrix is positivesemideﬁnite and it is invertible.[33] The table at the right shows two possibilities for 2-by-2 matrices.

Allowing as input two diﬀerent vectors instead yields the bilinear form associated to A:

8.6. SQUARE MATRICES

95

BA (x, y) = xT Ay.[34]

Orthogonal matrix

An orthogonal matrix is a square matrix with real entries whose columns and rows are orthogonal unit vectors (i.e.,

orthonormal vectors). Equivalently, a matrix A is orthogonal if its transpose is equal to its inverse:

AT = A−1 ,

which entails

AT A = AAT = I,

where I is the identity matrix.

An orthogonal matrix A is necessarily invertible (with inverse A−1 = AT ), unitary (A−1 = A*), and normal (A*A =

AA*). The determinant of any orthogonal matrix is either +1 or −1. A special orthogonal matrix is an orthogonal

matrix with determinant +1. As a linear transformation, every orthogonal matrix with determinant +1 is a pure

rotation, while every orthogonal matrix with determinant −1 is either a pure reﬂection, or a composition of reﬂection

and rotation.

The complex analogue of an orthogonal matrix is a unitary matrix.

8.6.2

Main operations

Trace

The trace, tr(A) of a square matrix A is the sum of its diagonal entries. While matrix multiplication is not commutative

as mentioned above, the trace of the product of two matrices is independent of the order of the factors:

tr(AB) = tr(BA).

This is immediate from the deﬁnition of matrix multiplication:

tr(AB)=

∑m ∑n

i=1

j=1

Aij Bji =tr(BA).

Also, the trace of a matrix is equal to that of its transpose, i.e.,

tr(A) = tr(AT ).

Determinant

Main article: Determinant

The determinant det(A) or |A| of a square matrix A is a number encoding certain properties of the matrix. A matrix

is invertible if and only if its determinant is nonzero. Its absolute value equals the area (in R2 ) or volume (in R3 ) of

the image of the unit square (or cube), while its sign corresponds to the orientation of the corresponding linear map:

the determinant is positive if and only if the orientation is preserved.

The determinant of 2-by-2 matrices is given by

[

]

a b

det

= ad − bc.

c d

The determinant of 3-by-3 matrices involves 6 terms (rule of Sarrus). The more lengthy Leibniz formula generalises

these two formulae to all dimensions.[35]

The determinant of a product of square matrices equals the product of their determinants:

96

CHAPTER 8. MATRIX (MATHEMATICS)

( 01 −11)

x2

f(x1 )

x1

f(x2 )

A linear transformation on R2 given by the indicated matrix. The determinant of this matrix is −1, as the area of the green

parallelogram at the right is 1, but the map reverses the orientation, since it turns the counterclockwise orientation of the vectors

to a clockwise one.

det(AB) = det(A) · det(B).[36]

Adding a multiple of any row to another row, or a multiple of any column to another column, does not change

the determinant. Interchanging two rows or two columns aﬀects the determinant by multiplying it by −1.[37] Using

these operations, any matrix can be transformed to a lower (or upper) triangular matrix, and for such matrices the

determinant equals the product of the entries on the main diagonal; this provides a method to calculate the determinant

of any matrix. Finally, the Laplace expansion expresses the determinant in terms of minors, i.e., determinants of

smaller matrices.[38] This expansion can be used for a recursive deﬁnition of determinants (taking as starting case

the determinant of a 1-by-1 matrix, which is its unique entry, or even the determinant of a 0-by-0 matrix, which is

1), that can be seen to be equivalent to the Leibniz formula. Determinants can be used to solve linear systems using

Cramer’s rule, where the division of the determinants of two related square matrices equates to the value of each of

the system’s variables.[39]

Eigenvalues and eigenvectors

Main article: Eigenvalues and eigenvectors

A number λ and a non-zero vector v satisfying

Av = λv

are called an eigenvalue and an eigenvector of A, respectively.[nb 1][40] The number λ is an eigenvalue of an n×n-matrix

A if and only if A−λIn is not invertible, which is equivalent to

det(A − λI) = 0.

[41]

The polynomial pA in an indeterminate X given by evaluation the determinant det(XIn−A) is called the characteristic

polynomial of A. It is a monic polynomial of degree n. Therefore the polynomial equation pA(λ) = 0 has at most

n diﬀerent solutions, i.e., eigenvalues of the matrix.[42] They may be complex even if the entries of A are real.

According to the Cayley–Hamilton theorem, pA(A) = 0, that is, the result of substituting the matrix itself into its own

characteristic polynomial yields the zero matrix.

8.7 Computational aspects

Matrix calculations can be often performed with diﬀerent techniques. Many problems can be solved by both direct

algorithms or iterative approaches. For example, the eigenvectors of a square matrix can be obtained by ﬁnding a

8.8. DECOMPOSITION

97

sequence of vectors xn converging to an eigenvector when n tends to inﬁnity.[43]

To be able to choose the more appropriate algorithm for each speciﬁc problem, it is important to determine both

the eﬀectiveness and precision of all the available algorithms. The domain studying these matters is called numerical

linear algebra.[44] As with other numerical situations, two main aspects are the complexity of algorithms and their

numerical stability.

Determining the complexity of an algorithm means ﬁnding upper bounds or estimates of how many elementary operations such as additions and multiplications of scalars are necessary to perform some algorithm, e.g., multiplication

of matrices. For example, calculating the matrix product of two n-by-n matrix using the deﬁnition given above needs

n3 multiplications, since for any of the n2 entries of the product, n multiplications are necessary. The Strassen algorithm outperforms this “naive” algorithm; it needs only n2.807 multiplications.[45] A reﬁned approach also incorporates

speciﬁc features of the computing devices.

In many practical situations additional information about the matrices involved is known. An important case are

sparse matrices, i.e., matrices most of whose entries are zero. There are speciﬁcally adapted algorithms for, say,

solving linear systems Ax = b for sparse matrices A, such as the conjugate gradient method.[46]

An algorithm is, roughly speaking, numerically stable, if little deviations in the input values do not lead to big deviations in the result. For example, calculating the inverse of a matrix via Laplace’s formula (Adj (A) denotes the

adjugate matrix of A)

A−1 = Adj(A) / det(A)

may lead to signiﬁcant rounding errors if the determinant of the matrix is very small. The norm of a matrix can be

used to capture the conditioning of linear algebraic problems, such as computing a matrix’s inverse.[47]

Although most computer languages are not designed with commands or libraries for matrices, as early as the 1970s,

some engineering desktop computers such as the HP 9830 had ROM cartridges to add BASIC commands for matrices.

Some computer languages such as APL were designed to manipulate matrices, and various mathematical programs

can be used to aid computing with matrices.[48]

8.8 Decomposition

Main articles: Matrix decomposition, Matrix diagonalization, Gaussian elimination and Montante’s method

There are several methods to render matrices into a more easily accessible form. They are generally referred to as

matrix decomposition or matrix factorization techniques. The interest of all these techniques is that they preserve

certain properties of the matrices in question, such as determinant, rank or inverse, so that these quantities can be

calculated after applying the transformation, or that certain matrix operations are algorithmically easier to carry out

for some types of matrices.

The LU decomposition factors matrices as a product of lower (L) and an upper triangular matrices (U).[49] Once

this decomposition is calculated, linear systems can be solved more eﬃciently, by a simple technique called forward

and back substitution. Likewise, inverses of triangular matrices are algorithmically easier to calculate. The Gaussian elimination is a similar algorithm; it transforms any matrix to row echelon form.[50] Both methods proceed by

multiplying the matrix by suitable elementary matrices, which correspond to permuting rows or columns and adding

multiples of one row to another row. Singular value decomposition expresses any matrix A as a product UDV∗ , where

U and V are unitary matrices and D is a diagonal matrix.

The eigendecomposition or diagonalization expresses A as a product VDV−1 , where D is a diagonal matrix and V

is a suitable invertible matrix.[51] If A can be written in this form, it is called diagonalizable. More generally, and

applicable to all matrices, the Jordan decomposition transforms a matrix into Jordan normal form, that is to say

matrices whose only nonzero entries are the eigenvalues λ1 to λ of A, placed on the main diagonal and possibly

entries equal to one directly above the main diagonal, as shown at the right.[52] Given the eigendecomposition, the nth

power of A (i.e., n-fold iterated matrix multiplication) can be calculated via

An = (VDV−1 )n = VDV−1 VDV−1 ...VDV−1 = VDn V−1

and the power of a diagonal matrix can be calculated by taking the corresponding powers of the diagonal entries, which

is much easier than doing the exponentiation for A instead. This can be used to compute the matrix exponential eA , a

98

CHAPTER 8. MATRIX (MATHEMATICS)

An example of a matrix in Jordan normal form. The grey blocks are called Jordan blocks.

need frequently arising in solving linear diﬀerential equations, matrix logarithms and square roots of matrices.[53] To

avoid numerically ill-conditioned situations, further algorithms such as the Schur decomposition can be employed.[54]

8.9 Abstract algebraic aspects and generalizations

Matrices can be generalized in diﬀerent ways. Abstract algebra uses matrices with entries in more general ﬁelds

or even rings, while linear algebra codiﬁes properties of matrices in the notion of linear maps. It is possible to

consider matrices with inﬁnitely many columns and rows. Another extension are tensors, which can be seen as

higher-dimensional arrays of numbers, as opposed to vectors, which can often be realised as sequences of numbers,

while matrices are rectangular or two-dimensional arrays of numbers.[55] Matrices, subject to certain requirements

tend to form groups known as matrix groups.

8.9.1

Matrices with more general entries

This article focuses on matrices whose entries are real or complex numbers. However, matrices can be considered

with much more general types of entries than real or complex numbers. As a ﬁrst step of generalization, any ﬁeld, i.e.,

a set where addition, subtraction, multiplication and division operations are deﬁned and well-behaved, may be used

instead of R or C, for example rational numbers or ﬁnite ﬁelds. For example, coding theory makes use of matrices

over ﬁnite ﬁelds. Wherever eigenvalues are considered, as these are roots of a polynomial they may exist only in a

8.9. ABSTRACT ALGEBRAIC ASPECTS AND GENERALIZATIONS

99

larger ﬁeld than that of the entries of the matrix; for instance they may be complex in case of a matrix with real

entries. The possibility to reinterpret the entries of a matrix as elements of a larger ﬁeld (e.g., to view a real matrix

as a complex matrix whose entries happen to be all real) then allows considering each square matrix to possess a full

set of eigenvalues. Alternatively one can consider only matrices with entries in an algebraically closed ﬁeld, such as

C, from the outset.

More generally, abstract algebra makes great use of matrices with entries in a ring R.[56] Rings are a more general

notion than ﬁelds in that a division operation need not exist. The very same addition and multiplication operations of

matrices extend to this setting, too. The set M(n, R) of all square n-by-n matrices over R is a ring called matrix ring,

isomorphic to the endomorphism ring of the left R-module Rn .[57] If the ring R is commutative, i.e., its multiplication

is commutative, then M(n, R) is a unitary noncommutative (unless n = 1) associative algebra over R. The determinant

of square matrices over a commutative ring R can still be deﬁned using the Leibniz formula; such a matrix is invertible

if and only if its determinant is invertible in R, generalising the situation over a ﬁeld F, where every nonzero element

is invertible.[58] Matrices over superrings are called supermatrices.[59]

Matrices do not always have all their entries in the same ring – or even in any ring at all. One special but common

case is block matrices, which may be considered as matrices whose entries themselves are matrices. The entries need

not be quadratic matrices, and thus need not be members of any ordinary ring; but their sizes must fulﬁl certain

compatibility conditions.

8.9.2

Relationship to linear maps

Linear maps Rn → Rm are equivalent to m-by-n matrices, as described above. More generally, any linear map f: V

→ W between ﬁnite-dimensional vector spaces can be described by a matrix A = (aij), after choosing bases v1 , ...,

vn of V, and w1 , ..., wm of W (so n is the dimension of V and m is the dimension of W), which is such that

f (vj ) =

m

∑

ai,j wi

for j = 1, . . . , n.

i=1

In other words, column j of A expresses the image of vj in terms of the basis vectors wi of W; thus this relation uniquely

determines the entries of the matrix A. Note that the matrix depends on the choice of the bases: diﬀerent choices of

bases give rise to diﬀerent, but equivalent matrices.[60] Many of the above concrete notions can be reinterpreted in

this light, for example, the transpose matrix AT describes the transpose of the linear map given by A, with respect to

the dual bases.[61]

These properties can be restated in a more natural way: the category of all matrices with entries in a ﬁeld k with

multiplication as composition is equivalent to the category of ﬁnite dimensional vector spaces and linear maps over

this ﬁeld.

More generally, the set of m×n matrices can be used to represent the R-linear maps between the free modules Rm

and Rn for an arbitrary ring R with unity. When n = m composition of these maps is possible, and this gives rise to

the matrix ring of n×n matrices representing the endomorphism ring of Rn .

8.9.3

Matrix groups

Main article: Matrix group

A group is a mathematical structure consisting of a set of objects together with a binary operation, i.e., an operation

combining any two objects to a third, subject to certain requirements.[62] A group in which the objects are matrices

and the group operation is matrix multiplication is called a matrix group.[nb 2][63] Since in a group every element has

to be invertible, the most general matrix groups are the groups of all invertible matrices of a given size, called the

general linear groups.

Any property of matrices that is preserved under matrix products and inverses can be used to deﬁne further matrix

groups. For example, matrices with a given size and with a determinant of 1 form a subgroup of (i.e., a smaller

group contained in) their general linear group, called a special linear group.[64] Orthogonal matrices, determined by

the condition

MT M = I,

100

CHAPTER 8. MATRIX (MATHEMATICS)

form the orthogonal group.[65] Every orthogonal matrix has determinant 1 or −1. Orthogonal matrices with determinant 1 form a subgroup called special orthogonal group.

Every ﬁnite group is isomorphic to a matrix group, as one can see by considering the regular representation of the

symmetric group.[66] General groups can be studied using matrix groups, which are comparatively well-understood,

by means of representation theory.[67]

8.9.4

Inﬁnite matrices

It is also possible to consider matrices with inﬁnitely many rows and/or columns[68] even if, being inﬁnite objects, one

cannot write down such matrices explicitly. All that matters is that for every element in the set indexing rows, and

every element in the set indexing columns, there is a well-deﬁned entry (these index sets need not even be subsets of

the natural numbers). The basic operations of addition, subtraction, scalar multiplication and transposition can still

be deﬁned without problem; however matrix multiplication may involve inﬁnite summations to deﬁne the resulting

entries, and these are not deﬁned in general.

⊕

If R is any ring with unity, then the ring of endomorphisms of M = i∈I R as a right R module is isomorphic to

the ring of column ﬁnite matrices CFMI (R) whose entries are indexed by I × I , and whose columns each contain

only ﬁnitely many nonzero entries. The endomorphisms of M considered as a left R module result in an analogous

object, the row ﬁnite matrices RFMI (R) whose rows each only have ﬁnitely many nonzero entries.

If inﬁnite matrices are used to describe linear maps, then only those matrices can be used all of whose columns have

but a ﬁnite number of nonzero entries, for the following reason. For a matrix A to describe a linear map f: V→W,

bases for both spaces must have been chosen; recall that by deﬁnition this means that every vector in the space can be

written uniquely as a (ﬁnite) linear combination of basis vectors, so that written as a (column) vector v of coeﬃcients,

only ﬁnitely many entries vi are nonzero. Now the columns of A describe the images by f of individual basis vectors

of V in the basis of W, which is only meaningful if these columns have only ﬁnitely many nonzero entries. There

is no restriction on the rows of A however: in the product A·v there are only ﬁnitely many nonzero coeﬃcients of

v involved, so every one of its entries, even if it is given as an inﬁnite sum of products, involves only ﬁnitely many

nonzero terms and is therefore well deﬁned. Moreover this amounts to forming a linear combination of the columns

of A that eﬀectively involves only ﬁnitely many of them, whence the result has only ﬁnitely many nonzero entries,

because each of those columns do. One also sees that products of two matrices of the given type is well deﬁned

(provided as usual that the column-index and row-index sets match), is again of the same type, and corresponds to

the composition of linear maps.

If R is a normed ring, then the condition of row or column ﬁniteness can be relaxed. With the norm in place, absolutely

convergent series can be used instead of ﬁnite sums. For example, the matrices whose column sums are absolutely

convergent sequences form a ring. Analogously of course, the matrices whose row sums are absolutely convergent

series also form a ring.

In that vein, inﬁnite matrices can also be used to describe operators on Hilbert spaces, where convergence and

continuity questions arise, which again results in certain constraints that have to be imposed. However, the explicit

point of view of matrices tends to obfuscate the matter,[nb 3] and the abstract and more powerful tools of functional

analysis can be used instead.

8.9.5

Empty matrices

An empty matrix is a matrix in which the number of rows or columns (or both) is zero.[69][70] Empty matrices help

dealing with maps involving the zero vector space. For example, if A is a 3-by-0 matrix and B is a 0-by-3 matrix,

then AB is the 3-by-3 zero matrix corresponding to the null map from a 3-dimensional space V to itself, while BA

is a 0-by-0 matrix. There is no common notation for empty matrices, but most computer algebra systems allow

creating and computing with them. The determinant of the 0-by-0 matrix is 1 as follows from regarding the empty

product occurring in the Leibniz formula for the determinant as 1. This value is also consistent with the fact that the

identity map from any ﬁnite dimensional space to itself has determinant 1, a fact that is often used as a part of the

characterization of determinants.

8.10. APPLICATIONS

101

8.10 Applications

There are numerous applications of matrices, both in mathematics and other sciences. Some of them merely take

advantage of the compact representation of a set of numbers in a matrix. For example, in game theory and economics,

the payoﬀ matrix encodes the payoﬀ for two players, depending on which out of a given (ﬁnite) set of alternatives the

players choose.[71] Text mining and automated thesaurus compilation makes use of document-term matrices such as

tf-idf to track frequencies of certain words in several documents.[72]

Complex numbers can be represented by particular real 2-by-2 matrices via

[

a + ib ↔

]

a −b

,

b a

under which addition and multiplication of complex numbers and matrices correspond to each other. For example,

2-by-2 rotation matrices represent the multiplication with some complex number of absolute value 1, as above. A

similar interpretation is possible for quaternions[73] and Cliﬀord algebras in general.

Early encryption techniques such as the Hill cipher also used matrices. However, due to the linear nature of matrices,

these codes are comparatively easy to break.[74] Computer graphics uses matrices both to represent objects and to

calculate transformations of objects using aﬃne rotation matrices to accomplish tasks such as projecting a threedimensional object onto a two-dimensional screen, corresponding to a theoretical camera observation.[75] Matrices

over a polynomial ring are important in the study of control theory.

Chemistry makes use of matrices in various ways, particularly since the use of quantum theory to discuss molecular

bonding and spectroscopy. Examples are the overlap matrix and the Fock matrix used in solving the Roothaan

equations to obtain the molecular orbitals of the Hartree–Fock method.

8.10.1

Graph theory

The adjacency matrix of a ﬁnite graph is a basic notion of graph theory.[76] It records which vertices of the graph

are connected by an edge. Matrices containing just two diﬀerent values (1 and 0 meaning for example “yes” and

“no”, respectively) are called logical matrices. The distance (or cost) matrix contains information about distances of

the edges.[77] These concepts can be applied to websites connected by hyperlinks or cities connected by roads etc.,

in which case (unless the connection network is extremely dense) the matrices tend to be sparse, i.e., contain few

nonzero entries. Therefore, speciﬁcally tailored matrix algorithms can be used in network theory.

8.10.2

Analysis and geometry

The Hessian matrix of a diﬀerentiable function ƒ: Rn → R consists of the second derivatives of ƒ with respect to the

several coordinate directions, i.e.[78]

[

]

∂2f

H(f ) =

.

∂xi ∂xj

It encodes information about the local growth behaviour of the function: given a critical point x = (x1 , ..., xn), i.e., a

point where the ﬁrst partial derivatives ∂f /∂xi of ƒ vanish, the function has a local minimum if the Hessian matrix is

positive deﬁnite. Quadratic programming can be used to ﬁnd global minima or maxima of quadratic functions closely

related to the ones attached to matrices (see above).[79]

Another matrix frequently used in geometrical situations is the Jacobi matrix of a diﬀerentiable map f: Rn → Rm . If

f 1 , ..., fm denote the components of f, then the Jacobi matrix is deﬁned as [80]

[

∂fi

J(f ) =

∂xj

]

.

1≤i≤m,1≤j≤n

If n > m, and if the rank of the Jacobi matrix attains its maximal value m, f is locally invertible at that point, by the

implicit function theorem.[81]

102

CHAPTER 8. MATRIX (MATHEMATICS)

2

3

1

1

An undirected graph with adjacency matrix 1

0

1

0

1

0

1 .

0

Partial diﬀerential equations can be classiﬁed by considering the matrix of coeﬃcients of the highest-order diﬀerential

operators of the equation. For elliptic partial diﬀerential equations this matrix is positive deﬁnite, which has decisive

inﬂuence on the set of possible solutions of the equation in question.[82]

The ﬁnite element method is an important numerical method to solve partial diﬀerential equations, widely applied in

simulating complex physical systems. It attempts to approximate the solution to some equation by piecewise linear

functions, where the pieces are chosen with respect to a suﬃciently ﬁne grid, which in turn can be recast as a matrix

equation.[83]

8.10.3

Probability theory and statistics

Stochastic matrices are square matrices whose rows are probability vectors, i.e., whose entries are non-negative and

sum up to one. Stochastic matrices are used to deﬁne Markov chains with ﬁnitely many states.[84] A row of the

stochastic matrix gives the probability distribution for the next position of some particle currently in the state that

corresponds to the row. Properties of the Markov chain like absorbing states, i.e., states that any particle attains

eventually, can be read oﬀ the eigenvectors of the transition matrices.[85]

Statistics also makes use of matrices in many diﬀerent forms.[86] Descriptive statistics is concerned with describing

data sets, which can often be represented as data matrices, which may then be subjected to dimensionality reduction

techniques. The covariance matrix encodes the mutual variance of several random variables.[87] Another technique

8.10. APPLICATIONS

At the saddle point (x = 0, y = 0) (red) of the function f(x,−y) = x2 − y2 , the Hessian matrix

103

[

2

0

]

0

is indeﬁnite.

−2

using matrices are linear least squares, a method that approximates a ﬁnite set of pairs (x1 , y1 ), (x2 , y2 ), ..., (xN, yN),

by a linear function

yi ≈ axi + b, i = 1, ..., N

which can be formulated in terms of matrices, related to the singular value decomposition of matrices.[88]

Random matrices are matrices whose entries are random numbers, subject to suitable probability distributions, such

as matrix normal distribution. Beyond probability theory, they are applied in domains ranging from number theory

to physics.[89][90]

8.10.4

Symmetries and transformations in physics

Further information: Symmetry in physics

Linear transformations and the associated symmetries play a key role in modern physics. For example, elementary

particles in quantum ﬁeld theory are classiﬁed as representations of the Lorentz group of special relativity and, more

speciﬁcally, by their behavior under the spin group. Concrete representations involving the Pauli matrices and more

general gamma matrices are an integral part of the physical description of fermions, which behave as spinors.[91] For

the three lightest quarks, there is a group-theoretical representation involving the special unitary group SU(3); for

their calculations, physicists use a convenient matrix representation known as the Gell-Mann matrices, which are also

used for the SU(3) gauge group that forms the basis of the modern description of strong nuclear interactions, quantum

chromodynamics. The Cabibbo–Kobayashi–Maskawa matrix, in turn, expresses the fact that the basic quark states

that are important for weak interactions are not the same as, but linearly related to the basic quark states that deﬁne

particles with speciﬁc and distinct masses.[92]

104

CHAPTER 8. MATRIX (MATHEMATICS)

Two diﬀerent Markov chains. The chart depicts the number of

[ particles

] (of a total

[ of 1000)

] in state “2”. Both limiting values can be

.7 0

.7 .2

determined from the transition matrices, which are given by

(red) and

(black).

.3 1

.3 .8

8.10.5

Linear combinations of quantum states

The ﬁrst model of quantum mechanics (Heisenberg, 1925) represented the theory’s operators by inﬁnite-dimensional

matrices acting on quantum states.[93] This is also referred to as matrix mechanics. One particular example is the

density matrix that characterizes the “mixed” state of a quantum system as a linear combination of elementary, “pure”

eigenstates.[94]

Another matrix serves as a key tool for describing the scattering experiments that form the cornerstone of experimental particle physics: Collision reactions such as occur in particle accelerators, where non-interacting particles head

towards each other and collide in a small interaction zone, with a new set of non-interacting particles as the result,

can be described as the scalar product of outgoing particle states and a linear combination of ingoing particle states.

The linear combination is given by a matrix known as the S-matrix, which encodes all information about the possible

interactions between particles.[95]

8.10.6

Normal modes

A general application of matrices in physics is to the description of linearly coupled harmonic systems. The equations

of motion of such systems can be described in matrix form, with a mass matrix multiplying a generalized velocity

to give the kinetic term, and a force matrix multiplying a displacement vector to characterize the interactions. The

best way to obtain solutions is to determine the system’s eigenvectors, its normal modes, by diagonalizing the matrix

equation. Techniques like this are crucial when it comes to the internal dynamics of molecules: the internal vibrations of systems consisting of mutually bound component atoms.[96] They are also needed for describing mechanical

vibrations, and oscillations in electrical circuits.[97]

8.11. HISTORY

8.10.7

105

Geometrical optics

Geometrical optics provides further matrix applications. In this approximative theory, the wave nature of light is

neglected. The result is a model in which light rays are indeed geometrical rays. If the deﬂection of light rays by

optical elements is small, the action of a lens or reﬂective element on a given light ray can be expressed as multiplication

of a two-component vector with a two-by-two matrix called ray transfer matrix: the vector’s components are the light

ray’s slope and its distance from the optical axis, while the matrix encodes the properties of the optical element.

Actually, there are two kinds of matrices, viz. a refraction matrix describing the refraction at a lens surface, and a

translation matrix, describing the translation of the plane of reference to the next refracting surface, where another

refraction matrix applies. The optical system, consisting of a combination of lenses and/or reﬂective elements, is

simply described by the matrix resulting from the product of the components’ matrices.[98]

8.10.8

Electronics

Traditional mesh analysis in electronics leads to a system of linear equations that can be described with a matrix.

The behaviour of many electronic components can be described using matrices. Let A be a 2-dimensional vector

with the component’s input voltage v1 and input current i1 as its elements, and let B be a 2-dimensional vector

with the component’s output voltage v2 and output current i2 as its elements. Then the behaviour of the electronic

component can be described by B = H · A, where H is a 2 x 2 matrix containing one impedance element (h12 ),

one admittance element (h21 ) and two dimensionless elements (h11 and h22 ). Calculating a circuit now reduces to

multiplying matrices.

8.11 History

Matrices have a long history of application in solving linear equations but they were known as arrays until the 1800s.

The Chinese text The Nine Chapters on the Mathematical Art written in 10th–2nd century BCE is the ﬁrst example

of the use of array methods to solve simultaneous equations,[99] including the concept of determinants. In 1545

Italian mathematician Girolamo Cardano brought the method to Europe when he published Ars Magna.[100] The

Japanese mathematician Seki used the same array methods to solve simultaneous equations in 1683.[101] The Dutch

Mathematician Jan de Witt represented transformations using arrays in his 1659 book Elements of Curves (1659).[102]

Between 1700 and 1710 Gottfried Wilhelm Leibniz publicized the use of arrays for recording information or solutions

and experimented with over 50 diﬀerent systems of arrays.[100] Cramer presented his rule in 1750.

The term “matrix” (Latin for “womb”, derived from mater—mother[103] ) was coined by James Joseph Sylvester in

1850,[104] who understood a matrix as an object giving rise to a number of determinants today called minors, that is

to say, determinants of smaller matrices that derive from the original one by removing columns and rows. In an 1851

paper, Sylvester explains:

I have in previous papers deﬁned a “Matrix” as a rectangular array of terms, out of which diﬀerent

systems of determinants may be engendered as from the womb of a common parent.[105]

Arthur Cayley published a treatise on geometric transformations using matrices that were not rotated versions of the

coeﬃcients being investigated as had previously been done. Instead he deﬁned operations such as addition, subtraction, multiplication, and division as transformations of those matrices and showed the associative and distributive

properties held true. Cayley investigated and demonstrated the non-commutative property of matrix multiplication

as well as the commutative property of matrix addition.[100] Early matrix theory had limited the use of arrays almost

exclusively to determinants and Arthur Cayley’s abstract matrix operations were revolutionary. He was instrumental

in proposing a matrix concept independent of equation systems. In 1858 Cayley published his Memoir on the theory

of matrices[106][107] in which he proposed and demonstrated the Cayley-Hamilton theorem.[100]

An English mathematician named Cullis was the ﬁrst to use modern bracket notation for matrices in 1913 and he

simultaneously demonstrated the ﬁrst signiﬁcant use the notation A = [ai,j] to represent a matrix where ai,j refers to

the ith row and the jth column.[100]

The study of determinants sprang from several sources.[108] Number-theoretical problems led Gauss to relate coefﬁcients of quadratic forms, i.e., expressions such as x2 + xy − 2y2 , and linear maps in three dimensions to matrices. Eisenstein further developed these notions, including the remark that, in modern parlance, matrix products are

106

CHAPTER 8. MATRIX (MATHEMATICS)

non-commutative. Cauchy was the ﬁrst to prove general statements about determinants, using as deﬁnition of the

determinant of a matrix A = [ai,j] the following: replace the powers aj k by ajk in the polynomial

a1 a2 · · · an

∏

(aj − ai )

i<j

where Π denotes the product of the indicated terms. He also showed, in 1829, that the eigenvalues of symmetric matrices are real.[109] Jacobi studied “functional determinants”—later called Jacobi determinants by Sylvester—which

can be used to describe geometric transformations at a local (or inﬁnitesimal) level, see above; Kronecker’s Vorlesungen über die Theorie der Determinanten[110] and Weierstrass’ Zur Determinantentheorie,[111] both published in 1903,

ﬁrst treated determinants axiomatically, as opposed to previous more concrete approaches such as the mentioned

formula of Cauchy. At that point, determinants were ﬁrmly established.

Many theorems were ﬁrst established for small matrices only, for example the Cayley–Hamilton theorem was proved

for 2×2 matrices by Cayley in the aforementioned memoir, and by Hamilton for 4×4 matrices. Frobenius, working

on bilinear forms, generalized the theorem to all dimensions (1898). Also at the end of the 19th century the Gauss–

Jordan elimination (generalizing a special case now known as Gauss elimination) was established by Jordan. In the

early 20th century, matrices attained a central role in linear algebra.[112] partially due to their use in classiﬁcation of

the hypercomplex number systems of the previous century.

The inception of matrix mechanics by Heisenberg, Born and Jordan led to studying matrices with inﬁnitely many

rows and columns.[113] Later, von Neumann carried out the mathematical formulation of quantum mechanics, by

further developing functional analytic notions such as linear operators on Hilbert spaces, which, very roughly speaking,

correspond to Euclidean space, but with an inﬁnity of independent directions.

8.11.1

Other historical usages of the word “matrix” in mathematics

The word has been used in unusual ways by at least two authors of historical importance.

Bertrand Russell and Alfred North Whitehead in their Principia Mathematica (1910–1913) use the word “matrix” in

the context of their Axiom of reducibility. They proposed this axiom as a means to reduce any function to one of

lower type, successively, so that at the “bottom” (0 order) the function is identical to its extension:

“Let us give the name of matrix to any function, of however many variables, which does not involve any

apparent variables. Then any possible function other than a matrix is derived from a matrix by means

of generalization, i.e., by considering the proposition which asserts that the function in question is true

with all possible values or with some value of one of the arguments, the other argument or arguments

remaining undetermined”.[114]

For example a function Φ(x, y) of two variables x and y can be reduced to a collection of functions of a single variable,

e.g., y, by “considering” the function for all possible values of “individuals” ai substituted in place of variable x. And

then the resulting collection of functions of the single variable y, i.e., ∀aᵢ: Φ(ai, y), can be reduced to a “matrix” of

values by “considering” the function for all possible values of “individuals” bi substituted in place of variable y:

∀b ∀aᵢ: Φ(ai, b ).

Alfred Tarski in his 1946 Introduction to Logic used the word “matrix” synonymously with the notion of truth table

as used in mathematical logic.[115]

8.12 See also

• Algebraic multiplicity

• Geometric multiplicity

• Gram-Schmidt process

• List of matrices

8.13. NOTES

107

• Matrix calculus

• Periodic matrix set

• Tensor

8.13 Notes

[1] equivalently, table

[2] Anton (1987, p. 23)

[3] Beauregard & Fraleigh (1973, p. 56)

[4] Young, Cynthia. Precalculus. Laurie Rosatone. p. 727. Check date values in: |accessdate= (help);

[5] K. Bryan and T. Leise. The $25,000,000,000 eigenvector: The linear algebra behind Google. SIAM Review, 48(3):569–

581, 2006.

[6] Lang 2002

[7] Fraleigh (1976, p. 209)

[8] Nering (1970, p. 37)

[9] Oualline 2003, Ch. 5

[10] “How to organize, add and multiply matrices - Bill Shillito”. TED ED. Retrieved April 6, 2013.

[11] Brown 1991, Deﬁnition I.2.1 (addition), Deﬁnition I.2.4 (scalar multiplication), and Deﬁnition I.2.33 (transpose)

[12] Brown 1991, Theorem I.2.6

[13] Brown 1991, Deﬁnition I.2.20

[14] Brown 1991, Theorem I.2.24

[15] Horn & Johnson 1985, Ch. 4 and 5

[16] Bronson (1970, p. 16)

[17] Kreyszig (1972, p. 220)

[18] Protter & Morrey (1970, p. 869)

[19] Kreyszig (1972, pp. 241,244)

[20] Schneider, Hans; Barker, George Phillip (2012), Matrices and Linear Algebra, Dover Books on Mathematics, Courier

Dover Corporation, p. 251, ISBN 9780486139302.

[21] Perlis, Sam (1991), Theory of Matrices, Dover books on advanced mathematics, Courier Dover Corporation, p. 103, ISBN

9780486668109.

[22] Anton, Howard (414), Elementary Linear Algebra (10th ed.), John Wiley & Sons, ISBN 9780470458211 .

[23] Horn, Roger A.; Johnson, Charles R. (2012), Matrix Analysis (2nd ed.), Cambridge University Press, p. 17, ISBN

9780521839402.

[24] Brown 1991, I.2.21 and 22

[25] Greub 1975, Section III.2

[26] Brown 1991, Deﬁnition II.3.3

[27] Greub 1975, Section III.1

[28] Brown 1991, Theorem II.3.22

[29] Horn & Johnson 1985, Theorem 2.5.6

[30] Brown 1991, Deﬁnition I.2.28

108

[31] Brown 1991, Deﬁnition I.5.13

[32] Horn & Johnson 1985, Chapter 7

[33] Horn & Johnson 1985, Theorem 7.2.1

[34] Horn & Johnson 1985, Example 4.0.6, p. 169

[35] Brown 1991, Deﬁnition III.2.1

[36] Brown 1991, Theorem III.2.12

[37] Brown 1991, Corollary III.2.16

[38] Mirsky 1990, Theorem 1.4.1

[39] Brown 1991, Theorem III.3.18

[40] Brown 1991, Deﬁnition III.4.1

[41] Brown 1991, Deﬁnition III.4.9

[42] Brown 1991, Corollary III.4.10

[43] Householder 1975, Ch. 7

[44] Bau III & Trefethen 1997

[45] Golub & Van Loan 1996, Algorithm 1.3.1

[46] Golub & Van Loan 1996, Chapters 9 and 10, esp. section 10.2

[47] Golub & Van Loan 1996, Chapter 2.3

[48] For example, Mathematica, see Wolfram 2003, Ch. 3.7

[49] Press, Flannery & Teukolsky 1992

[50] Stoer & Bulirsch 2002, Section 4.1

[51] Horn & Johnson 1985, Theorem 2.5.4

[52] Horn & Johnson 1985, Ch. 3.1, 3.2

[53] Arnold & Cooke 1992, Sections 14.5, 7, 8

[54] Bronson 1989, Ch. 15

[55] Coburn 1955, Ch. V

[56] Lang 2002, Chapter XIII

[57] Lang 2002, XVII.1, p. 643

[58] Lang 2002, Proposition XIII.4.16

[59] Reichl 2004, Section L.2

[60] Greub 1975, Section III.3

[61] Greub 1975, Section III.3.13

[62] See any standard reference in group.

[63] Baker 2003, Def. 1.30

[64] Baker 2003, Theorem 1.2

[65] Artin 1991, Chapter 4.5

[66] Rowen 2008, Example 19.2, p. 198

[67] See any reference in representation theory or group representation.

[68] See the item “Matrix” in Itõ, ed. 1987

CHAPTER 8. MATRIX (MATHEMATICS)

8.13. NOTES

109

[69] “Empty Matrix: A matrix is empty if either its row or column dimension is zero”, Glossary, O-Matrix v6 User Guide

[70] “A matrix having at least one dimension equal to zero is called an empty matrix”, MATLAB Data Structures

[71] Fudenberg & Tirole 1983, Section 1.1.1

[72] Manning 1999, Section 15.3.4

[73] Ward 1997, Ch. 2.8

[74] Stinson 2005, Ch. 1.1.5 and 1.2.4

[75] Association for Computing Machinery 1979, Ch. 7

[76] Godsil & Royle 2004, Ch. 8.1

[77] Punnen 2002

[78] Lang 1987a, Ch. XVI.6

[79] Nocedal 2006, Ch. 16

[80] Lang 1987a, Ch. XVI.1

[81] Lang 1987a, Ch. XVI.5. For a more advanced, and more general statement see Lang 1969, Ch. VI.2

[82] Gilbarg & Trudinger 2001

[83] Šolin 2005, Ch. 2.5. See also stiﬀness method.

[84] Latouche & Ramaswami 1999

[85] Mehata & Srinivasan 1978, Ch. 2.8

[86] Healy, Michael (1986), Matrices for Statistics, Oxford University Press, ISBN 978-0-19-850702-4

[87] Krzanowski 1988, Ch. 2.2., p. 60

[88] Krzanowski 1988, Ch. 4.1

[89] Conrey 2007

[90] Zabrodin, Brezin & Kazakov et al. 2006

[91] Itzykson & Zuber 1980, Ch. 2

[92] see Burgess & Moore 2007, section 1.6.3. (SU(3)), section 2.4.3.2. (Kobayashi–Maskawa matrix)

[93] Schiﬀ 1968, Ch. 6

[94] Bohm 2001, sections II.4 and II.8

[95] Weinberg 1995, Ch. 3

[96] Wherrett 1987, part II

[97] Riley, Hobson & Bence 1997, 7.17

[98] Guenther 1990, Ch. 5

[99] Shen, Crossley & Lun 1999 cited by Bretscher 2005, p. 1

[100] Discrete Mathematics 4th Ed. Dossey, Otto, Spense, Vanden Eynden, Published by Addison Wesley, October 10, 2001

ISBN 978-0321079121 | p.564-565

[101] Needham, Joseph; Wang Ling (1959). Science and Civilisation in China III. Cambridge: Cambridge University Press. p.

117. ISBN 9780521058018.

[102] Discrete Mathematics 4th Ed. Dossey, Otto, Spense, Vanden Eynden, Published by Addison Wesley, October 10, 2001

ISBN 978-0321079121 | p.564

[103] Merriam–Webster dictionary, Merriam–Webster, retrieved April 20, 2009

110

CHAPTER 8. MATRIX (MATHEMATICS)

[104] Although many sources state that J. J. Sylvester coined the mathematical term “matrix” in 1848, Sylvester published nothing

in 1848. (For proof that Sylvester published nothing in 1848, see: J. J. Sylvester with H. F. Baker, ed., The Collected

Mathematical Papers of James Joseph Sylvester (Cambridge, England: Cambridge University Press, 1904), vol. 1.) His

earliest use of the term “matrix” occurs in 1850 in: J. J. Sylvester (1850) “Additions to the articles in the September

number of this journal, “On a new class of theorems,” and on Pascal’s theorem,” The London, Edinburgh and Dublin

Philosophical Magazine and Journal of Science, 37 : 363-370. From page 369: “For this purpose we must commence, not

with a square, but with an oblong arrangement of terms consisting, suppose, of m lines and n columns. This will not in

itself represent a determinant, but is, as it were, a Matrix out of which we may form various systems of determinants … "

[105] The Collected Mathematical Papers of James Joseph Sylvester: 1837–1853, Paper 37, p. 247

[106] Phil.Trans. 1858, vol.148, pp.17-37 Math. Papers II 475-496

[107] Dieudonné, ed. 1978, Vol. 1, Ch. III, p. 96

[108] Knobloch 1994

[109] Hawkins 1975

[110] Kronecker 1897

[111] Weierstrass 1915, pp. 271–286

[112] Bôcher 2004

[113] Mehra & Rechenberg 1987

[114] Whitehead, Alfred North; and Russell, Bertrand (1913) Principia Mathematica to *56, Cambridge at the University Press,

Cambridge UK (republished 1962) cf page 162ﬀ.

[115] Tarski, Alfred; (1946) Introduction to Logic and the Methodology of Deductive Sciences, Dover Publications, Inc, New York

NY, ISBN 0-486-28462-X.

[1] Eigen means “own” in German and in Dutch.

[2] Additionally, the group is required to be closed in the general linear group.

[3] “Not much of matrix theory carries over to inﬁnite-dimensional spaces, and what does is not so useful, but it sometimes

helps.” Halmos 1982, p. 23, Chapter 5

8.14 References

• Anton, Howard (1987), Elementary Linear Algebra (5th ed.), New York: Wiley, ISBN 0-471-84819-0

• Arnold, Vladimir I.; Cooke, Roger (1992), Ordinary diﬀerential equations, Berlin, DE; New York, NY:

Springer-Verlag, ISBN 978-3-540-54813-3

• Artin, Michael (1991), Algebra, Prentice Hall, ISBN 978-0-89871-510-1

• Association for Computing Machinery (1979), Computer Graphics, Tata McGraw–Hill, ISBN 978-0-07-0593763

• Baker, Andrew J. (2003), Matrix Groups: An Introduction to Lie Group Theory, Berlin, DE; New York, NY:

Springer-Verlag, ISBN 978-1-85233-470-3

• Bau III, David; Trefethen, Lloyd N. (1997), Numerical linear algebra, Philadelphia, PA: Society for Industrial

and Applied Mathematics, ISBN 978-0-89871-361-9

• Beauregard, Raymond A.; Fraleigh, John B. (1973), A First Course In Linear Algebra: with Optional Introduction to Groups, Rings, and Fields, Boston: Houghton Miﬄin Co., ISBN 0-395-14017-X

• Bretscher, Otto (2005), Linear Algebra with Applications (3rd ed.), Prentice Hall

• Bronson, Richard (1970), Matrix Methods: An Introduction, New York: Academic Press, LCCN 70097490

• Bronson, Richard (1989), Schaum’s outline of theory and problems of matrix operations, New York: McGraw–

Hill, ISBN 978-0-07-007978-6

8.14. REFERENCES

111

• Brown, William C. (1991), Matrices and vector spaces, New York, NY: Marcel Dekker, ISBN 978-0-82478419-5

• Coburn, Nathaniel (1955), Vector and tensor analysis, New York, NY: Macmillan, OCLC 1029828

• Conrey, J. Brian (2007), Ranks of elliptic curves and random matrix theory, Cambridge University Press, ISBN

978-0-521-69964-8

• Fraleigh, John B. (1976), A First Course In Abstract Algebra (2nd ed.), Reading: Addison-Wesley, ISBN 0201-01984-1

• Fudenberg, Drew; Tirole, Jean (1983), Game Theory, MIT Press

• Gilbarg, David; Trudinger, Neil S. (2001), Elliptic partial diﬀerential equations of second order (2nd ed.),

Berlin, DE; New York, NY: Springer-Verlag, ISBN 978-3-540-41160-4

• Godsil, Chris; Royle, Gordon (2004), Algebraic Graph Theory, Graduate Texts in Mathematics 207, Berlin,

DE; New York, NY: Springer-Verlag, ISBN 978-0-387-95220-8

• Golub, Gene H.; Van Loan, Charles F. (1996), Matrix Computations (3rd ed.), Johns Hopkins, ISBN 978-08018-5414-9

• Greub, Werner Hildbert (1975), Linear algebra, Graduate Texts in Mathematics, Berlin, DE; New York, NY:

Springer-Verlag, ISBN 978-0-387-90110-7

• Halmos, Paul Richard (1982), A Hilbert space problem book, Graduate Texts in Mathematics 19 (2nd ed.),

Berlin, DE; New York, NY: Springer-Verlag, ISBN 978-0-387-90685-0, MR 675952

• Horn, Roger A.; Johnson, Charles R. (1985), Matrix Analysis, Cambridge University Press, ISBN 978-0-52138632-6

• Householder, Alston S. (1975), The theory of matrices in numerical analysis, New York, NY: Dover Publications, MR 0378371

• Kreyszig, Erwin (1972), Advanced Engineering Mathematics (3rd ed.), New York: Wiley, ISBN 0-471-507288.

• Krzanowski, Wojtek J. (1988), Principles of multivariate analysis, Oxford Statistical Science Series 3, The

Clarendon Press Oxford University Press, ISBN 978-0-19-852211-9, MR 969370

• Itõ, Kiyosi, ed. (1987), Encyclopedic dictionary of mathematics. Vol. I-IV (2nd ed.), MIT Press, ISBN 978-0262-09026-1, MR 901762

• Lang, Serge (1969), Analysis II, Addison-Wesley

• Lang, Serge (1987a), Calculus of several variables (3rd ed.), Berlin, DE; New York, NY: Springer-Verlag,

ISBN 978-0-387-96405-8

• Lang, Serge (1987b), Linear algebra, Berlin, DE; New York, NY: Springer-Verlag, ISBN 978-0-387-96412-6

• Lang, Serge (2002), Algebra, Graduate Texts in Mathematics 211 (Revised third ed.), New York: SpringerVerlag, ISBN 978-0-387-95385-4, MR 1878556

• Latouche, Guy; Ramaswami, Vaidyanathan (1999), Introduction to matrix analytic methods in stochastic modeling (1st ed.), Philadelphia, PA: Society for Industrial and Applied Mathematics, ISBN 978-0-89871-425-8

• Manning, Christopher D.; Schütze, Hinrich (1999), Foundations of statistical natural language processing, MIT

Press, ISBN 978-0-262-13360-9

• Mehata, K. M.; Srinivasan, S. K. (1978), Stochastic processes, New York, NY: McGraw–Hill, ISBN 978-0-07096612-3

• Mirsky, Leonid (1990), An Introduction to Linear Algebra, Courier Dover Publications, ISBN 978-0-48666434-7

• Nering, Evar D. (1970), Linear Algebra and Matrix Theory (2nd ed.), New York: Wiley, LCCN 76-91646

112

CHAPTER 8. MATRIX (MATHEMATICS)

• Nocedal, Jorge; Wright, Stephen J. (2006), Numerical Optimization (2nd ed.), Berlin, DE; New York, NY:

Springer-Verlag, p. 449, ISBN 978-0-387-30303-1

• Oualline, Steve (2003), Practical C++ programming, O'Reilly, ISBN 978-0-596-00419-4

• Press, William H.; Flannery, Brian P.; Teukolsky, Saul A.; Vetterling, William T. (1992), “LU Decomposition and Its Applications”, Numerical Recipes in FORTRAN: The Art of Scientiﬁc Computing (PDF) (2nd ed.),

Cambridge University Press, pp. 34–42

• Protter, Murray H.; Morrey, Jr., Charles B. (1970), College Calculus with Analytic Geometry (2nd ed.), Reading:

Addison-Wesley, LCCN 76087042

• Punnen, Abraham P.; Gutin, Gregory (2002), The traveling salesman problem and its variations, Boston, MA:

Kluwer Academic Publishers, ISBN 978-1-4020-0664-7

• Reichl, Linda E. (2004), The transition to chaos: conservative classical systems and quantum manifestations,

Berlin, DE; New York, NY: Springer-Verlag, ISBN 978-0-387-98788-0

• Rowen, Louis Halle (2008), Graduate Algebra: noncommutative view, Providence, RI: American Mathematical

Society, ISBN 978-0-8218-4153-2

• Šolin, Pavel (2005), Partial Diﬀerential Equations and the Finite Element Method, Wiley-Interscience, ISBN

978-0-471-76409-0

• Stinson, Douglas R. (2005), Cryptography, Discrete Mathematics and its Applications, Chapman & Hall/CRC,

ISBN 978-1-58488-508-5

• Stoer, Josef; Bulirsch, Roland (2002), Introduction to Numerical Analysis (3rd ed.), Berlin, DE; New York,

NY: Springer-Verlag, ISBN 978-0-387-95452-3

• Ward, J. P. (1997), Quaternions and Cayley numbers, Mathematics and its Applications 403, Dordrecht, NL:

Kluwer Academic Publishers Group, ISBN 978-0-7923-4513-8, MR 1458894

• Wolfram, Stephen (2003), The Mathematica Book (5th ed.), Champaign, IL: Wolfram Media, ISBN 978-157955-022-6

8.14.1

Physics references

• Bohm, Arno (2001), Quantum Mechanics: Foundations and Applications, Springer, ISBN 0-387-95330-2

• Burgess, Cliﬀ; Moore, Guy (2007), The Standard Model. A Primer, Cambridge University Press, ISBN 0-52186036-9

• Guenther, Robert D. (1990), Modern Optics, John Wiley, ISBN 0-471-60538-7

• Itzykson, Claude; Zuber, Jean-Bernard (1980), Quantum Field Theory, McGraw–Hill, ISBN 0-07-032071-3

• Riley, Kenneth F.; Hobson, Michael P.; Bence, Stephen J. (1997), Mathematical methods for physics and

engineering, Cambridge University Press, ISBN 0-521-55506-X

• Schiﬀ, Leonard I. (1968), Quantum Mechanics (3rd ed.), McGraw–Hill

• Weinberg, Steven (1995), The Quantum Theory of Fields. Volume I: Foundations, Cambridge University Press,

ISBN 0-521-55001-7

• Wherrett, Brian S. (1987), Group Theory for Atoms, Molecules and Solids, Prentice–Hall International, ISBN

0-13-365461-3

• Zabrodin, Anton; Brezin, Édouard; Kazakov, Vladimir; Serban, Didina; Wiegmann, Paul (2006), Applications

of Random Matrices in Physics (NATO Science Series II: Mathematics, Physics and Chemistry), Berlin, DE; New

York, NY: Springer-Verlag, ISBN 978-1-4020-4530-1

8.15. EXTERNAL LINKS

8.14.2

113

Historical references

• A. Cayley A memoir on the theory of matrices. Phil. Trans. 148 1858 17-37; Math. Papers II 475-496

• Bôcher, Maxime (2004), Introduction to higher algebra, New York, NY: Dover Publications, ISBN 978-0-48649570-5, reprint of the 1907 original edition

• Cayley, Arthur (1889), The collected mathematical papers of Arthur Cayley, I (1841–1853), Cambridge University Press, pp. 123–126

• Dieudonné, Jean, ed. (1978), Abrégé d'histoire des mathématiques 1700-1900, Paris, FR: Hermann

• Hawkins, Thomas (1975), “Cauchy and the spectral theory of matrices”, Historia Mathematica 2: 1–29,

doi:10.1016/0315-0860(75)90032-4, ISSN 0315-0860, MR 0469635

• Knobloch, Eberhard (1994), “From Gauss to Weierstrass: determinant theory and its historical evaluations”,

The intersection of history and mathematics, Science Networks Historical Studies 15, Basel, Boston, Berlin:

Birkhäuser, pp. 51–66, MR 1308079

• Kronecker, Leopold (1897), Hensel, Kurt, ed., Leopold Kronecker’s Werke, Teubner

• Mehra, Jagdish; Rechenberg, Helmut (1987), The Historical Development of Quantum Theory (1st ed.), Berlin,

DE; New York, NY: Springer-Verlag, ISBN 978-0-387-96284-9

• Shen, Kangshen; Crossley, John N.; Lun, Anthony Wah-Cheung (1999), Nine Chapters of the Mathematical

Art, Companion and Commentary (2nd ed.), Oxford University Press, ISBN 978-0-19-853936-0

• Weierstrass, Karl (1915), Collected works 3

8.15 External links

Encyclopedic articles

• Hazewinkel, Michiel, ed. (2001), “Matrix”, Encyclopedia of Mathematics, Springer, ISBN 978-1-55608-010-4

History

• MacTutor: Matrices and determinants

• Matrices and Linear Algebra on the Earliest Uses Pages

• Earliest Uses of Symbols for Matrices and Vectors

Online books

• Kaw, Autar K., Introduction to Matrix Algebra, ISBN 978-0-615-25126-4

• The Matrix Cookbook (PDF), retrieved 24 March 2014

• Brookes, Mike (2005), The Matrix Reference Manual, London: Imperial College, retrieved 10 Dec 2008

Online matrix calculators

• SimplyMath (Matrix Calculator)

• Matrix Calculator (DotNumerics)

• Xiao, Gang, Matrix calculator, retrieved 10 Dec 2008

• Online matrix calculator, retrieved 10 Dec 2008

• Online matrix calculator (ZK framework), retrieved 26 Nov 2009

114

CHAPTER 8. MATRIX (MATHEMATICS)

• Oehlert, Gary W.; Bingham, Christopher, MacAnova, University of Minnesota, School of Statistics, retrieved

10 Dec 2008, a freeware package for matrix algebra and statistics

• Online matrix calculator, retrieved 14 Dec 2009

• Operation with matrices in R (determinant, track, inverse, adjoint, transpose)

Chapter 9

Vertex (graph theory)

For other uses, see Vertex (disambiguation).

In mathematics, and more speciﬁcally in graph theory, a vertex (plural vertices) or node is the fundamental unit

6

5

4

1

2

3

A graph with 6 vertices and 7 edges where the vertex number 6 on the far-left is a leaf vertex or a pendant vertex

of which graphs are formed: an undirected graph consists of a set of vertices and a set of edges (unordered pairs

of vertices), while a directed graph consists of a set of vertices and a set of arcs (ordered pairs of vertices). In a

diagram of a graph, a vertex is usually represented by a circle with a label, and an edge is represented by a line or

arrow extending from one vertex to another.

From the point of view of graph theory, vertices are treated as featureless and indivisible objects, although they may

have additional structure depending on the application from which the graph arises; for instance, a semantic network

is a graph in which the vertices represent concepts or classes of objects.

The two vertices forming an edge are said to be the endpoints of this edge, and the edge is said to be incident to the

vertices. A vertex w is said to be adjacent to another vertex v if the graph contains an edge (v,w). The neighborhood

of a vertex v is an induced subgraph of the graph, formed by all vertices adjacent to v.

115

116

CHAPTER 9. VERTEX (GRAPH THEORY)

9.1 Types of vertices

The degree of a vertex in a graph is the number of edges incident to it. An isolated vertex is a vertex with degree

zero; that is, a vertex that is not an endpoint of any edge. A leaf vertex (also pendant vertex) is a vertex with degree

one. In a directed graph, one can distinguish the outdegree (number of outgoing edges) from the indegree (number

of incoming edges); a source vertex is a vertex with indegree zero, while a sink vertex is a vertex with outdegree

zero.

A cut vertex is a vertex the removal of which would disconnect the remaining graph; a vertex separator is a collection

of vertices the removal of which would disconnect the remaining graph into small pieces. A k-vertex-connected graph

is a graph in which removing fewer than k vertices always leaves the remaining graph connected. An independent

set is a set of vertices no two of which are adjacent, and a vertex cover is a set of vertices that includes at least

one endpoint of each edge in the graph. The vertex space of a graph is a vector space having a set of basis vectors

corresponding with the graph’s vertices.

A graph is vertex-transitive if it has symmetries that map any vertex to any other vertex. In the context of graph

enumeration and graph isomorphism it is important to distinguish between labeled vertices and unlabeled vertices.

A labeled vertex is a vertex that is associated with extra information that enables it to be distinguished from other

labeled vertices; two graphs can be considered isomorphic only if the correspondence between their vertices pairs up

vertices with equal labels. An unlabeled vertex is one that can be substituted for any other vertex based only on its

adjacencies in the graph and not based on any additional information.

Vertices in graphs are analogous to, but not the same as, vertices of polyhedra: the skeleton of a polyhedron forms a

graph, the vertices of which are the vertices of the polyhedron, but polyhedron vertices have additional structure (their

geometric location) that is not assumed to be present in graph theory. The vertex ﬁgure of a vertex in a polyhedron

is analogous to the neighborhood of a vertex in a graph.

9.2 See also

• Node (computer science)

• Graph theory

• Glossary of graph theory

9.3 References

• Gallo, Giorgio; Pallotino, Stefano (1988). “Shortest path algorithms”. Annals of Operations Research 13 (1):

1–79. doi:10.1007/BF02288320.

• Berge, Claude, Théorie des graphes et ses applications. Collection Universitaire de Mathématiques, II Dunod,

Paris 1958, viii+277 pp. (English edition, Wiley 1961; Methuen & Co, New York 1962; Russian, Moscow

1961; Spanish, Mexico 1962; Roumanian, Bucharest 1969; Chinese, Shanghai 1963; Second printing of the

1962 ﬁrst English edition. Dover, New York 2001)

• Chartrand, Gary (1985). Introductory graph theory. New York: Dover. ISBN 0-486-24775-9.

• Biggs, Norman; Lloyd, E. H.; Wilson, Robin J. (1986). Graph theory, 1736-1936. Oxford [Oxfordshire]:

Clarendon Press. ISBN 0-19-853916-9.

• Harary, Frank (1969). Graph theory. Reading, Mass.: Addison-Wesley Publishing. ISBN 0-201-41033-8.

• Harary, Frank; Palmer, Edgar M. (1973). Graphical enumeration. New York, Academic Press. ISBN 0-12324245-2.

9.4 External links

• Weisstein, Eric W., “Graph Vertex”, MathWorld.

9.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES

117

9.5 Text and image sources, contributors, and licenses

9.5.1

Text

• Computer science Source: https://en.wikipedia.org/wiki/Computer_science?oldid=668898640 Contributors: AxelBoldt, Derek Ross,

LC~enwiki, Lee Daniel Crocker, Tuxisuau, Brion VIBBER, Mav, Robert Merkel, Espen, The Anome, Tarquin, Taw, Jzcool, DanKeshet,

Andre Engels, Khendon, LA2, Jkominek, Aldie, Fubar Obfusco, SolKarma, SimonP, Peterlin~enwiki, Hannes Hirzel, Ole Aamot,

Camembert, B4hand, Hephaestos, Olivier, Stevertigo, Ghyll~enwiki, DrewT2, JohnOwens, Ted~enwiki, Michael Hardy, Erik Zachte,

Gretyl, Kwertii, JakeVortex, Dante Alighieri, Fuzzie, Rp, Bensmith, Mic, Ixfd64, Phoe6, Sannse, TakuyaMurata, Delirium, Loisel,

7265, Minesweeper, Pcb21, Kvikeg, MartinSpamer, Ahoerstemeier, Haakon, Stan Shebs, Docu, J-Wiki, Kazuo Moriwaka, Angela,

Jdforrester, Salsa Shark, Glenn, Cyan, LouI, Poor Yorick, Nikai, Azazello, Kwekubo, Jiang, Cryoboy, Rob Hooft, Jonik, Mxn, BRG,

Denny, Dgreen34, Schneelocke, Nikola Smolenski, Revolver, Popsracer, Charles Matthews, Guaka, Timwi, Dcoetzee, Sbwoodside, Dysprosia, Jitse Niesen, Jay, Daniel Quinlan, Michaeln, Greenrd, Quux, HappyDog, Tpbradbury, Maximus Rex, Cleduc, Morwen, Buridan, Ed g2s, Persoid, Mikez80, Wakka, Wernher, Bevo, Spikey, Traroth, Shizhao, Farmerchris, Dbabbitt, Raul654, Jim Mahoney,

Marc Girod~enwiki, Guppy, Carbuncle, ThereIsNoSteve, RadicalBender, Robbot, Sdedeo, Fredrik, Hobbes~enwiki, Soilguy2, R3m0t,

RedWolf, Troworld, Altenmann, Naddy, Lowellian, Chris Roy, Mirv, MathMartin, Merovingian, Hellotoast, Rfc1394, Academic Challenger, Texture, Bethenco, Diderot, Hadal, Nerval, Borislav, MOiRe, Pps, Bshankaran, Anthony, Lupo, HaeB, TexasDex, Guy Peters,

Xanzzibar, Iain.mcclatchie, Pengo, Tobias Bergemann, Applegoddess, Ancheta Wis, Decumanus, Honta, Gbali, Giftlite, Thv, Fennec,

Kenny sh, Netoholic, Abigail-II, Levin, Lupin, Zigger, Everyking, Henry Flower, Guanaco, Eequor, Matt Crypto, Just Another Dan,

Arvind Singh, Wmahan, Neilc, Quackor, Andycjp, Dullhunk, Bact, Kjetil r, Mineminemine, Antandrus, BozMo, Thray, Billposer,

APH, Josephgrossberg, Kntg, Bumm13, Sovereigna, Eiserlohpp, Leire Sánchez, Robin klein, Fvilim~enwiki, Andreas Kaufmann, Zondor, Grunt, EagleOne, Bluemask, Corti, Perl guy, Jwdietrich2, MichaelMcGuﬃn, Smimram, Erc, Discospinster, Leibniz, Notinasnaid,

SocratesJedi, Michael Zimmermann, Mani1, BBB~enwiki, Bender235, ESkog, Android79, Kbh3rd, S.K., Mattingly23, Project2501a,

Relix~enwiki, Barfooz, Linn~enwiki, Barcelova, Briséis~enwiki, RoyBoy, Bookofjude, Matteh, Aaronbrick, Coolcaesar, Bobo192, Smalljim, Shenme, Matt Britt, Maurreen, NattyBumppo, Sam Korn, Haham hanuka, Pearle, Mpeisenbr, Nsaa, Mdd, Passw0rd, Poweroid, Alansohn, Liao, Pinar, Samuel.Jones, Tek022, Jason Davies, Hellhound, TheVenerableBede, Walkerma, InShaneee, Hu, Katefan0, DoesPHalt,

Caesura, Polyphilo, Shinjiman, Wtmitchell, Velella, Shepshep, Cburnett, CloudNine, Mikeo, MIT Trekkie, HenryLi, Bookandcoﬀee,

Oleg Alexandrov, SimonW, Ott, Alex.g, Novacatz, Soultaco, Marasmusine, Woohookitty, Debuggar, Uncle G, Robert K S, Ruud Koot,

JeremyA, Orz, MONGO, Nakos2208~enwiki, Shmitra, Al E., TreveX, Ralﬁpedia, Sega381, Z80x86, Graham87, Qwertyus, Chun-hian,

SixWingedSeraph, OMouse, Reisio, Rjwilmsi, Mayumashu, MarSch, Materdaddy, Nneonneo, Ddawson, Jhballard, Bubba73, Brighterorange, The wub, Mkehrt, Kwharris, Sango123, Oo64eva, Leithp, Sheldrake, FayssalF, Johnnyw, Old Moonraker, Mathbot, Crazycomputers, Vsion, Makkuro, TheDJ, Intgr, SpectrumDT, BMF81, Jersey Devil, Bgwhite, Gwernol, Flcelloguy, Jayme, Eray~enwiki, The

Rambling Man, Wavelength, Spacepotato, Angus Lepper, Phantomsteve, RussBot, Jeﬀhoy, Hyad, Piet Delport, Epolk, SpuriousQ, Thoreaulylazy, Stephenb, Gaius Cornelius, Bovineone, Wimt, Anomalocaris, CarlHewitt, Vanished user kjdioejh329io3rksdkj, Mipadi, Grafen,

Jaxl, Ino5hiro, Bobbo, Hakkinen, Anetode, Yym, Jstrater, Jpbowen, JulesH, E rulez, Petr.adamek, Mgnbar, Tigershrike, Light current,

MCB, Sterling, Shimei, The Fish, Claygate, GraemeL, Joshua bigamo, Bachmann1234, Donhalcon, Katieh5584, Kungfuadam, Junglecat, Zvika, DVD R W, Finell, Hide&Reason, Thijswijs, SmackBot, Wilycoder, Sparkz08, Rtc, Slashme, Zanter, Olorin28, K-UNIT,

McGeddon, Brick Thrower, Mmeri, CapitalSasha, Jpvinall, Powo, Gilliam, Ohnoitsjamie, Skizzik, RickiRich, Tv316, Somewherepurple,

Bluebot, Nympheta, Crashuniverse, Jprg1966, Technotaoist, Miquonranger03, Fluri, LaggedOnUser, Spellchecker, Dzonatas, Krallja,

A. B., Dﬂetter, Rrelf, Fireduck, Can't sleep, clown will eat me, Readams, Andri12, Vanished User 0001, Edivorce, Allan McInnes,

Robma, Cybercobra, Jonovision, “alyosha”, MisterCharlie, Dreadstar, Richard001, Tompsci, Iridescence, Brycedrm, JohnC1987, Ultraexactzz, Sigma 7, Zito ta xania, Fyver528, Nazgul533, Lambiam, ArglebargleIV, SilverStar, Harryboyles, Kuru, Treyt021, IAENG,

AlphaTwo, Msc44, Evanx, IronGargoyle, Edenphd, Physis, Ekrub-ntyh, Ckatz, 16@r, JHunterJ, Slakr, Emerybob, Avs5221, Dicklyon, Tee Owe, Allamericanbear, Eridani, Dhp1080, RichardF, Xionbox, Beefyt, Hu12, Lucid, Levineps, DouglasCalvert, Siebrand,

OnBeyondZebrax, Iridescent, Onestone, Markan~enwiki, Xsmith, Joseph Solis in Australia, Pegasus1138, Aeternus, Igoldste, Crippled Sloth, Courcelles, FairuseBot, Tawkerbot2, Jwalls, CRGreathouse, Ahy1, CBM, Banedon, NaBUru38, NickW557, Requestion,

Leujohn, Myasuda, Simeon, Gregbard, Mac010382, Bobthesmiley, Porco-esphino, Gogo Dodo, Blaisorblade, Christian75, Chrislk02,

Garik, Kozuch, Daven200520, Omicronpersei8, EnglishEfternamn, Epbr123, ClosedEyesSeeing, Hunan131, Headbomb, Newton2, Louis

Waweru, Ideogram, Thiaguwin, Mikeeg555, Druiloor, Klausness, Dawnseeker2000, Escarbot, AntiVandalBot, BokicaK, Luna Santin,

Seaphoto, Olexandr Kravchuk, Poshzombie, Superzohar, Mihas, Kdano, Carewolf, Hermel, JAnDbot, Niaz, Husond, Jimothytrotter,

Nthep, Mark Shaw, Rstevens27, Aviroop Ghosh, Fourchannel, Dream Focus, Bookinvestor, Raanoo, 4jobs, Bongwarrior, VoABot II,

Nyq, Necklace, JamesBWatson, Appraiser, Jlenthe, Cadsuane Melaidhrin, Rivertorch, Nikevich, Indon, Nucleophilic, ArchStanton69,

Allstarecho, Bmeguru, JaGa, Kgﬂeischmann, Esanchez7587, D.h, Calltech, Pavel Jelínek, Gwern, Hdt83, MartinBot, Mouhanad alramli,

Anaxial, CommonsDelinker, Pacdude9, Erkan Yilmaz, J.delanoy, Pedrito, Trusilver, Metamusing, Sandeepgupta, Ps ttf, Maurice Carbonaro, Rodhilton, Mike.lifeguard, Christian Storm, Tparameter, The Transhumanist (AWB), NewEnglandYankee, Hennessey, Patrick,

Brian Pearson, Sanscrit1234, Jevansen, Bonadea, Dzenanz, User77764, Regenspaziergang, Neil Dodgson, Cromoser, Idioma-bot, Sheliak, Wikieditor06, Vranak, 28bytes, Hersfold, Fossum~enwiki, Balaji7, MagicBanana, Barneca, Philip Trueman, TXiKiBoT, Coder Dan,

Austin Henderson, The Original Wildbear, Technopat, Sparkzy, Tomsega, Tms9980, Ocolon, T-Solo202, Ferengi, Metalmaniac69, Jackfork, Psyche825, Noformation, Everything counts, The Divine Fluﬀalizer, ARUNKUMAR P.R, Hankhuck, Andy Dingley, Julcia, Yk

Yk Yk, Wolfrock, Piecemealcranky, Careercornerstone, Lake Greifen, Oldwes, Nighthawk19, Insanity Incarnate, Sebastjanmm, Pjoef,

Palaeovia, E. H.-A. Gerbracht, Demize, NHRHS2010, Matthe20, D. Recorder, S.Örvarr.S, SieBot, EllenPetersen, Dawn Bard, Poisoncarter, Bruchowski, Ham Pastrami, Jerryobject, Happysailor, Flyer22, Radon210, JCLately, JetLover, JSpung, Aruton, Oxymoron83,

Anjin-san, Vpovilaitis, Lightmouse, Poindexter Propellerhead, Ceas webmaster, StaticGull, Mori Riyo~enwiki, Maxime.Debosschere,

Denisarona, Savie Kumara, Kayvan45622, Martarius, Sfan00 IMG, ClueBot, MBD123, Bwfrank, Foxj, The Thing That Should Not Be,

Chocoforfriends, Keeper76, HairyFotr, Diana cionoiu, Meisterkoch, Ndenison, Keraunoscopia, R000t, WDavis1911, Der Golem, Uncle

Milty, Agogino, SuperHamster, Niceguyedc, Zow, Amomam, Darkstar56, Jmcangas, Masterpiece2000, Excirial, Pumpmeup, Bedwanimas214, Diderot’s dreams, Jusdafax, Waiwai933, Farisori, John Nevard, Jakraay, Hezarfenn, Muhandes, Buscalade, Alejandrocaro35,

Sun Creator, Turnipface, Brianbjparker, Hans Adler, Morel, H.Marxen, ChrisHamburg, Thehelpfulone, GlasGhost, La Pianista, Thingg,

Hunhot, PCHS-NJROTC, Apparition11, DumZiBoT, AzraelUK, XLinkBot, Spitﬁre, Pichpich, Mohammadshamma, Rror, Pasha11,

Pimpedshortie99, Dhall1245, Little Mountain 5, Srikant.sharma, Dimoes, MCR789, Skarebo, WikHead, Galzigler, Airplaneman, Branrile09, Ackmenm, Max the tenken, Maimai009, Addbot, Some jerk on the Internet, DOI bot, Farzan mc, Betterusername, Elsendero,

CanadianLinuxUser, MrOllie, Download, LaaknorBot, Favonian, West.andrew.g, 5 albert square, Unknown483, Gusisgay, Cupat07, Systemetsys, Tide rolls, Bﬁgura’s puppy, Verbal, Teles, Jarble, Luckas-bot, Yobot, OrgasGirl, Fraggle81, MarioS, Cyanoa Crylate, SergeyJ,

118

CHAPTER 9. VERTEX (GRAPH THEORY)

Jnivekk, KamikazeBot, Khalfani khaldun, Sajibcse, Backslash Forwardslash, AnomieBOT, DemocraticLuntz, Jim1138, IRP, Galoubet, Royote, JackieBot, 9258fahsﬂkh917fas, Piano non troppo, Danielt998, Law, Flewis, Lilgip01, Giants27, Materialscientist, Rtyq2,

Salem F, Danno uk, Citation bot, Neurolysis, Roxxyroxursox, Quebec99, Xqbot, WikiNSK, Hubbard rox 2008, DSisyphBot, Grim23,

Raj Wijesinghe, Blix1ms0ns, Tyrol5, Miym, Deadbeatatdawn, Лев Дубовой, Shirik, Mathonius, Erstats, Amaury, Doulos Christos,

Dontknoa, Shadowjams, Methcub, CSgroup7, Luminique, Remshad, Velblod, CES1596, ESpublic013, FrescoBot, Skylark2008, Vitomontreal, Tobby72, Mark Renier, ToxicOranges, Recognizance, Vacuunaut, MTizz1, Machine Elf 1735, Louperibot, OgreBot, Citation bot 1, Dilaksan, MacMed, Pinethicket, Kiefer.Wolfowitz, BRUTE, Achraf52, Ezrdr, SpaceFlight89, Talbg, Meaghan, RandomStringOfCharacters, Jauhienij, Weylinp, Keri, Trappist the monk, SchreyP, Si23mk4n32i, Alexmilt, Lotje, Keith Cascio, Thefakeeditor, Ladies gifts, Weedwhacker128, Mttcmbs, Lysander89, Yondonjamts, DARTH SIDIOUS 2, Rednas1234, Saywhatman, Иъ Лю

Ха, Sarang, John.legal, Star-Syrup, Gnabi82, EmausBot, John of Reading, Acather96, WikitanvirBot, Pfuchs722, Surlyduﬀ50, Ibbn,

Tinytn, Xiaogaozi, Pratapy, Solarra, Tommy2010, Lightdarkend, Wikipelli, K6ka, Djembayz, Lucas Thoms, Sciprecision, AaronLLF,

Namastheg, BigMattyO, Cogiati, Spykeretro, Fæ, Josve05a, Bijuro, Steave77, H3llBot, Dennis714, Bveedu, Prashant Dey, Jay-Sebastos,

Vanished user ﬁjtji34toksdcknqrjn54yoimascj, Donner60, Junip~enwiki, Orange Suede Sofa, Rangoon11, Tijfo098, Danushka99999,

Srshetty94, TYelliot, 28bot, BigMatty93, Scotty16-2007, GreenEdu, Petrb, Hughleat, Signalizing, ClueBot NG, LogX, This lousy Tshirt, Satellizer, Sdht, Jcrwaford5, Fauzan, Hon-3s-T, Astew10, Dfarrell07, Bergbra, Rinaku, Cntras, Cnkids, O.Koslowski, Mcasswidmeyer, Widr, Tonywchen, Ashish Gaikwad, Ajjuddn, Lawsonstu, Saketmitra, Jk2q3jrklse, Helpful Pixie Bot, HMSSolent, Jkimdgu,

Wald, Wbm1058, Jiule0, Trunks ishida, Lowercase sigmabot, BG19bot, Furkhaocean, ISTB351, MusikAnimal, J991, Neutral current, FutureTrillionaire, Sickdartzepic, Cadiomals, Mayuri.sandhanshiv, CalaD33, Kairi p, Mihai.stefanache, Salesvery1, Bryson1410,

Zhenyanwang1, Sreedharram, Carso empires, Isacdaavid, Abilngeorge, Klilidiplomus, Pavankbk1113, Anbu121, XIN3N, LloydOlivier,

BattyBot, Computer tower, Mburkhol, Alkafriﬁras, ComputerScienceForum, Valueindian, Fagitcasey, E prosser, Varagrawal, The Illusive Man, GoShow, Chitraproject, JYBot, Tow, Mogism, Mani306, BlackHawkToughbook, Lugia2453, ַאְבָרָהם, Jamesx12345, Elcashini,

Zziccardi, Itoula, Snehlata1102, Ekips39, Faizan, Epicgenius, Babara150504, Crap12321, Littlejimmylel, Maggots187, Perfecshun,

Netiru, Agenbola1, Red-eyed demon, RG3Redskins, Eyesnore, PhantomTech, Tiberius6996, Satassi, Tentinator, Dad29, JpurvisUM,

Nbak, Kanoog, NJIT HUM dad29, Backendgaming, DavidLeighEllis, Diptytiw, Hollylilholly, Sibekoe, Spyglasses,

, Ginsuloft,

Quenhitran, MrLinkinPark333, Dannyruthe, Manul, TCMemoire, Rons corner, Jaaron95, Ritik2345678, Philroc, Sbrankov05, Magicalbeakz1, JaconaFrere, Indiasian mbe maﬁa, 7Sidz, Eaglepuﬀs, Kgeza71, CompSci, Bobobobobobobdup, Monkbot, Wigid, Vieque, Ahsannaweed101, James.hochadel, 1908rs, BobVermont, Swet.anzel mee, NishantRM, Stuartbrade, Chacha2001, Typherix, Crfranklin, Susith

Chandira Gts, Antithesisx, Oﬀy284, Robie024, Nigerhoe, Psychedgrad, ChamithN, Crystallizedcarbon, Prachi2812, Rider ranger47,

Yilinglou, Iman.haghdost, Hansguyholt, Yaourrrt, Pishcal, ErickaAgent, Astrachano, Yuil.Tr, K scheik, Swagkid1010, ABWarrick,

Niceguy69, KasparBot, Jamieddd, PACIFICASIAWiki, Brahma Pacey, Zakzak1112 and Anonymous: 1380

• Discrete mathematics Source: https://en.wikipedia.org/wiki/Discrete_mathematics?oldid=668426222 Contributors: AxelBoldt, General Wesc, Toby Bartels, Miguel~enwiki, Camembert, Bdesham, Michael Hardy, Nixdorf, Wapcaplet, TakuyaMurata, Nanshu, Snoyes,

Jdforrester, Poor Yorick, Rotem Dan, Andres, Mxn, Dgreen34, Charles Matthews, Dysprosia, McKay, Jni, Phil Boswell, Robbot, Josh

Cherry, R3m0t, Gandalf61, Tobias Bergemann, Marc Venot, Giftlite, DocWatson42, Nick8325, Tom harrison, Ds13, Guanaco, David

Battle, Knutux, APH, TiMike, Peter Kwok, Discospinster, Rich Farmbrough, Agnistus, Vsmith, Mani1, Goochelaar, ESkog, ZeroOne,

Zaslav, Kosmotheoria, Edwinstearns, RoyBoy, Obradovic Goran, Jumbuck, Msh210, Shoeﬂy, Igorpak, Oleg Alexandrov, Nahabedere,

MZMcBride, Bubba73, FlaBot, Jittat~enwiki, Psantora, Chobot, Kummi, YurikBot, Hairy Dude, Michael Slone, Hede2000, Bhny,

Stephenb, Chaos, Grafen, Trovatore, ZacBowling, Zwobot, Klutzy, Bbaumer, SimonMorgan, Majkl82, Sardanaphalus, JJL, SmackBot,

GoOdCoNtEnT, Kurykh, Silly rabbit, Taxipom, JonHarder, Jon Awbrey, N Shar, Drunken Pirate, Vriullop, JoshuaZ, MTSbot~enwiki,

Aeons, Dlohcierekim, JForget, CRGreathouse, Albert.white, Basawala, WeggeBot, Werratal, NoUser, Gregbard, Mike Christie, Christian75, Chrislk02, Epbr123, Hazmat2, Marek69, AntiVandalBot, JustOneJake, Seaphoto, JAnDbot, MER-C, The Transhumanist, Thenub314,

Avjoska, Jakob.scholbach, SwiftBot, BBar, David Eppstein, Vigyani, Tgeairn, Coppertwig, The Transhumanist (AWB), Ac3bf1, JohnBlackburne, LokiClock, VasilievVV, TXiKiBoT, Crohnie, Anna Lincoln, MarkMarek, Wikiisawesome, Lerdthenerd, Dmcq, Ohiostandard, EmxBot, Radagast3, Ivan Štambuk, Gerakibot, Atlandau, Xelgen, Dcspc, Jorgen W, ClueBot, Justin W Smith, The Thing That

Should Not Be, ChandlerMapBot, Alexbot, Muhandes, Cenarium, Bos Gaurus, PCHS-NJROTC, Marc van Leeuwen, Addbot, NjardarBot, MrVanBot, AndersBot, West.andrew.g, Teles, Zorrobot, Luckas-bot, Yobot, KamikazeBot, AnomieBOT, 1exec1, Galoubet,

JackieBot, Rtyq2, Citation bot, Wrelwser43, ArthurBot, Xqbot, Hydrox24, Tyrol5, GrouchoBot, Deadbeatatdawn, Shirik, Point-set

topologist, Charvest, Shadowjams, Adrignola, FrescoBot, Mark Renier, Citation bot 1, Kiefer.Wolfowitz, Carlc75, Mercy11, Trappist

the monk, Hurricoaster, Gf uip, EmausBot, Racerx11, Wikipelli, Darkﬁght, Bethnim, Akutagawa10, Werieth, Bollyjeﬀ, TomasMartin,

Lorem Ip, EdoBot, Petrb, ClueBot NG, Raiden10, Wcherowi, Bped1985, Rocker202, Jorgenev, Helpful Pixie Bot, Discreto, Cgnorthcutt,

Paolo Lipparini, SoniyaR, Ved.rigved, Yogesh2611, Brad7777, EricEnfermero, Cleanton, ChrisGualtieri, TheJJJunk, Frosty, Dhriti pati

sarkar 1641981, BenCluﬀ, GrettoBob, Jianhui67, Lagoset, Monkbot, SoSivr and Anonymous: 194

• Glossary of graph theory Source: https://en.wikipedia.org/wiki/Glossary_of_graph_theory?oldid=666492791 Contributors: Damian

Yerrick, XJaM, Nonenmac, Tomo, Edward, Patrick, Michael Hardy, Wshun, Booyabazooka, Dcljr, TakuyaMurata, GTBacchus, Eric119,

Charles Matthews, Dcoetzee, Dysprosia, Doradus, Reina riemann, Markhurd, Maximus Rex, Hyacinth, Populus, Altenmann, MathMartin,

Bkell, Giftlite, Dbenbenn, Brona, Sundar, GGordonWorleyIII, HorsePunchKid, Peter Kwok, D6, Rich Farmbrough, ArnoldReinhold, Paul

August, Bender235, Zaslav, Kjoonlee, Elwikipedista~enwiki, El C, Yitzhak, TheSolomon, A1kmm, 3mta3, Jérôme, Ricky81682, Rdvdijk, Oleg Alexandrov, Joriki, Linas, MattGiuca, Ruud Koot, Jwanders, Xiong, Lasunncty, SixWingedSeraph, Grammarbot, Tizio, Salix

alba, Mathbot, Margosbot~enwiki, Sunayana, Pojo, Quuxplusone, Vonkje, N8wilson, Chobot, Algebraist, YurikBot, Me and, Altoid,

Grubber, Archelon, Gaius Cornelius, Rick Norwood, Ott2, Closedmouth, SmackBot, Stux, Achab, Brick Thrower, Mgreenbe, Mcld,

[email protected], Lansey, Thechao, JLeander, DVanDyck, Quaeler, RekishiEJ, CmdrObot, Csaracho, Citrus538, Jokes Free4Me,

Cydebot, Starcrab, Quintopia, Ferris37, Scarpy, Headbomb, Salgueiro~enwiki, Spanningtree, David Eppstein, JoergenB, Kope, CopyToWiktionaryBot, R'n'B, Leyo, Mikhail Dvorkin, The Transliterator, Ratfox, MentorMentorum, Skaraoke, SanitySolipsism, Anonymous

Dissident, PaulTanenbaum, Ivan Štambuk, Whorush, Eggwadi, Thehotelambush, Doc honcho, Anchor Link Bot, Rsdetsch, Denisarona,

Justin W Smith, Unbuttered Parsnip, Happynomad, Alexey Muranov, Addbot, Aarond144, Jﬁtzell, Nate Wessel, Yobot, Jalal0, Ian Kelling,

Citation bot, Buenasdiaz, Twri, Kinewma, Miym, Prunesqualer, Mzamora2, JZacharyG, Pmq20, Shadowjams, Hobsonlane, DixonDBot, Reaper Eternal, EmausBot, John of Reading, Wikipelli, Bethnim, Mastergreg82, ClueBot NG, EmanueleMinotto, Warumwarum,

DavidRideout, BG19bot, Andrey.gric, Szebenisz, ChrisGualtieri, Deltahedron, Jw489kent, Jmerm, Morgoth106, SofjaKovalevskaja and

Anonymous: 131

• Graph (mathematics) Source: https://en.wikipedia.org/wiki/Graph_(mathematics)?oldid=668424269 Contributors: The Anome, Manning Bartlett, XJaM, Tomo, Stevertigo, Patrick, Michael Hardy, W~enwiki, Zocky, Wshun, Booyabazooka, Karada, Ahoerstemeier, Den

fjättrade ankan~enwiki, Jiang, Dcoetzee, Dysprosia, Doradus, Zero0000, McKay, BenRG, Robbot, LuckyWizard, Mountain, Altenmann,

9.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES

119

Mayooranathan, Gandalf61, MathMartin, Timrollpickering, Bkell, Tobias Bergemann, Tosha, Giftlite, Dbenbenn, Harp, Tom harrison,

Chinasaur, Jason Quinn, Matt Crypto, Neilc, Erhudy, Knutux, Yath, Joeblakesley, Tomruen, Peter Kwok, Aknorals, Chmod007, Abdull, Corti, PhotoBox, Discospinster, Rich Farmbrough, Andros 1337, Paul August, Zaslav, Gauge, Tompw, Crisóﬁlax, Yitzhak, Kine,

Bobo192, Jpiw~enwiki, Mdd, Jumbuck, Zachlipton, Sswn, Liao, Rgclegg, Paleorthid, Super-Magician, Mahanga, Joriki, Mindmatrix,

Wesley Moy, Oliphaunt, Brentdax, Jwanders, Tbc2, Cbdorsett, Ch'marr, Davidfstr, Xiong, Marudubshinki, Tslocum, Magister Mathematicae, Ilya, SixWingedSeraph, Sjö, Rjwilmsi, Salix alba, Bhadani, FlaBot, Nowhither, Mathbot, Gurch, MikeBorkowski~enwiki,

Chronist~enwiki, Silversmith, Chobot, Peterl, Siddhant, Borgx, Karlscherer3, Hairy Dude, Gene.arboit, Michael Slone, Gaius Cornelius,

Shanel, Gwaihir, Dtrebbien, Dureo, Doetoe, Wknight94, Netrapt, RobertBorgersen, Cjfsyntropy, Burnin1134, SmackBot, Nihonjoe,

Stux, McGeddon, BiT, Algont, Ohnoitsjamie, Chris the speller, Bluebot, TimBentley, Theone256, Cornﬂake pirate, Zven, Anabus, Can't

sleep, clown will eat me, Tamfang, Cybercobra, Jon Awbrey, Kuru, Nat2, Tomhubbard, Dicklyon, Cbuckley, Quaeler, BranStark, Wandrer2, George100, Ylloh, Vaughan Pratt, Repied, CRGreathouse, Citrus538, Jokes Free4Me, Requestion, Myasuda, Danrah, Robertsteadman, Eric Lengyel, Headbomb, Urdutext, AntiVandalBot, Hannes Eder, JAnDbot, MER-C, Dreamster, Struthious Bandersnatch, JNW,

Catgut, David Eppstein, JoergenB, MartinBot, Rettetast, R'n'B, J.delanoy, Hans Dunkelberg, Yecril, Pafcu, Ijdejter, Deor, ABF, Maghnus, TXiKiBoT, Sdrucker, Someguy1221, PaulTanenbaum, Lambyte, Ilia Kr., Jpeeling, Falcon8765, RaseaC, Insanity Incarnate, Zenek.k,

Radagast3, Debamf, Debeolaurus, SieBot, Minder2k, Dawn Bard, Cwkmail, Jon har, SophomoricPedant, Oxymoron83, Henry Delforn

(old), Ddxc, Svick, Phegyi81, Anchor Link Bot, ClueBot, Vacio, Nsk92, JuPitEer, Huynl, JP.Martin-Flatin, Xavexgoem, UKoch, Mitmaro, Editor70, Watchduck, Hans Adler, Suchap, Wikidsp, Muro Bot, 3ICE, Aitias, Versus22, Djk3, Kruusamägi, SoxBot III, XLinkBot,

Marc van Leeuwen, Libcub, WikiDao, Tangi-tamma, Addbot, Gutin, Athenray, Willking1979, Royerloic, West.andrew.g, Tyw7, Zorrobot, LuK3, Luckas-bot, Yobot, TaBOT-zerem, THEN WHO WAS PHONE?, E mraedarab, Tempodivalse, Пика Пика, Ulric1313,

RandomAct, Materialscientist, Twri, Dockﬁsh, Anand jeyahar, Miym, Prunesqualer, Andyman100, VictorPorton, JonDePlume, Shadowjams, A.amitkumar, Kracekumar, Edgars2007, Citation bot 1, DrilBot, Amintora, Pinethicket, Calmer Waters, RobinK, Barras, Tgv8925,

DARTH SIDIOUS 2, Powerthirst123, DRAGON BOOSTER, Mymyhoward16, Kerrick Staley, Ajraddatz, Wgunther, Bethnim, Akutagawa10, White Trillium, Josve05a, D.Lazard, L Kensington, Maschen, Inka 888, Chewings72, ClueBot NG, Wcherowi, MelbourneStar,

Kingmash, O.Koslowski, Joel B. Lewis, Andrewsky00, Timﬂutre, Helpful Pixie Bot, HMSSolent, Grolmusz, Mrjohncummings, Stevetihi,

Канеюку, Void-995, MRG90, Vanischenu, Tman159, Ekren, Lugia2453, Jeﬀ Erickson, CentroBabbage, Nina Cerutti, Chip Wildon

Forster, Yloreander, Manul, JaconaFrere, Monkbot, Hou710, Anon124 and Anonymous: 351

• Graph theory Source: https://en.wikipedia.org/wiki/Graph_theory?oldid=667682086 Contributors: AxelBoldt, Kpjas, LC~enwiki, Robert

Merkel, Zundark, Taw, Jeronimo, BlckKnght, Dze27, Oskar Flordal, Andre Engels, Karl E. V. Palmen, Shd~enwiki, XJaM, JeLuF,

Arvindn, Gianfranco, Matusz, PierreAbbat, Miguel~enwiki, Boleslav Bobcik, FvdP, Camembert, Hirzel, Tomo, Patrick, Chas zzz brown,

Michael Hardy, Wshun, Booyabazooka, Glinos, Meekohi, Jakob Voss, TakuyaMurata, GTBacchus, Grog~enwiki, Pcb21, Dgrant, CesarB,

Looxix~enwiki, Ellywa, Ams80, Ronz, Nanshu, Gyan, Nichtich~enwiki, Mark Foskey, Александър, Poor Yorick, Caramdir~enwiki,

Mxn, Charles Matthews, Berteun, Almi, Hbruhn, Dysprosia, Daniel Quinlan, Gutza, Doradus, Zoicon5, Roachmeister, Populus, Zero0000,

Doctorbozzball, McKay, Shizhao, Optim, Robbot, Brent Gulanowski, Fredrik, Altenmann, Dittaeva, Gandalf61, MathMartin, Sverdrup,

Puckly, KellyCoinGuy, Thesilverbail, Bkell, Paul Murray, Fuelbottle, ElBenevolente, Aknxy, Dina, Tobias Bergemann, Giftlite, Dbenbenn, Thv, The Cave Troll, Elf, Lupin, Brona, Pashute, Duncharris, Andris, Jorge Stolﬁ, Tyir, Sundar, GGordonWorleyIII, Alan Au,

Bact, Knutux, APH, Tomruen, Tyler McHenry, Naerbnic, Peter Kwok, Robin klein, Ratiocinate, Andreas Kaufmann, Chmod007, Madewokherd, Discospinster, Solitude, Guanabot, Qutezuce, Mani1, Paul August, Bender235, Zaslav, Tompw, Diego UFCG~enwiki, Chalst,

Shanes, Renice, C S, Csl77, Jojit fb, Photonique, Jonsafari, Obradovic Goran, Jumbuck, Msh210, Alansohn, Liao, Mailer diablo, Marianocecowski, Aquae, Blair Azzopardi, Oleg Alexandrov, Youngster68, Linas, LOL, Ruud Koot, Tckma, Astrophil, Davidfstr, GregorB,

SCEhardt, Stochata, Xiong, Graham87, Magister Mathematicae, SixWingedSeraph, Rjwilmsi, Gmelli, George Burgess, Eugeneiiim, Arbor, Kalogeropoulos, Fred Bradstadt, FayssalF, FlaBot, PaulHoadley, RexNL, Vonkje, Chobot, Jinma, YurikBot, Wavelength, Michael

Slone, Gaius Cornelius, Alex Bakharev, Morphh, SEWilcoBot, Jaxl, Ino5hiro, Xdenizen, Daniel Mietchen, Shepazu, Rev3nant, Lt-wikibot, Jwissick, Arthur Rubin, LeonardoRob0t, Agro1986, Eric.weigle, Allens, Sardanaphalus, Melchoir, Brick Thrower, Ohnoitsjamie, Oli

Filth, OrangeDog, Taxipom, DHN-bot~enwiki, Tsca.bot, Onorem, GraphTheoryPwns, Lpgeﬀen, Jon Awbrey, Henning Makholm, Mlpkr,

SashatoBot, Whyﬁsh, Disavian, MynameisJayden, Idiosyncratic-bumblebee, Dicklyon, Quaeler, Lanem, Tawkerbot2, Ylloh, Mahlerite,

CRGreathouse, Dycedarg, Requestion, Bumbulski, Myasuda, RUVARD, The Isiah, Ntsimp, Abeg92, Corpx, DumbBOT, Anthonynow12,

Thijs!bot, Jheuristic, King Bee, Pstanton, Hazmat2, Headbomb, Marek69, Eleuther, AntiVandalBot, Whiteknox, Hannes Eder, Spacefarer, Myanw, JAnDbot, MER-C, Igodard, Restname, Sangak, Tmusgrove, Feeeshboy, Usien6, Ldecola, David Eppstein, Kope, DerHexer, Oroso, MartinBot, R'n'B, Uncle Dick, Joespiﬀ, Ignatzmice, Shikhar1986, Tarotcards, Policron, XxjwuxX, Yecril, JohnBlackburne,

Dggreen, Anonymous Dissident, Alcidesfonseca, Anna Lincoln, Ocolon, Magmi, PaulTanenbaum, Geometry guy, Fivelittlemonkeys, Sacredmint, Spitﬁre8520, Radagast3, SieBot, Dawn Bard, Toddst1, Jon har, Bananastalktome, Titanic4000, Beda42, Maxime.Debosschere,

Damien Karras, ClueBot, DFRussia, PipepBot, Justin W Smith, Vacio, Wraithful, Garyzx, Mild Bill Hiccup, DragonBot, Fchristo, Hans

Adler, Dafyddg, Razorﬂame, Rmiesen, Kruusamägi, Pugget, Darkicebot, BodhisattvaBot, Tangi-tamma, Addbot, Dr.S.Ramachandran,

Cerber, DOI bot, Ronhjones, Low-frequency internal, CanadianLinuxUser, Protonk, LaaknorBot, Smoke73, Delaszk, Favonian, Maurobio, Lightbot, Jarble, Ettrig, Luckas-bot, Yobot, Kilom691, Trinitrix, Jean.julius, AnomieBOT, Womiller99, Sonia, Jim1138, Piano non

troppo, Gragragra, RandomAct, Citation bot, Ayda D, Xqbot, Jerome zhu, Capricorn42, Nasnema, Miym, GiveAFishABone, RibotBOT,

Jalpar75, Aaditya 7, Ankitbhatt, FrescoBot, Mark Renier, SlumdogAramis, Citation bot 1, Sibian, Pinethicket, RobinK, Wsu-dm-jb,

D75304, Wsu-f, Xnn, Obankston, Andrea105, RjwilmsiBot, TjBot, Powerthirst123, Aaronzat, EmausBot, Domesticenginerd, EleferenBot, Jmencisom, Slawekb, Akutagawa10, D.Lazard, Netha Hussain, Tolly4bolly, ChuispastonBot, EdoBot, ClueBot NG, Watersmeetfreak, Matthiaspaul, MelbourneStar, Outraged duck, OMurgo, Bazuz, Aks1521, Masssly, Joel B. Lewis, Johnsopc, HMSSolent, 4368a,

BG19bot, Ajweinstein, Канеюку, MusikAnimal, AvocatoBot, Bereziny, Brad7777, Soﬁa karampataki, ChrisGualtieri, GoShow, Dexbot,

Cerabot~enwiki, Omgigotanaccount, Wikiisgreat123, Faizan, Maxwell bernard, Bg9989, Zsoftua, SakeUPenn, Yloreander, StaticElectricity, Gold4444, Cyborgbadger, Zachwaltman, Gr pbi, KasparBot and Anonymous: 378

• Loop (graph theory) Source: https://en.wikipedia.org/wiki/Loop_(graph_theory)?oldid=640385005 Contributors: Booyabazooka, McKay,

MathMartin, Paul August, Cburnett, Oliphaunt, Dmharvey, Dtrebbien, Gadget850, SmackBot, BiT, Tsca.bot, Lambiam, 16@r, CmdrObot, Letranova, Kylemahan, David Eppstein, Rovnet, ClueBot, JP.Martin-Flatin, Addbot, Twri, Asfarer, FrescoBot, Ricardo Ferreira

de Oliveira, Sinuhe20, MerlIwBot, Ibraheemmoosa and Anonymous: 6

• Mathematics Source: https://en.wikipedia.org/wiki/Mathematics?oldid=667759115 Contributors: AxelBoldt, Magnus Manske, LC~enwiki,

Brion VIBBER, Eloquence, Mav, Bryan Derksen, Zundark, The Anome, Tarquin, Koyaanis Qatsi, Ap, Gareth Owen, -- April, RK,

Iwnbap, LA2, Youssefsan, XJaM, Arvindn, Christian List, Matusz, Toby Bartels, PierreAbbat, Little guru, Miguel~enwiki, Rade Kutil,

DavidLevinson, FvdP, Daniel C. Boyer, David spector, Camembert, Netesq, Zippy, Olivier, Ram-Man, Stevertigo, Spiﬀ~enwiki, Edward,

Quintessent, Ghyll~enwiki, D, Chas zzz brown, JohnOwens, Michael Hardy, Booyabazooka, JakeVortex, Lexor, Isomorphic, Dominus,

120

CHAPTER 9. VERTEX (GRAPH THEORY)

Nixdorf, Grizzly, Kku, Mic, Ixfd64, Firebirth, Alireza Hashemi, Dcljr, Sannse, TakuyaMurata, Karada, Minesweeper, Alﬁo, Tregoweth,

Dgrant, CesarB, Ahoerstemeier, Cyp, Ronz, Muriel Gottrop~enwiki, Snoyes, Notheruser, Angela, Den fjättrade ankan~enwiki, Kingturtle, LittleDan, Kevin Baas, Salsa Shark, Glenn, Jschwa1, Bogdangiusca, BenKovitz, Poor Yorick, Rossami, Tim Retout, Rotem Dan,

Evercat, Rl, Jonik, Madir, Mxn, Smack, Silverﬁsh, Vargenau, Pizza Puzzle, Nikola Smolenski, Charles Matthews, Guaka, Timwi, Spacemonkey~enwiki, Nohat, Ralesk, MarcusVox, Dysprosia, Jitse Niesen, Fuzheado, Gutza, Piolinfax, Selket, DJ Clayworth, Markhurd, Vancouverguy, Tpbradbury, Maximus Rex, Hyacinth, Saltine, AndrewKepert, Fibonacci, Zero0000, Phys, Ed g2s, Wakka, Samsara, Bevo,

McKay, Traroth, Fvw, Babaloulou, Secretlondon, Jusjih, Cvaneg, Flockmeal, Guppy, Francs2000, Dmytro, Lumos3, Jni, PuzzletChung,

Donarreiskoﬀer, Robbot, Fredrik, RedWolf, Peak, Romanm, Lowellian, Gandalf61, Georg Muntingh, Merovingian, HeadCase, Sverdrup,

Henrygb, Academic Challenger, IIR, Thesilverbail, Hadal, Mark Krueger, Wereon, Robinh, Borislav, GarnetRChaney, Ilya (usurped),

Michael Snow, Fuelbottle, ElBenevolente, Lupo, PrimeFan, Zhymkus~enwiki, Dmn, Cutler, Dina, Mlk, Alan Liefting, Rock69~enwiki,

Cedars, Ancheta Wis, Fabiform, Centrx, Giftlite, Dbenbenn, Christopher Parham, Fennec, Markus Krötzsch, Mikez, Inter, Wolfkeeper,

Ævar Arnfjörð Bjarmason, Netoholic, Lethe, Tom harrison, Lupin, MathKnight, Bﬁnn, Ayman, Everyking, No Guru, Curps, Jorend, Ssd,

Niteowlneils, Gareth Wyn, Andris, Guanaco, Sundar, Daniel Brockman, Siroxo, Node ue, Eequor, Arne List, Matt Crypto, Python eggs,

Avala, Jackol, Marlonbraga, Bobblewik, Deus Ex, Golbez, Gubbubu, Kennethduncan, Cap601, Geoﬀspear, Utcursch, Andycjp, CryptoDerk, LucasVB, Quadell, Frogjim~enwiki, Antandrus, BozMo, Rajasekaran Deepak, Beland, WhiteDragon, Bcameron54, Kaldari,

PDH, Profvk, Jossi, Alexturse, Adamsan, CSTAR, Rdsmith4, APH, John Foley, Elektron, Pethan, Mysidia, Pmanderson, Elroch, Sam

Hocevar, Arcturus, Gscshoyru, Stephen j omalley, Jcw69, Ukexpat, Eduardoporcher, Qef, Random account 47, Zondor, Adashiel, Trevor

MacInnis, Grunt, Kate, Bluemask, PhotoBox, Mike Rosoft, Vesta~enwiki, Shahab, Oskar Sigvardsson, Brianjd, D6, CALR, DanielCD,

Olga Raskolnikova, EugeneZelenko, Discospinster, Rich Farmbrough, Guanabot, FiP, Clawed, Inkypaws, Spundun, Andrewferrier, ArnoldReinhold, HeikoEvermann, Smyth, Notinasnaid, AlanBarrett, Paul August, MarkS, DcoetzeeBot~enwiki, Bender235, ESkog, Geoking66, Ben Standeven, Tompw, GabrielAPetrie, RJHall, MisterSheik, Mr. Billion, El C, Chalst, Shanes, Haxwell, Briséis~enwiki, Art

LaPella, RoyBoy, Lyght, Jpgordon, JRM, Porton, Bobo192, Ntmatter, Fir0002, Mike Schwartz, Wood Thrush, Func, Teorth, Flxmghvgvk,

Archfalhwyl, Jung dalglish, Maurreen, Man vyi, Alphax, Rje, Sam Korn, Krellis, Sean Kelly, Jonathunder, Mdd, Tsirel, Passw0rd,

Lawpjc, Vesal, Storm Rider, Stephen G. Brown, Danski14, Msh210, Poweroid, Alansohn, Gary, JYolkowski, Anthony Appleyard,

Blackmail~enwiki, Mo0, Polarscribe, ChristopherWillis, Lordthees, Rgclegg, Jet57, Muﬃn~enwiki, Mmmready, Riana, AzaToth, Lectonar, Lightdarkness, Giant toaster, Cjnm, Mysdaao, Hu, Malo, Avenue, Blobglob, LavosBacons, Schapel, Orionix, BanyanTree, Saga

City, Knowledge Seeker, ReyBrujo, Danhash, Garzo, Huerlisi, Jon Cates, RainbowOfLight, CloudNine, TenOfAllTrades, Mcmillin24,

Bsadowski1, Itsmine, Blaxthos, HenryLi, Bookandcoﬀee, Kz8, Oleg Alexandrov, Ashujo, Stemonitis, Novacatz, Angr, DealPete, Kelly

Martin, Wikiworkerindividual***, TSP, OwenX, Woohookitty, Linas, Masterjamie, Yansa, Brunnock, Carcharoth, BillC, Ruud Koot,

WadeSimMiser, Orz, Hdante, MONGO, Mpatel, Abhilaa, Al E., Wikiklrsc, Bbatsell, Damicatz, Terence, MFH, Sengkang, Zzyzx11,

Noetica,

, Xiong Chiamiov, Gimboid13, Liface, Asdfdsa, PeregrineAY, Thirty-seven, Graham87, Magister Mathematicae,

BD2412, Chun-hian, FreplySpang, JIP, Island, Zoz, Icey, BorgHunter, Josh Parris, Paul13~enwiki, Rjwilmsi, Mayumashu, MJSkia1, Prateekrr, Vary, MarSch, Amire80, Tangotango, Staecker, Omnieiunium, Salix alba, Tawker, Zhurovai, Crazynas, Ligulem, Juan Marquez,

Slac, R.e.b., The wub, Sango123, Yamamoto Ichiro, Kasparov, Staples, Titoxd, Pruneau, RobertG, Latka, Mathbot, Harmil, Narxysus,

Andy85719, RexNL, Gurch, Short Verses, Quuxplusone, Celendin, Ichudov, Jagginess, Alphachimp, Malhonen, David H Braun (1964),

Snailwalker, Mongreilf, Chobot, Jersey Devil, DONZOR, DVdm, Cactus.man, John-Haggerty, Gwernol, Elfguy, Buggi22, Roboto de

Ajvol, Raelx, JPD, YurikBot, Wavelength, Karlscherer3, Jeremybub, Doug Alford, Grifter84, RobotE, Elapsed, Dmharvey, Gmackematix, 4C~enwiki, RussBot, Michael Slone, Geologician, Red Slash, Jtkiefer, Muchness, Anonymous editor, Albert Einsteins pipe,

Nobs01, Soltras, Bhny, Piet Delport, CanadianCaesar, Polyvios, Akamad, Stephenb, Yakuzai, Sacre, Bovineone, Tungsten, Ugur Basak,

David R. Ingham, NawlinWiki, Vanished user kjdioejh329io3rksdkj, Rick Norwood, Misos, SEWilcoBot, Wiki alf, Mipadi, Armindo,

Deskana, Johann Wolfgang, Trovatore, Joel7687, GrumpyTroll, LMSchmitt, Schlaﬂy, Eighty~enwiki, Herve661, JocK, Mccready, Tearlach, Apokryltaros, JDoorjam, Abb3w, Misza13, My Cat inn, Vikvik, Mvsmith, Brucevdk, DryaUnda, SFC9394, Font, Tachyon01,

Mgnbar, Jemebius, Nlu, Mike92591, Dna-webmaster, Tonywalton, Joshurtree, Wknight94, Pooryorick~enwiki, Avraham, Mkns, Googl,

Noosfractal, SimonMorgan, Tigershrike, FF2010, Cursive, Scheinwerfermann, Enormousdude, TheKoG, Donald Albury, Zsynopsis,

Skullﬁssion, Claygate, MaNeMeBasat, GraemeL, JoanneB, Bentong Isles, Donhalcon, JLaTondre, Jaranda, Spliﬀy, Flowersofnight, 158152-12-77, RunOrDie, Kungfuadam, Canadianism, Ben D., Greatal386, JDspeeder1, Saboteur~enwiki, Asterion, Shmm70, Pentasyllabic, Lunch, DVD R W, Finell, Capitalist, Sardanaphalus, Crystallina, JJL, SmackBot, RDBury, YellowMonkey, Selfworm, Smitz, Bobet,

Diggyba, Warhawkhalo101, Estoy Aquí, Reedy, Tarret, KnowledgeOfSelf, Royalguard11, Melchoir, McGeddon, Pavlovič, Masparasol,

Pgk, C.Fred, AndyZ, Kilo-Lima, Jagged 85, PizzaMargherita, CapitalSasha, Antibubbles, AnOddName, Canthusus, BiT, Nscheﬀey,

Amystreet, Ekilfeather, Papep, Jaichander, Ohnoitsjamie, Hmains, Skizzik, Richﬁfe, ERcheck, Hopper5, Squiddy, Armeria, Durova,

Qtoktok, Wigren, Keegan, Woofboy, Rmt2m, Fplay, Christopher denman, Miquonranger03, MalafayaBot, Silly rabbit, Alink, Dlohcierekim’s sock, Richard Woods, Kungming2, Go for it!, Baa, Rdt~enwiki, Spellchecker, Baronnet, Colonies Chris, Ulises Sarry~enwiki,

Nevada, Zachorious, Chendy, J•A•K, Can't sleep, clown will eat me, RyanEberhart, Timothy Clemans, Милан Јелисавчић, TheGerm,

HoodedMan, Chlewbot, Vanished User 0001, Joshua Boniface, TheKMan, Rrburke, Addshore, Mr.Z-man, SundarBot, AndySimpson,

Emre D., Iapetus, Jwy, CraigDesjardins, Daqu, Nakon, VegaDark, Jiddisch~enwiki, Maxwahrhaftig, Salt Yeung, Danielkwalsh, Diocles,

Pg2114, Jon Awbrey, Ruwanraj, Jklin, Xen 1986, Just plain Bill, Knuckles sonic8, Where, Bart v M, ScWizard, Pilotguy, Nov ialiste,

JoeTrumpet, Math hater, Lambiam, Nishkid64, TachyonP, ArglebargleIV, Doug Bell, Harryboyles, Srikeit, Dbtfz, Kuru, JackLumber, Simonkoldyk, Vgy7ujm, Nat2, Cronholm144, Heimstern, Gobonobo, Mﬁshergt, Coastergeekperson04, Sir Nicholas de Mimsy-Porpington,

Dumelow, Jazriel, Gnevin, Unterdenlinden, Ckatz, Loadmaster, Special-T, Dozing, Mr Stephen, Mudcower, AxG, Optakeover, SandyGeorgia, Mets501, Funnybunny, Markjdb, Ryulong, Gﬀ~enwiki, RichardF, Limaner, Jose77, Asyndeton, Stephen B Streater, Politepunk,

DabMachine, Levineps, Hetar, BranStark, Roland Deschain, Kevlar992, Iridescent, K, Kencf0618, Zootsuits, Onestone, Nilamdoc, C.

Lee, CzarB, Polymerbringer, Joseph Solis in Australia, Newone, White wolf753, Muéro, David Little, Igoldste, Amakuru, Marysunshine,

Maelor, Masshaj, Jatrius, Experiment123, Tawkerbot2, Daniel5127, Joshuagross, Emote, Pikminiman, Heyheyhey99, JForget, Smkumar0, Sakowski, Wolfdog, Sleeping123, CRGreathouse, Wafulz, Sir Vicious, Triage, Iced Kola, CBM, Page Up, Jester-Tester, Taylorhewitt, Nczempin, GHe, Green caterpillar, Phanu9000, Yarnalgo, Thomasmeeks, McVities, Requestion, FlyingToaster, MarsRover,

Tac-Tics, Some P. Erson, Tim1988, Tuluat, Alaymehta, MrFish, Oo7565, Gregbard, Captmog, El3m3nt09, Antiwiki~enwiki, Cydebot,

Meznaric, Cantras, Funwithbig, MC10, Meno25, Gogo Dodo, DVokes, ST47, Srinath555, Pascal.Tesson, Goldencako, Benjiboi, Andrewm1986, Michael C Price, Tawkerbot4, Dragomiloﬀ, Juansempere, M a s, Chrislk02, Brotown3, Mamounjo, 5300abc, Roccorossi,

Abtract, Daven200520, Omicronpersei8, Vanished User jdksfajlasd, Daniel Olsen, Ventifact, TAU710, Aditya Kabir, BetacommandBot,

Thijs!bot, Epbr123, Bezking, Jpark3591, Daemen, TheEmaciatedStilson, MCrawford, Opabinia regalis, Mattyboy500, Kilva, Daniel,

Loudsox, Ucanlookitup, Hazmat2, Wootwootwoot, Brian G. Wilson, Timo3, Mojo Hand, Djfeldman, Pjvpjv, West Brom 4ever, John254,

Alientraveller, Mnemeson, Ollyrobotham, BadKarma14, Sethdoe92, Dfrg.msc, RobHar, CharlotteWebb, Dawnseeker2000, RoboServien,

Escarbot, Itsfrankie1221, Thomaswgc, Thadius856, Sidasta, AntiVandalBot, Ais523, RobotG, Gioto, Luna Santin, Dark Load, DarkAu-

9.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES

121

dit, Ringleader1489, Dylan Lake, Doktor Who, Chill doubt, AxiomShell, Abc30, Matheor, Archmagusrm, Falconleaf, Labongo, Spacefarer, Chocolatepizza, JAnDbot, Kaobear, MyNamesLogan, MER-C, The Transhumanist, Db099221, AussieOzborn au, Thenub314,

Mosesroses, Hut 8.5, Kipholbeck, Xact, Twospoonfuls, .anacondabot, Yahel Guhan, Bencherlite, Yurei-eggtart, Bongwarrior, VoABot

II, JamesBWatson, Swpb, EdwardLockhart, SineWave, Charlielee111, Cic, Ryeterrell, Caesarjbsquitti, Wikiwhat?, Bubba hotep, KConWiki, Meb43, Faustnh, Hiplibrarianship, Johnbibby, Seberle, MetsBot, Pawl Kennedy, 28421u2232nfenfcenc, Systemlover, Bmeguru,

Hotmedal, Just James, EstebanF, Glen, Rajpaj, Memorymentor, TheRanger, Calltech, Gun Powder Ma, Welshleprechaun, Robin S,

Seba5618, SquidSK, 0612, J0equ1nn, Riccardobot, Jtir, Hdt83, MartinBot, Vladimir m, Arjun01, Quanticle, Nocklas, Rettetast, Fuzzyhair2, R'n'B, Pbroks13, Cmurphy au, Snozzer, Ben2then, PrestonH, Crazybobson, Thefutureschannel, RockMFR, Hrishikesh.24889,

J.delanoy, Nev1, Unlockitall, Phoenix1177, Numbo3, Sp3000, Maurice Carbonaro, Nigholith, Hellonicole, -jmac-, Boris Allen, 2boobies, Jerry, TheSeven, NerdyNSK, Syphertext, Yadar677, Taop, G. Campbell, Wayp123, Keesiewonder, Matt1314, Ksucemfof, Gzkn,

Ivelnaps, Smeira, DarkFalls, Thomas Larsen, Vishi-vie, Washington8785, Xyzaxis, Arkuski, JDQuimby, Batmanfan77, Alphapeta, Trd89,

HiLo48, The Transhumanist (AWB), NewEnglandYankee, RANDP, MKoltnow, MhordeXsnipa, Milogardner, Nacrha, Balaam42, Mviergujerghs89fhsdifds, Cfrehr, Elvisfan2095, Tiyoringo, Juliancolton, Cometstyles, DavidCBryant, SlightlyMad, Jamesontai, Remember

the dot, Ilya Voyager, Huzefahamid, Dandy mandy, Andreas2001, Ishap, Sarregouset, CANUTELOOL2, CANUTELOOL3, Devonboy69, Jeyarathan, Death blaze, Emo kid you?, Thedudester, Samlyn.josfyn, Mother69, Vinsfan368, Cartiod, Helldude99, Sternkampf,

Steel1943, CardinalDan, RJASE1, Idioma-bot, Remi0o, Lights, Tamillimat, Bandaidboy, C.lettingaAV, VolkovBot, Somebodyreallycool, Pleasantville, Jeﬀ G., JohnBlackburne, Hhjk, The Catcher in The Rye D:, Alexandria, AlnoktaBOT, Dboerstl, NikolaiLobachevsky,

Bangvang, 62 (number), Tseay11, Soliloquial, Headforaheadeyeforaneye, Barneca, Sześćsetsześćdziesiątsześć, Zeuron, Yoyoyo9, Trehansiddharth, TXiKiBoT, Katoa, Jacob Lundberg, Candy-Panda, Chickenclucker, Antoni Barau, Walor, Anonymous Dissident, Qxz, Nukemason4, Retiono Virginian, Ocolon, Savagepine, DennyColt, Digby Tantrum, JhsBot, Leafyplant, Beanai, 20em89.01, Cremepuﬀ222,

Geometry guy, Canyonsupreme, Natural Philosopher, Teller33, Mathsmad, Unknown 987, Tarten5, Nickmuller, Robomonster, Wolfrock,

Jacob501, Kreemy, Synthebot, Tomaxer, Careercornerstone, Enviroboy, Rurik3, Sardonicone, Evanbrown326, Alliashax, Sylent, Rubentimothy, SMIE SMIE, Gamahucher, Braindamage3, Animalalley12895, Moohahaha, Thanatos666, Dillydumdum, AlleborgoBot, Voicework, Symane, Katzmik, Monkeynuts27, Demmy, Cam275, GoonerDP, SieBot, Mikemoral, James Banogon, BotMultichill, Timgregg96,

Triwbe, 5150pacer, Soler97, Andersmusician, Anubhav29, Keilana, Tiptoety, Arbor to SJ, Undead Herle King, Richardcraig, Paolo.dL,

Boogster, Oxymoron83, Henry Delforn (old), Avnjay, MiNombreDeGuerra, RW Marloe, SH84, Deejaye6, Musse-kloge, Jorgen W,

Kumioko, Correogsk, MadmanBot, Nomoneynotime, Nickm4c, Darkmyst932, Anchor Link Bot, Jacob.jose, Randomblue, Melcombe,

CaptainIron555, Yhkhoo, Dabomb87, Jat99, Pinkadelica, Francvs, Athenean, Ooswesthoesbes, ClueBot, Volcom5347, Gladysamuel,

GPdB, Bwfrank, DFRussia, PipepBot, Foxj, Dobermanji, C1932, Remus John Lupin, Chocoforfriends, Smithpith, ArdClose, IceUnshattered, Plastikspork, Lawrence Cohen, Gawaxay, Nnemo, Ukabia, Michael.Urban, Niceguyedc, Xenon54, Mspraveen, DragonBot,

Isaac25, 4pario, Donkeyboya, Excirial, CBOrgatrope, Bedsandbellies, Soccermaster3112, Alexbot, TonyBallioni, Pjb14, 0na01der, Andy

pyro, Wikibobspider, BrentLeah, Eeekster, Anonymous1324354657687980897867564534231, Mycatiscool, Greenjuice, Chance Jeong,

Arunta007, Greenjuice3.0, Greenjuice4, AnimeFan7, MacedonianBoy, ZuluPapa5, NuclearWarfare, JoelDick, Honeyspots3121, Blondeychck7, Faty148, Jotterbot, RC-0722, Wulfric1, Thingg, Franklin.vp, Aitias, DerBorg, Versus22, Hwalee76, SoxBot III, Apparition11,

Mofeed.sawan, Slayerteez, XLinkBot, Marc van Leeuwen, Moocow444, Joejill67~enwiki, Little Mountain 5, Drumbeatsofeden, SilvonenBot, Planb 89, Alexius08, Vianello, MystBot, Zodon, RyanCross, Aetherealize, Zoltan808, T.M.M. Dowd, Aceleo, Jetsboy101, Willking1979, Mattguzy, 3Nigma, DOI bot, Cdt laurence, Fgnievinski, Yobmod, Aaronthegr8, CanadianLinuxUser, Potatoscrub, Download,

Protonk, Chamal N, CarsracBot, Favonian, LinkFA-Bot, ViskonBot, Barak Sh, Aldermalhir, Jubeidono, PRL42, Lightbot, Ann Logsdon,

Floccinocin123, Matěj Grabovský, Fivexthethird, TeH nOmInAtOr, Jarble, Herve1729, Sitehut, Ptbotgourou, Senator Palpatine, TaBOTzerem, Legobot II, Kan8eDie, Nirvana888, Gugtup, Washburnmav, Mikeedla, THEN WHO WAS PHONE?, Skyeliam, MeatJustice,

Wierdox, AnomieBOT, Nastor, ThaddeusB, Connectonline, Taskualads, Themantheman, Galoubet, Neko85, Noahschultz, JackieBot,

Commander Shepard, Chingchangriceball, Piano non troppo, Supersmashballs123, Agroose, Pm11189, Riekuh, Hamletö, Deverenn,

Frank2710, Chief Heath, Easton12, Codycash33, Archaeopteryx, Citation bot, Merlissimo, ArthurBot, Tatarian, MauritsBot, Xqbot,

TinucherianBot II, Sketchmoose, Timir2, Capricorn42, Johnferrer, Jmundo, Locos epraix, Br77rino, Isheden, Inferno, Lord of Penguins,

Uarrin, LevenBoy, Quixotex, GrouchoBot, Resident Mario, ProtectionTaggingBot, Omnipaedista, Point-set topologist, Gott wisst, RibotBOT, Charvest, KrazyKosbyKidz, MarilynCP, Gingerninja12, Caleb7693, Deathiscomin90919, VictorPorton, Grg222, Daryl7569,

Petes2176, GhalyBot, ThibautLienart, Prozo3190, Family400005, Bupsiij, Aaron Kauppi, Har56, Dr. Klim, Velblod, CES1596, GliderMaven, Thomascjackson, FrescoBot, RTFVerterra, Triwikanto, Tobby72, Mark Renier, Oneﬁve15, VS6507, Alpboyraz, ParaDoxus,

Sławomir Biały, Xefer, Zhentmdfan, Tzurvah MeRabannan, Citation bot 1, Amplitude101, Tkuvho, Rotje66, Kiefer.Wolfowitz, AwesomeHersh, ElNuevoEinstein, Gamewizard71, FoxBot, TobeBot, DixonDBot, Burritoburritoburrito, Lotje, Dinamik-bot, Raiden09, Mrjames99, DJTrickyM, Stephen MUFC, Tbhotch, RjwilmsiBot, TjBot, Ripchip Bot, Galois fu, Alphanumeric Sheep Pig, BertSeghers,

Mr magnolias, DarkLightA, LibertyDodzo, EmausBot, PrisonerOfIce, Nima1024, WikitanvirBot, Surlyduﬀ50, AThornyKoanz, Mehdiirfani, Legajoe, Wham Bam Rock II, Bethnim, ZéroBot, John Cline, Josve05a, Leaﬁest of Futures, Battoe19, Anmol9999, Scythia,

Brandmeister, Vanished user ﬁjtji34toksdcknqrjn54yoimascj, Ain92, Agatecat2700, Herk1955, Teapeat, Mjbmrbot, Liuthar, ClueBot

NG, Incompetence, Wcherowi, Movses-bot, Kindyin, LJosil, SilentResident, Braincricket, Rbellini, Zackaback, MillingMachine, Helpful

Pixie Bot, Thisthat2011, Curb Chain, AnandVivekSatpathi, Nashhinton, EmilyREditor, Ariel C.M.K., Fraqtive42, AvocatoBot, Davidiad,

Ropestring, Edward Gordon Gey, EliteforceMMA, Karthickraj007, VirusKA, MYustin, Brad7777, Idresjafary, Nbrothers, IkamusumeFan, Kavy32, Sklange, Blevintron, BlevintronBot, Sulphuric Glue, Dexbot, Rezonansowy, Mudcap, Augustus Leonhardus Cartesius,

Pankaj Jyoti Mahanta, Ybidzian, TycoonSaad, Jarash, Chern038, FireﬂySixtySeven, Kind Tennis Fan, Justin86789, 12visakhva, Dodi

8238, Rcehy, Vanisheduser00348374562342, 115ash, AdditionSubtraction, Mario Castelán Castro, Arvind asia, Rctillinghast, KasparBot, Kaﬁshabbir and Anonymous: 1222

• Matrix (mathematics) Source: https://en.wikipedia.org/wiki/Matrix_(mathematics)?oldid=667651227 Contributors: AxelBoldt, Tarquin, Tbackstr, Hajhouse, XJaM, Ramin Nakisa, Stevertigo, Patrick, Michael Hardy, Wshun, Cole Kitchen, SGBailey, Chinju, Zeno

Gantner, Dcljr, Ejrh, Looxix~enwiki, Muriel Gottrop~enwiki, Angela, Александър, Poor Yorick, Rmilson, Andres, Schneelocke, Charles

Matthews, Dysprosia, Jitse Niesen, Lou Sander, Dtgm, Bevo, Francs2000, Robbot, Mazin07, Sander123, Chrism, Fredrik, R3m0t, Gandalf61, MathMartin, Sverdrup, Rasmus Faber, Bkell, Paul Murray, Neckro, Tobias Bergemann, Tosha, Giftlite, Jao, Arved, BenFrantzDale, Netoholic, Dissident, Dratman, Michael Devore, Waltpohl, Duncharris, Macrakis, Utcursch, Alexf, MarkSweep, Profvk, Wiml,

Urhixidur, Sam nead, Azuredu, Barnaby dawson, Porges, PhotoBox, Shahab, Rich Farmbrough, FiP, ArnoldReinhold, Pavel Vozenilek,

Paul August, ZeroOne, El C, Rgdboer, JRM, NetBot, The strategy freak, La goutte de pluie, Obradovic Goran, Mdd, Tsirel, LutzL,

Landroni, Jumbuck, Jigen III, Alansohn, ABCD, Fritzpoll, Wanderingstan, Mlm42, Jheald, Simone, RJFJR, Dirac1933, AN(Ger),

Adrian.benko, Oleg Alexandrov, Nessalc, Woohookitty, Igny, LOL, Webdinger, David Haslam, UbiquitousUK, Username314, Tabletop, Waldir, Prashanthns, Mandarax, SixWingedSeraph, Grammarbot, Porcher, Sjakkalle, Koavf, Joti~enwiki, Watcharakorn, SchuminWeb, Old Moonraker, RexNL, Jrtayloriv, Krun, Fresheneesz, Srleﬄer, Vonkje, Masnevets, NevilleDNZ, Chobot, Krishnavedala, Karch,

122

CHAPTER 9. VERTEX (GRAPH THEORY)

DVdm, Bgwhite, YurikBot, Wavelength, Borgx, RussBot, Michael Slone, Bhny, NawlinWiki, Rick Norwood, Jfheche, 48v, Bayle Shanks,

Jimmyre, Misza13, Samuel Huang, Merosonox, DeadEyeArrow, Bota47, Glich, Szhaider, Jezzabr, Leptictidium, Mythobeast, Spondoolicks, Alasdair, Lunch, Sardanaphalus, SmackBot, RDBury, CyclePat, KocjoBot~enwiki, Jagged 85, GoonerW, Minglai, Scott Paeth,

Gilliam, Skizzik, Saros136, Chris the speller, Optikos, Bduke, Silly rabbit, DHN-bot~enwiki, Darth Panda, Foxjwill, Can't sleep, clown

will eat me, Smallbones, KaiserbBot, Rrburke, Mhym, SundarBot, Jon Awbrey, Tesseran, Aghitza, The undertow, Lambiam, Wvbailey, Attys, Nat2, Cronholm144, Terry Bollinger, Nijdam, Aleenf1, Jacobdyer, WhiteHatLurker, Beetstra, Kaarebrandt, Mets501, Neddyseagoon, Dr.K., P199, MTSbot~enwiki, Quaeler, Rschwieb, Levineps, JMK, Tawkerbot2, Dlohcierekim, DKqwerty, AbsolutDan,

Propower, CRGreathouse, JohnCD, INVERTED, SelfStudyBuddy, HalJor, MC10, Pascal.Tesson, Bkgoodman, Alucard (Dr.), Juansempere, Codetiger, Bellayet, הסרפד, Epbr123, Paragon12321, Markus Pössel, Aeriform, Gamer007, Headbomb, Marek69, RobHar, Urdutext, AntiVandalBot, Lself, Jj137, Hermel, Oatmealcookiemon, JAnDbot, Fullverse, MER-C, Yanngeﬀrotin~enwiki, Bennybp, VoABot

II, Fusionmix, T@nn, JNW, Jakob.scholbach, Rivertorch, EagleFan, JJ Harrison, Sullivan.t.j, David Eppstein, User A1, ANONYMOUS

COWARD0xC0DE, JoergenB, Philg88, Nevit, Hbent, Gjd001, Doccolinni, Yodalee327, R'n'B, Alfred Legrand, J.delanoy, Rlsheehan, Maurice Carbonaro, Richard777, Wayp123, Toghrul Talibzadeh, Aqwis, It Is Me Here, Cole the ninja, TomyDuby, Peskydan,

AntiSpamBot, JonMcLoone, Policron, Doug4, Fylwind, Kevinecahill, Ben R. Thomas, CardinalDan, OktayD, Egghead06, X!, Malik

Shabazz, UnicornTapestry, Shiggity, VolkovBot, Dark123, JohnBlackburne, LokiClock, VasilievVV, DoorsAjar, TXiKiBoT, Hlevkin,

Rei-bot, Anonymous Dissident, D23042304, PaulTanenbaum, LeaveSleaves, BigDunc, Wolfrock, Wdrev, Brianga, Dmcq, KjellG, AlleborgoBot, Symane, Anoko moonlight, W4chris, Typoﬁer, Neparis, T-9000, D. Recorder, ChrisMiddleton, GirasoleDE, Dogah, SieBot,

Ivan Štambuk, Bachcell, Gerakibot, Cwkmail, Yintan, Radon210, Elcobbola, Paolo.dL, Oxymoron83, Ddxc, Oculi, Manway, AlanUS,

Anchor Link Bot, Rinconsoleao, Denisarona, Canglesea, Myrvin, DEMcAdams, ClueBot, Sural, Wpoely86, Remag Kee, SuperHamster, LizardJr8, Masterpiece2000, Excirial, Da rulz07, Bender2k14, Ftbhrygvn, Muhandes, Brews ohare, Tyler, Livius3, Jotterbot, Hans

Adler, Manco Capac, MiraiWarren, Qwfp, Johnuniq, TimothyRias, Lakeworks, XLinkBot, Marc van Leeuwen, Rror, AndreNatas, Jaan

Vajakas, Porphyro, Stephen Poppitt, Addbot, Proofreader77, Deepmath, RPHv, Steve.jaramillov~enwiki, WardenWalk, Jccwiki, CactusWriter, Mohamed Magdy, MrOllie, Tide rolls, Gail, Jarble, CountryBot, LuK3, Luckas-bot, Yobot, Senator Palpatine, QueenCake,

TestEditBot, AnomieBOT, Autarkaw, Gazzawi, IDangerMouse, MattTait, Kingpin13, Materialscientist, Citation bot, Wrelwser43, LilHelpa, FactSpewer, Xqbot, Capricorn42, Drilnoth, HHahn, El Caro, BrainFRZ, J04n, Nickmn, RibotBOT, Cerniagigante, Smallman12q,

WaysToEscape, Much noise, LucienBOT, Tobby72, VS6507, Recognizance, Sławomir Biały, Izzedine, IT2000, HJ Mitchell, Sae1962,

Jamesooders, Cafreen, Citation bot 1, Swordsmankirby, I dream of horses, Kiefer.Wolfowitz, MarcelB612, NoFlyingCars, RedBot,

RobinK, Kallikanzarid, Jordgette, ItsZippy, Vairoj, SeoMac, MathInclined, The last username left was taken, Birat lamichhane, Katovatzschyn, Soupjvc, Sfbaldbear, Salvio giuliano, Mandolinface, EmausBot, Lkh2099, Nurath224, DesmondSteppe, RIS cody, Slawekb,

Quondum, Chocochipmuﬃn, U+003F, Rcorcs, තඹරු විජේසේකර, Maschen, Babababoshka, Adjointh, Donner60, Puﬃn, JFB80,

Anita5192, Petrb, ClueBot NG, Wcherowi, Michael P. Barnett, Rtucker913, Satellizer, Rank Penguin, Tyrantbrian, Dsperlich, Helpful

Pixie Bot, Rxnt, Christian Matt, MarcoPotok, BG19bot, Wiki13, Muscularmussel, MusikAnimal, Brad7777, René Vápeník, Soﬁa karampataki, BattyBot, Freesodas, IkamusumeFan, Lucaspentzlp, OwenGage, APerson, Dexbot, Mark L MacDonald, Numbermaniac, Frosty,

JustAMuggle, Reatlas, Acetotyce, Debouch, Wamiq, Ugog Nizdast, Zenibus, SwimmerOfAwesome, Jianhui67, OrthogonalFrog, Airwoz, Derpghvdyj, Mezafo, CarnivorousBunny, Xxhihi, Sordin, Username89911998, Gronk Oz, Hidrolandense, Kellywacko, JArnold99,

Kavya l and Anonymous: 624

• Vertex (graph theory) Source: https://en.wikipedia.org/wiki/Vertex_(graph_theory)?oldid=628902495 Contributors: XJaM, Altenmann,

MathMartin, Giftlite, Dbenbenn, Purestorm, Rich Farmbrough, Cburnett, RussBot, Anomalocaris, InverseHypercube, BiT, Chetvorno,

Ylloh, Univer, Escarbot, David Eppstein, JaGa, Mange01, Hans Dunkelberg, Geekdiva, VolkovBot, TXiKiBoT, Synthebot, Anoko moonlight, SieBot, Hxhbot, Kl4m, JP.Martin-Flatin, Niemeyerstein en, Albambot, DOI bot, Zorrobot, Luckas-bot, TaBOT-zerem, KamikazeBot, Ciphers, Citation bot, Twri, Xqbot, Miym, Amaury, Phil1881, Trappist the monk, WillNess, DARTH SIDIOUS 2, TjBot, Orphan

Wiki, ZéroBot, ClueBot NG, Gchrupala, Maxwell bernard, SamX and Anonymous: 16

9.5.2

Images

• File:1u04-argonaute.png Source: https://upload.wikimedia.org/wikipedia/commons/0/02/1u04-argonaute.png License: CC-BY-SA3.0 Contributors: Self created from PDB entry 1U04 using the freely available visualization and analysis package VMD raytraced with

POV-Ray 3.6 Original artist: Opabinia regalis

• File:3-Tastenmaus_Microsoft.jpg Source: https://upload.wikimedia.org/wikipedia/commons/a/aa/3-Tastenmaus_Microsoft.jpg License: CC BY-SA 2.5 Contributors: Own work Original artist: Darkone

• File:4-critical_graph.png Source: https://upload.wikimedia.org/wikipedia/commons/7/73/4-critical_graph.png License: CC0 Contributors: Own work Original artist: Jmerm

• File:6n-graf.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/5b/6n-graf.svg License: Public domain Contributors: Image:

6n-graf.png simlar input data Original artist: User:AzaToth

• File:6n-graph2.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/28/6n-graph2.svg License: Public domain Contributors: ? Original artist: ?

• File:Abacus_6.png Source: https://upload.wikimedia.org/wikipedia/commons/a/af/Abacus_6.png License: Public domain Contributors:

• Article for “abacus”, 9th edition Encyclopedia Britannica, volume 1 (1875); scanned and uploaded by Malcolm Farmer Original artist:

Encyclopædia Britannica

• File:Ada_lovelace.jpg Source: https://upload.wikimedia.org/wikipedia/commons/0/0f/Ada_lovelace.jpg License: Public domain Contributors: www.fathom.com Original artist: Alfred Edward Chalon

• File:Arbitrary-gametree-solved.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/d7/Arbitrary-gametree-solved.svg License: Public domain Contributors:

• Arbitrary-gametree-solved.png Original artist:

• derivative work: Qef (talk)

• File:Area_parallellogram_as_determinant.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/ad/Area_parallellogram_

as_determinant.svg License: Public domain Contributors: Own work, created with Inkscape Original artist: Jitse Niesen

9.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES

123

• File:Babbage40.png Source: https://upload.wikimedia.org/wikipedia/commons/6/67/Babbage40.png License: Public domain Contributors: The Mechanic’s Magazine, Museum, Register, Journal and Gazette, October 6, 1832-March 31, 1833. Vol. XVIII. Original artist:

AGoon, derivative work, original was 'Engraved by Roﬀe, by permifsion from an original Family Painting' 1833

• File:BernoullisLawDerivationDiagram.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/20/BernoullisLawDerivationDiagram.

svg License: CC-BY-SA-3.0 Contributors: Image:BernoullisLawDerivationDiagram.png Original artist: MannyMax (original)

• File:Blochsphere.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/f3/Blochsphere.svg License: CC-BY-SA-3.0 Contributors: Transferred from en.wikipedia to Commons. Original artist: MuncherOfSpleens at English Wikipedia

• File:Braid-modular-group-cover.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/da/Braid-modular-group-cover.svg

License: Public domain Contributors: Own work, created as per: en:meta:Help:Displaying a formula#Commutative diagrams; source code

below. Original artist: Nils R. Barth

• File:CH4-structure.svg Source: https://upload.wikimedia.org/wikipedia/commons/0/0f/CH4-structure.svg License: ? Contributors:

File:Ch4-structure.png Original artist: Own work

• File:Caesar3.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/2b/Caesar3.svg License: Public domain Contributors: Own

work Original artist: Cepheus

• File:Carl_Friedrich_Gauss.jpg Source: https://upload.wikimedia.org/wikipedia/commons/9/9b/Carl_Friedrich_Gauss.jpg License: Public domain Contributors: Gauß-Gesellschaft Göttingen e.V. (Foto: A. Wittmann). Original artist: Gottlieb Biermann

A. Wittmann (photo)

• File:Commons-logo.svg Source: https://upload.wikimedia.org/wikipedia/en/4/4a/Commons-logo.svg License: ? Contributors: ? Original artist: ?

• File:Commutative_diagram_for_morphism.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/ef/Commutative_diagram_

for_morphism.svg License: Public domain Contributors: Own work, based on en:Image:MorphismComposition-01.png Original artist:

User:Cepheus

• File:Compiler.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/6b/Compiler.svg License: CC-BY-SA-3.0 Contributors:

self-made SVG version of Image:Ideal compiler.png by User:Raul654. Incorporates Image:Computer n screen.svg and Image:Nuvola

mimetypes source.png. Original artist: Surachit

• File:Complete_graph_K5.svg Source: https://upload.wikimedia.org/wikipedia/commons/c/cf/Complete_graph_K5.svg License: Public domain Contributors: Own work Original artist: David Benbennick wrote this ﬁle.

• File:Composite_trapezoidal_rule_illustration_small.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/dd/Composite_

trapezoidal_rule_illustration_small.svg License: Attribution Contributors:

• Composite_trapezoidal_rule_illustration_small.png Original artist:

• derivative work: Pbroks13 (talk)

• File:Conformal_grid_after_Möbius_transformation.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/3f/Conformal_

grid_after_M%C3%B6bius_transformation.svg License: CC BY-SA 2.5 Contributors: By Lokal_Proﬁl Original artist: Lokal_Proﬁl

• File:Corner.png Source: https://upload.wikimedia.org/wikipedia/commons/5/5f/Corner.png License: Public domain Contributors: http:

//en.wikipedia.org/wiki/File:Corner.png Original artist: Retardo

• File:DFAexample.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/9d/DFAexample.svg License: Public domain Contributors: Own work Original artist: Cepheus

• File:Determinant_example.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a7/Determinant_example.svg License: CC

BY-SA 3.0 Contributors: Own work Original artist: Krishnavedala

• File:Directed.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a2/Directed.svg License: Public domain Contributors: ?

Original artist: ?

• File:Directed_cycle.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/dc/Directed_cycle.svg License: Public domain Contributors: en:Image:Directed cycle.png Original artist: en:User:Dcoetzee, User:Stannered

• File:Earth.png Source: https://upload.wikimedia.org/wikipedia/commons/1/1e/Earth.png License: Public domain Contributors: ? Original artist: ?

• File:Ellipse_in_coordinate_system_with_semi-axes_labelled.svg Source: https://upload.wikimedia.org/wikipedia/commons/8/8e/Ellipse_

in_coordinate_system_with_semi-axes_labelled.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Jakob.scholbach

• File:Elliptic_curve_simple.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/da/Elliptic_curve_simple.svg License: CCBY-SA-3.0 Contributors:

• Elliptic_curve_simple.png Original artist:

• derivative work: Pbroks13 (talk)

• File:Emp_Tables_(Database).PNG Source: https://upload.wikimedia.org/wikipedia/commons/8/87/Emp_Tables_%28Database%29.

PNG License: Public domain Contributors: Own work Original artist: Jamesssss

• File:English.png Source: https://upload.wikimedia.org/wikipedia/commons/0/0a/English.png License: Public domain Contributors: ?

Original artist: ?

• File:Enigma.jpg Source: https://upload.wikimedia.org/wikipedia/commons/a/ae/Enigma.jpg License: Public domain Contributors: User:

Jszigetvari Original artist: ?

• File:Euclid.jpg Source: https://upload.wikimedia.org/wikipedia/commons/2/21/Euclid.jpg License: Public domain Contributors: ? Original artist: ?

• File:Fibonacci.jpg Source: https://upload.wikimedia.org/wikipedia/commons/a/a2/Fibonacci.jpg License: Public domain Contributors:

Scan from “Mathematical Circus” by Martin Gardner, published 1981 Original artist: unknown medieval artist

124

CHAPTER 9. VERTEX (GRAPH THEORY)

• File:Fivestagespipeline.png Source: https://upload.wikimedia.org/wikipedia/commons/2/21/Fivestagespipeline.png License: CC-BYSA-3.0 Contributors: ? Original artist: ?

• File:Flip_map.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/3f/Flip_map.svg License: CC BY-SA 3.0 Contributors:

derived from File:Rotation_by_pi_over_6.svg Original artist: Jakob.scholbach

• File:Flowchart.png Source: https://upload.wikimedia.org/wikipedia/commons/9/9d/Flowchart.png License: CC SA 1.0 Contributors: ?

Original artist: ?

• File:Folder_Hexagonal_Icon.svg Source: https://upload.wikimedia.org/wikipedia/en/4/48/Folder_Hexagonal_Icon.svg License: Ccby-sa-3.0 Contributors: ? Original artist: ?

• File:Four_Colour_Map_Example.svg Source: https://upload.wikimedia.org/wikipedia/commons/8/8a/Four_Colour_Map_Example.

svg License: CC-BY-SA-3.0 Contributors: Based on a this raster image by chas zzz brown on en.wikipedia. Original artist: Inductiveload

• File:GDP_PPP_Per_Capita_IMF_2008.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/d4/GDP_PPP_Per_Capita_

IMF_2008.svg License: CC BY 3.0 Contributors: Sbw01f’s work, but converted to an SVG ﬁle instead. Data from International Monetary

Fund World Economic Outlook Database April 2009 Original artist: Powerkeys

• File:GodfreyKneller-IsaacNewton-1689.jpg Source: https://upload.wikimedia.org/wikipedia/commons/3/39/GodfreyKneller-IsaacNewton-1689.

jpg License: Public domain Contributors: http://www.newton.cam.ac.uk/art/portrait.html Original artist: Sir Godfrey Kneller

• File:Gottfried_Wilhelm_von_Leibniz.jpg Source: https://upload.wikimedia.org/wikipedia/commons/6/6a/Gottfried_Wilhelm_von_

Leibniz.jpg License: Public domain Contributors: /gbrown/philosophers/leibniz/BritannicaPages/Leibniz/LeibnizGif.html Original artist:

Christoph Bernhard Francke

• File:Gravitation_space_source.png Source: https://upload.wikimedia.org/wikipedia/commons/2/26/Gravitation_space_source.png License: CC-BY-SA-3.0 Contributors: ? Original artist: ?

• File:Group_diagdram_D6.svg Source: https://upload.wikimedia.org/wikipedia/commons/0/0e/Group_diagdram_D6.svg License: Public domain Contributors: Own work Original artist: User:Cepheus

• File:HONDA_ASIMO.jpg Source: https://upload.wikimedia.org/wikipedia/commons/0/05/HONDA_ASIMO.jpg License: CC-BYSA-3.0 Contributors: ? Original artist: ?

• File:Human_eye,_rendered_from_Eye.png Source: https://upload.wikimedia.org/wikipedia/commons/5/51/Human_eye%2C_rendered_

from_Eye.png License: CC-BY-SA-3.0 Contributors: Own work by the original uploader This ﬁle was derived from: Eye.svg

Original artist: Kenny sh at English Wikipedia

• File:Hyperbola2_SVG.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/d9/Hyperbola2_SVG.svg License: CC BY-SA

3.0 Contributors: Own work Original artist: IkamusumeFan

• File:Hyperbolic_triangle.svg Source: https://upload.wikimedia.org/wikipedia/commons/8/89/Hyperbolic_triangle.svg License: Public

domain Contributors: ? Original artist: ?

• File:Ideal_compiler.png Source: https://upload.wikimedia.org/wikipedia/commons/2/20/Ideal_compiler.png License: CC-BY-SA-3.0

Contributors: ? Original artist: ?

• File:Illustration_to_Euclid’{}s_proof_of_the_Pythagorean_theorem.svg Source: https://upload.wikimedia.org/wikipedia/commons/

2/26/Illustration_to_Euclid%27s_proof_of_the_Pythagorean_theorem.svg License: WTFPL Contributors: ? Original artist: ?

• File:Integral_as_region_under_curve.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/f2/Integral_as_region_under_

curve.svg License: CC-BY-SA-3.0 Contributors: Own work, based on JPG version Original artist: 4C

• File:Internet_map_1024.jpg Source: https://upload.wikimedia.org/wikipedia/commons/d/d2/Internet_map_1024.jpg License: CC BY

2.5 Contributors: Originally from the English Wikipedia; description page is/was here. Original artist: The Opte Project

• File:Jordan_blocks.svg Source: https://upload.wikimedia.org/wikipedia/commons/4/4f/Jordan_blocks.svg License: CC BY-SA 3.0

Contributors: Own work Original artist: Jakob.scholbach

• File:Julia_iteration_data.png Source: https://upload.wikimedia.org/wikipedia/commons/4/47/Julia_iteration_data.png License: GFDL

Contributors: Own work Original artist: Adam majewski

• File:Kapitolinischer_Pythagoras_adjusted.jpg Source: https://upload.wikimedia.org/wikipedia/commons/1/1a/Kapitolinischer_Pythagoras_

adjusted.jpg License: CC-BY-SA-3.0 Contributors: First upload to Wikipedia: de.wikipedia; description page is/was here.

Original artist: The original uploader was Galilea at German Wikipedia

• File:KnnClassification.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/e7/KnnClassification.svg License: CC-BYSA-3.0 Contributors: Own work Original artist: Antti Ajanki AnAj

• File:Konigsberg_bridges.png Source: https://upload.wikimedia.org/wikipedia/commons/5/5d/Konigsberg_bridges.png License: CCBY-SA-3.0 Contributors: Public domain (PD), based on the image

• <a href='//commons.wikimedia.org/wiki/File:Image-Koenigsberg,_Map_by_Merian-Erben_1652.jpg' class='image'><img alt='ImageKoenigsberg, Map by Merian-Erben 1652.jpg' src='//upload.wikimedia.org/wikipedia/commons/thumb/1/15/Image-Koenigsberg%

2C_Map_by_Merian-Erben_1652.jpg/120px-Image-Koenigsberg%2C_Map_by_Merian-Erben_1652.jpg' width='120' height='84'

srcset='//upload.wikimedia.org/wikipedia/commons/thumb/1/15/Image-Koenigsberg%2C_Map_by_Merian-Erben_1652.jpg/180px-Image-Koenigsbe

2C_Map_by_Merian-Erben_1652.jpg 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/1/15/Image-Koenigsberg%2C_

Map_by_Merian-Erben_1652.jpg/240px-Image-Koenigsberg%2C_Map_by_Merian-Erben_1652.jpg 2x' data-ﬁle-width='628' dataﬁle-height='437' /></a>

Original artist: Bogdan Giuşcă

• File:Labelled_undirected_graph.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a5/Labelled_undirected_graph.svg

License: CC BY-SA 3.0 Contributors: derived from http://en.wikipedia.org/wiki/File:6n-graph2.svg Original artist: Jakob.scholbach

• File:Lambda_lc.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/39/Lambda_lc.svg License: Public domain Contributors: The Greek alphabet Original artist: User:Luks

9.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES

125

• File:Lattice_of_the_divisibility_of_60.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/51/Lattice_of_the_divisibility_

of_60.svg License: CC-BY-SA-3.0 Contributors: ? Original artist: ?

• File:Leonhard_Euler_2.jpg Source: https://upload.wikimedia.org/wikipedia/commons/6/60/Leonhard_Euler_2.jpg License: Public

domain Contributors:

• 2011-12-22 (upload, according to EXIF data)

Original artist: Jakob Emanuel Handmann

• File:Limitcycle.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/91/Limitcycle.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Gargan

• File:Lorenz_attractor.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/f4/Lorenz_attractor.svg License: CC BY 2.5

Contributors: ? Original artist: ?

• File:Lorenz_attractor_yb.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/5b/Lorenz_attractor_yb.svg License: CCBY-SA-3.0 Contributors: ? Original artist: ?

• File:Mandel_zoom_07_satellite.jpg Source: https://upload.wikimedia.org/wikipedia/commons/b/b3/Mandel_zoom_07_satellite.jpg License: CC-BY-SA-3.0 Contributors: ? Original artist: ?

• File:Market_Data_Index_NYA_on_20050726_202628_UTC.png Source: https://upload.wikimedia.org/wikipedia/commons/4/46/Market_

Data_Index_NYA_on_20050726_202628_UTC.png License: Public domain Contributors: ? Original artist: ?

• File:Markov_chain_SVG.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/29/Markov_chain_SVG.svg License: CC

BY-SA 3.0 Contributors: This graphic was created with matplotlib. Original artist: IkamusumeFan

• File:Matrix.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/bb/Matrix.svg License: GFDL Contributors: Own work

Original artist: Lakeworks

• File:Matrix_multiplication_diagram_2.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/eb/Matrix_multiplication_diagram_

2.svg License: CC-BY-SA-3.0 Contributors: This ﬁle was derived from: Matrix multiplication diagram.svg

Original artist: File:Matrix multiplication diagram.svg:User:Bilou

• File:Maximum_boxed.png Source: https://upload.wikimedia.org/wikipedia/commons/1/1a/Maximum_boxed.png License: Public domain Contributors: Created with the help of GraphCalc Original artist: Freiddy

• File:Maya.svg Source: https://upload.wikimedia.org/wikipedia/commons/1/1b/Maya.svg License: CC-BY-SA-3.0 Contributors: Image:

Maya.png Original artist: Bryan Derksen

• File:Measure_illustration.png Source: https://upload.wikimedia.org/wikipedia/commons/a/a6/Measure_illustration.png License: Public domain Contributors: self-made with en:Inkscape Original artist: Oleg Alexandrov

• File:MeningiomaMRISegmentation.png Source: https://upload.wikimedia.org/wikipedia/commons/e/e4/MeningiomaMRISegmentation.

png License: CC BY-SA 3.0 Contributors:

• Own work by the original uploader

• I used the open source package 3D slicer to segment the meningiom in a data set

Original artist: Rkikinis

• File:Multi-pseudograph.svg Source: https://upload.wikimedia.org/wikipedia/commons/c/c9/Multi-pseudograph.svg License: CC BYSA 3.0 Contributors: Own work Original artist: 0x24a537r9

• File:NOR_ANSI.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/6c/NOR_ANSI.svg License: Public domain Contributors: Own Drawing, made in Inkscape 0.43 Original artist: jjbeard

• File:Naphthalene-3D-balls.png Source: https://upload.wikimedia.org/wikipedia/commons/3/3e/Naphthalene-3D-balls.png License: Public domain Contributors: ? Original artist: ?

• File:Navier_Stokes_Laminar.svg Source: https://upload.wikimedia.org/wikipedia/commons/7/73/Navier_Stokes_Laminar.svg License:

CC BY-SA 4.0 Contributors: Own work Original artist: IkamusumeFan

• File:Network_Library_LAN.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/b9/Network_Library_LAN.svg License:

CC BY-SA 4.0 Contributors: <a href='//commons.wikimedia.org/wiki/File:NETWORK-Library-LAN.png' class='image'><img alt='NETWORKLibrary-LAN.png' src='//upload.wikimedia.org/wikipedia/commons/thumb/2/27/NETWORK-Library-LAN.png/100px-NETWORK-Library-LAN.

png' width='100' height='74' srcset='//upload.wikimedia.org/wikipedia/commons/thumb/2/27/NETWORK-Library-LAN.png/150px-NETWORK-Library-LA

png 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/2/27/NETWORK-Library-LAN.png/200px-NETWORK-Library-LAN.

png 2x' data-ﬁle-width='1465' data-ﬁle-height='1090' /></a> Original artist: Fred the Oyster

• File:Neuron.png Source: https://upload.wikimedia.org/wikipedia/commons/6/67/Neuron.png License: Public domain Contributors: Originally Neuron.svg -> originally Neuron.jpg taken from the US Federal (public domain) Original artist: Caiguanhao

• File:Neuron.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/b5/Neuron.svg License: CC-BY-SA-3.0 Contributors: ?

Original artist: ?

• File:Nicolas_P._Rougier’{}s_rendering_of_the_human_brain.png Source: https://upload.wikimedia.org/wikipedia/commons/7/73/

Nicolas_P._Rougier%27s_rendering_of_the_human_brain.png License: GPL Contributors: http://www.loria.fr/~{}rougier Original artist:

Nicolas Rougier

• File:Nuvola_apps_atlantik.png Source: https://upload.wikimedia.org/wikipedia/commons/7/77/Nuvola_apps_atlantik.png License: LGPL

Contributors: http://icon-king.com Original artist: David Vignoni / ICON KING

• File:Nuvola_apps_edu_mathematics_blue-p.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/3e/Nuvola_apps_edu_

mathematics_blue-p.svg License: GPL Contributors: Derivative work from Image:Nuvola apps edu mathematics.png and Image:Nuvola

apps edu mathematics-p.svg Original artist: David Vignoni (original icon); Flamurai (SVG convertion); bayo (color)

126

CHAPTER 9. VERTEX (GRAPH THEORY)

• File:Nuvola_apps_kaboodle.svg Source: https://upload.wikimedia.org/wikipedia/commons/1/1b/Nuvola_apps_kaboodle.svg License:

LGPL Contributors: http://ftp.gnome.org/pub/GNOME/sources/gnome-themes-extras/0.9/gnome-themes-extras-0.9.0.tar.gz Original artist:

David Vignoni / ICON KING

• File:Oldfaithful3.png Source: https://upload.wikimedia.org/wikipedia/commons/0/0f/Oldfaithful3.png License: Public domain Contributors: ? Original artist: ?

• File:Open_book_nae_02.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/92/Open_book_nae_02.svg License: CC0

Contributors: OpenClipart Original artist: nae

• File:Operating_system_placement.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/e1/Operating_system_placement.

svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Golftheman

• File:Padlock.svg Source: https://upload.wikimedia.org/wikipedia/en/5/59/Padlock.svg License: PD Contributors: ? Original artist: ?

• File:People_icon.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/37/People_icon.svg License: CC0 Contributors: OpenClipart Original artist: OpenClipart

• File:Pert_chart_colored.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/37/Pert_chart_colored.svg License: Public

domain Contributors: This ﬁle was derived from: Pert chart colored.gif: <a href='//commons.wikimedia.org/wiki/File:Pert_chart_colored.

gif' class='image'><img alt='Pert chart colored.gif' src='//upload.wikimedia.org/wikipedia/commons/thumb/b/b9/Pert_chart_colored.

gif/50px-Pert_chart_colored.gif' width='50' height='31' srcset='//upload.wikimedia.org/wikipedia/commons/thumb/b/b9/Pert_chart_colored.

gif/75px-Pert_chart_colored.gif 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/b/b9/Pert_chart_colored.gif/100px-Pert_chart_

colored.gif 2x' data-ﬁle-width='309' data-ﬁle-height='190' /></a>

Original artist: Pert_chart_colored.gif: Original uploader was Jeremykemp at en.wikipedia

• File:Portal-puzzle.svg Source: https://upload.wikimedia.org/wikipedia/en/f/fd/Portal-puzzle.svg License: Public domain Contributors:

? Original artist: ?

• File:Python_add5_syntax.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/e1/Python_add5_syntax.svg License: Copyrighted free use Contributors: http://en.wikipedia.org/wiki/Image:Python_add5_syntax.png Original artist: Xander89

• File:Quark_wiki.jpg Source: https://upload.wikimedia.org/wikipedia/commons/c/cb/Quark_wiki.jpg License: CC BY-SA 3.0 Contributors: Own work Original artist: Brianzero

• File:Question_book-new.svg Source: https://upload.wikimedia.org/wikipedia/en/9/99/Question_book-new.svg License: Cc-by-sa-3.0

Contributors:

Created from scratch in Adobe Illustrator. Based on Image:Question book.png created by User:Equazcion Original artist:

Tkgd2007

• File:Roomba_original.jpg Source: https://upload.wikimedia.org/wikipedia/commons/f/f5/Roomba_original.jpg License: CC BY-SA

3.0 Contributors: © 2006 Larry D. Moore Original artist: Larry D. Moore

• File:Rotation_by_pi_over_6.svg Source: https://upload.wikimedia.org/wikipedia/commons/8/8e/Rotation_by_pi_over_6.svg License:

Public domain Contributors: Own work using Inkscape Original artist: RobHar

• File:Rubik’{}s_cube.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a6/Rubik%27s_cube.svg License: CC-BY-SA3.0 Contributors: Based on Image:Rubiks cube.jpg Original artist: This image was created by me, Booyabazooka

• File:SIMD.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/21/SIMD.svg License: CC-BY-SA-3.0 Contributors: Own

work in Inkscape Original artist: en:User:Cburnett

• File:Saddle_Point_SVG.svg Source: https://upload.wikimedia.org/wikipedia/commons/0/0d/Saddle_Point_SVG.svg License: CC BYSA 3.0 Contributors: This graphic was created with matplotlib. Original artist: IkamusumeFan

• File:Scaling_by_1.5.svg Source: https://upload.wikimedia.org/wikipedia/commons/c/c7/Scaling_by_1.5.svg License: Public domain

Contributors: Own work using Inkscape Original artist: RobHar

• File:Signal_transduction_pathways.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/b0/Signal_transduction_pathways.

svg License: CC BY-SA 3.0 Contributors: http://en.wikipedia.org/wiki/File:Signal_transduction_v1.png Original artist: cybertory

• File:Simple_feedback_control_loop2.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/90/Simple_feedback_control_

loop2.svg License: CC BY-SA 3.0 Contributors: This ﬁle was derived from: Simple feedback control loop2.png: <a href='//commons.

wikimedia.org/wiki/File:Simple_feedback_control_loop2.png' class='image'><img alt='Simple feedback control loop2.png' src='//upload.

wikimedia.org/wikipedia/commons/thumb/4/45/Simple_feedback_control_loop2.png/50px-Simple_feedback_control_loop2.png' width='50'

height='17' srcset='//upload.wikimedia.org/wikipedia/commons/thumb/4/45/Simple_feedback_control_loop2.png/75px-Simple_feedback_

control_loop2.png 1.5x, //upload.wikimedia.org/wikipedia/commons/thumb/4/45/Simple_feedback_control_loop2.png/100px-Simple_

feedback_control_loop2.png 2x' data-ﬁle-width='439' data-ﬁle-height='150' /></a>

Original artist: Simple_feedback_control_loop2.png: Corona

• File:SimplexRangeSearching.png Source: https://upload.wikimedia.org/wikipedia/commons/4/48/SimplexRangeSearching.png License:

Public domain Contributors: Transferred from en.wikipedia Original artist: Original uploader was Gfonsecabr at en.wikipedia. Later version(s) were uploaded by McLoaf at en.wikipedia.

• File:Singly_linked_list.png Source: https://upload.wikimedia.org/wikipedia/commons/3/37/Singly_linked_list.png License: Public domain Contributors: Copied from en. Originally uploaded by Dcoetzee. Original artist: Derrick Coetzee (User:Dcoetzee)

• File:Sinusvåg_400px.png Source: https://upload.wikimedia.org/wikipedia/commons/8/8c/Sinusv%C3%A5g_400px.png License: Public domain Contributors: ? Original artist: User Solkoll on sv.wikipedia

• File:Sky.png Source: https://upload.wikimedia.org/wikipedia/commons/0/08/Sky.png License: CC BY-SA 2.5 Contributors: selbst gemacht.

own work. Original artist: Manuel Strehl

• File:Sorting_quicksort_anim.gif Source: https://upload.wikimedia.org/wikipedia/commons/6/6a/Sorting_quicksort_anim.gif License:

CC-BY-SA-3.0 Contributors: originally upload on the English Wikipedia Original artist: Wikipedia:en:User:RolandH

• File:Sorting_quicksort_anim_frame.png Source: https://upload.wikimedia.org/wikipedia/commons/1/1e/Sorting_quicksort_anim_frame.

png License: CC-BY-SA-3.0 Contributors: Image:Sorting quicksort anim.gif Original artist: en:User:RolandH

9.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES

127

• File:Squeeze_r=1.5.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/67/Squeeze_r%3D1.5.svg License: Public domain

Contributors: Own work Original artist: RobHar

• File:Symbol_book_class2.svg Source: https://upload.wikimedia.org/wikipedia/commons/8/89/Symbol_book_class2.svg License: CC

BY-SA 2.5 Contributors: Mad by Lokal_Proﬁl by combining: Original artist: Lokal_Proﬁl

• File:TSP_Deutschland_3.png Source: https://upload.wikimedia.org/wikipedia/commons/c/c4/TSP_Deutschland_3.png License: Public domain Contributors: https://www.cia.gov/cia/publications/factbook/maps/gm-map.gif Original artist: The original uploader was

Kapitän Nemo at German Wikipedia

• File:Text_document_with_red_question_mark.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a4/Text_document_

with_red_question_mark.svg License: Public domain Contributors: Created by bdesham with Inkscape; based upon Text-x-generic.svg

from the Tango project. Original artist: Benjamin D. Esham (bdesham)

• File:Torus.png Source: https://upload.wikimedia.org/wikipedia/commons/1/17/Torus.png License: Public domain Contributors: ? Original artist: ?

• File:Tree_graph.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/24/Tree_graph.svg License: Public domain Contributors: ? Original artist: ?

• File:TruncatedTetrahedron.gif Source: https://upload.wikimedia.org/wikipedia/commons/0/0f/TruncatedTetrahedron.gif License: Public domain Contributors: Own work Original artist: Radagast3

• File:Two_red_dice_01.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/36/Two_red_dice_01.svg License: CC0 Contributors: Open Clip Art Library Original artist: Stephen Silver

• File:Ulam_1.png Source: https://upload.wikimedia.org/wikipedia/commons/6/69/Ulam_1.png License: CC-BY-SA-3.0 Contributors:

Transferred from en.wikipedia to Commons. Original artist: Grontesca at English Wikipedia

• File:Undirected.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/bf/Undirected.svg License: Public domain Contributors: ? Original artist: ?

• File:User-FastFission-brain.gif Source: https://upload.wikimedia.org/wikipedia/commons/c/c7/User-FastFission-brain.gif License: CCBY-SA-3.0 Contributors: ? Original artist: ?

• File:Utah_teapot_simple_2.png Source: https://upload.wikimedia.org/wikipedia/commons/5/5f/Utah_teapot_simple_2.png License:

CC BY-SA 3.0 Contributors: Own work Original artist: Dhatﬁeld

• File:Vector_field.svg Source: https://upload.wikimedia.org/wikipedia/commons/c/c2/Vector_field.svg License: Public domain Contributors: Own work Original artist: Fibonacci.

• File:Venn_A_intersect_B.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/6d/Venn_A_intersect_B.svg License: Public domain Contributors: Own work Original artist: Cepheus

• File:VerticalShear_m=1.25.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/92/VerticalShear_m%3D1.25.svg License:

Public domain Contributors: Own work using Inkscape Original artist: RobHar

• File:Wacom_graphics_tablet_and_pen.png Source: https://upload.wikimedia.org/wikipedia/commons/d/d4/Wacom_graphics_tablet_

and_pen.png License: CC BY-SA 3.0 Contributors:

• Wacom_Pen-tablet_without_mouse.jpg Original artist: Wacom_Pen-tablet_without_mouse.jpg: *Wacom_Pen-tablet.jpg: photographed

by Tobias Rütten,Metoc

• File:Wang_tiles.png Source: https://upload.wikimedia.org/wikipedia/commons/0/06/Wang_tiles.png License: Public domain Contributors: ? Original artist: ?

• File:Wikibooks-logo-en-noslogan.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/df/Wikibooks-logo-en-noslogan.

svg License: CC BY-SA 3.0 Contributors: Own work Original artist: User:Bastique, User:Ramac et al.

• File:Wikibooks-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/fa/Wikibooks-logo.svg License: CC BY-SA 3.0

Contributors: Own work Original artist: User:Bastique, User:Ramac et al.

• File:Wikinews-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/24/Wikinews-logo.svg License: CC BY-SA 3.0

Contributors: This is a cropped version of Image:Wikinews-logo-en.png. Original artist: Vectorized by Simon 01:05, 2 August 2006

(UTC) Updated by Time3000 17 April 2007 to use oﬃcial Wikinews colours and appear correctly on dark backgrounds. Originally

uploaded by Simon.

• File:WikipediaBinary.svg Source: https://upload.wikimedia.org/wikipedia/commons/b/bb/WikipediaBinary.svg License: CC BY 2.5

Contributors: Transferred from en.wikipedia; transferred to Commons by User:Sfan00_IMG using CommonsHelper.

Original artist: User:Spinningspark.

• File:Wikipedia_multilingual_network_graph_July_2013.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/5b/Wikipedia_

multilingual_network_graph_July_2013.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Computermacgyver

• File:Wikiquote-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/fa/Wikiquote-logo.svg License: Public domain

Contributors: ? Original artist: ?

• File:Wikisource-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/4/4c/Wikisource-logo.svg License: CC BY-SA

3.0 Contributors: Rei-artur Original artist: Nicholas Moreau

• File:Wikiversity-logo-Snorky.svg Source: https://upload.wikimedia.org/wikipedia/commons/1/1b/Wikiversity-logo-en.svg License:

CC BY-SA 3.0 Contributors: Own work Original artist: Snorky

• File:Wikiversity-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/91/Wikiversity-logo.svg License: CC BY-SA

3.0 Contributors: Snorky (optimized and cleaned up by verdy_p) Original artist: Snorky (optimized and cleaned up by verdy_p)

• File:Wiktionary-logo-en.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/f8/Wiktionary-logo-en.svg License: Public

domain Contributors: Vector version of Image:Wiktionary-logo-en.png. Original artist: Vectorized by Fvasconcellos (talk · contribs),

based on original logo tossed together by Brion Vibber

128

CHAPTER 9. VERTEX (GRAPH THEORY)

9.5.3

Content license

• Creative Commons Attribution-Share Alike 3.0