INDEX
A
A/B testing, 25, 85, 140
A3 problem solving, 191
acceptance tests, 44, 54
accidents, in complex systems, 30, 39
Agile development
innovations and, 86-87
measuring productivity in, 12-13
reports on current state of, 135
Agile Manifesto, 41, 49, 75
Allspaw, John, xxiv
Almeida, Thiago, 90
Amazon, 5
moving to SOA, 66
Web Services, 71, 93
analysis of variance (ANOVA), 229
Anita Borg Institute, 114
anonymity, in surveys, 165
anxiety, 89
applications. See software
architecture, 59-68
correlated with delivery performance, 60-61, 216
deployability and testability of, 61-62
loosely coupled, 48, 62-65, 91, 204, 216
making large-scale changes to, 62
microservices, 217
service-oriented, 63
Asberg, Marie, 95
automated testing. See test automation
automation, given to computers, 109
average variance extracted (AVE), 226
B
bad data, 163-165
basic assumptions, 29-30
Bessen, James, 4
Bezos, Jeff, 183
bias, 171, 224-225
Blank, Steven, 83
Bogaerts, David, 184
branches
lifetimes of, 44-45, 215
short-lived, 56
Brynjolfsson, Erik, 4
bureaucracy, 35
bureaucratic culture, 31-32, 35, 43
burnout, 94-100
correlated with:
deployment pain, 97, 215
pathological culture, 97
measuring, 96
negatively correlated with:
delivery performance, 64
effective leadership, 98, 215
investments in DevOps, 98
Lean management, 77, 84, 87, 217
organizational performance, 215
trunk-based development, 215
Westrum organizational culture, 215
reducing, 46, 95, 97-100, 107
risk factors for, 95
C
capabilities (DevOps), 6-8, 215, 219
driving delivery performance, 9, 201-209
measuring, 5-6
Capital One, 5, 72
car manufacturing, 75
causal analysis, 140
causation, 136-138
census reports, 134-135
change advisory (approval) board (CAB), 78-81
affecting delivery performance, 217
change approval process, 78-81, 205
only for high-risk changes, 79, 217
change fail rate, 14, 17, 37
with continuous delivery, 48, 50
in performance analysis, 23, 141
Chi-square tests, 224
classification analysis, 228
clinical depression, 94
cluster analysis, 18, 140-141, 228
coaching, 188, 197
Cockcroft, Adrian, 107
collaboration, 36, 43, 45, 64
encouraging, 124
commercial off-the-shelf software (COTS), 60
common method bias/variance (CMB/CMV), 159, 224
communication
improving, 71
inspirational, 117
composite reliability (CR), 226
conferences, 123
configuration drift, 93
configuration management, 44
confirmatory factor analysis (CFA), 225
constructs, 33, 37
impacting each other, 211, 227
continuous delivery (CD), 41-57
capabilities of, 201-203
correlated with:
delivery performance, 48, 98
deployment frequency, 213
empowered teams, 48
identity, 48
Lean product management, 85, 217
loosely coupled architecture, 48, 62, 216
organizational culture, 47, 105-106, 218
trunk-based development, 215
impacts of, 39, 45-52, 56
implementing, 43-45
measuring, 47
practices of, 42-43, 52-56
reducing burnout, 107
continuous deployment, 16
continuous improvement, xxii-xxiii, 6-8, 20, 43
continuous integration, 41, 44-45, 171, 202
correlated with leadership, 220
reducing deployment pain, 91
convergent validity, 34, 150, 225-226
Conway, Melvin, 63
Corman, Josh, 72
correlation, 136-138, 211, 226-227
Cronbach’s alpha, 226
cross-functional teams, 123-124, 183
cross-sectional research design, xxi, 169, 171
culture, 29-40
changing, 39-40
high-performance, 195
high-trusting, 116
improving, 47-48, 123-127
measuring, 27, 32-35
modeling, 29-32
poor, 90, 92
typology of, 29-32, 147
customer feedback
gathered quickly, 15-16, 25, 42, 86
incorporating, 43, 84-87, 204
customer satisfaction, 24, 116
D
dashboards, 77, 206
data, 169-175
bad, 163-165
collecting and analyzing, 158-159, 169-172
system, 157-158, 160-162
trusting, 162-165
debugging someone else’s code, 66
decision-making, 36
focusing on, 109
delivery lead time. See lead time
delivery performance
analyzing, 18-23
correlated with:
change approval process, 78-81, 217
continuous delivery, 48, 98
deployment frequency, 216
investment in DevOps, 122, 213, 221
job satisfaction, 106, 108
Lean management, 77, 84, 87, 98, 217
organizational performance, 98
tempo/stability, 213
transformational leadership, 119-120, 219
trunk-based development, 215
version control, 162
Westrum organizational culture, 218
of high vs. low performers, 212
impacts of, 24-26, 70
improving, xxii-xxiii, 26-27, 46, 122
key capabilities of, 9, 201-209
measuring, 11-17, 37
negatively correlated with:
deployment pain, 213, 215
integrated environment, 216
not correlated with:
approvals for high-risk changes, 217
type of system, 60-61, 216
poor, 90, 92
predicting, 27, 31, 36-37
Deming, W. Edwards, 27, 42
departments
goals of, 43
moving between, 124
protecting, 31
deployment frequency, 14, 16, 37, 79
correlated with:
continuous delivery, 213
delivery performance, 216
version control, 213
in performance analysis, 141
deployment pain, 89-94
correlated with burnout, 97, 215
measuring, 91
negatively correlated with:
delivery performance, 64, 213, 215
organizational culture, 218
trunk-based development, 215
Westrum organizational culture, 215, 218
reducing, 46, 91, 93, 122
deployment pipeline, 45, 79-80
deployments
automated, 45, 80, 92, 109, 202
complex, 92-93
continuous. See continuous deployment
done independently, 62, 216
during normal business hours, 62, 92
descriptive analysis, 134-136
detractors, 103
DevOps movement, xxiv-xxv, 4, 169-172
achieving high outcomes of, 120
in all operating systems, 221
capabilities of, 5
correlated with:
delivery performance, 213, 221
job satisfaction, 109
organizational culture, 218
investment in, 98, 122-123, 213, 215, 218
reports on current state of, 135
value of, 9-10
women and minorities in, 110-113
DevOpsDays, 123
DevSecOps, 72
digital banking, 181
disability, 94
disaster recovery testing exercises (DiRT), 125
discipline, 197
discriminant validity, 34, 150, 226
diversity, 110-114, 220
Duncan’s multiple range test, 229
E
economic cycles, 24
efficiency
of high vs. low performers, 24, 212
impacting, 116
improving, 16
employee Net Promoter Score (eNPS), 102
correlated with:
customer feedback, 103, 218
employee identity, 219
leadership characteristics, 120
organizational performance, 218
workflow visibility, 219
employees
delegating authority to, 122
engagement of, 101-114
focusing on decision-making, 109
improving work, 98
loyalty of, 102-104, 218
sharing their knowledge, 126
empowered teams
choosing their own tools, 66-67, 126, 204, 207
leaders of, 220
enterprises
culture of, 35
performance of, 221
experimentation, 86-87, 107, 116, 205
correlated with leadership, 220
exploratory factor analysis (EFA), 136-138, 140, 225
Extreme Programming (XP), 41
F
Facebook, xxv, 5
failure demand, 52
failures
in complex systems, 39
punishing for, 126
restoring a service after, 17
fairness
absence of, 96
guaranteeing, 35
family issues, 94
fear, 30, 89
Federal Information Security Management Act (FISMA), 71
feedback
correlated with:
eNPS, 103, 218
leadership, 220
from:
customers, 15-16, 25, 42-43, 84-87, 204
infosec personnel, 56
production monitoring tools, 77
team members, 186
gathered quickly, 15-16, 25, 42, 85-86, 188
honest, and anonymity, 165
incorporating, 43, 84-87, 204
feelings, measuring, 165
figures, in this book, 25
Forrester reports, 5, 135
Fremont, California, car manufacturing plant, 39
G
game days. See disaster recovery testing exercises
Geek Feminism, 114
gender, 110-111, 113, 220
generative culture, 31-32, 35-36, 48, 206
correlated with:
employee identity, 107
Lean management, 77, 87
demonstrating new behaviors, 197
GitHub, 201
approving changes in, 80
Flow, 55
goals
accomplishing, 31
aligning, 106, 122
noncommercial, 24, 116, 212
for system-level outcomes, 43
Google, 5
Cloud Platform, 93
disaster recovery testing exercises at, 125
high-performing teams in, 37-39
20% time policy, 98, 125
greenfield systems, xxii, 8, 10, 60-61
H
Hammond, Paul, xxiv
harassment, 113
Harman’s single-factor test, 224
Heroku, 93
hierarchical clustering, 141
high performers, 9-10, 18-24
correlated with:
change approval process, 79
continuous improvement, 6, 43
deployment frequency, 65, 216
performance, 212
leadership in, 119-120, 219
not correlated with industry characteristics, 221-222
recommending their organization, 103, 219
time spent on:
integration, 215
manual work, 214
new vs. unplanned work/rework, 52, 213
security issues, 72, 215
working independently, 61-64
Honda, 75
Hoshin Kanri, 188
human errors, 39
hypotheses, 138-139
revisiting, 175
testing, 227
I
IBM
performance testing at, 160-161
THINK Friday program, 98
IDC reports, 135
identity, 101
correlated with:
continuous delivery, 48
culture, 107
eNPS, 104, 219
organizational performance, 105, 107-108, 218
trunk-based development, 215
improvement activities, 188
inclusion, 110
individual values, 96, 99
industry characteristics, 221-222
inferential predictive analysis, 138-139, 211, 227
informal learning, 125
information flow, 31, 36
information security (infosec), 69-73
built into daily work, 67, 72
at the end of software delivery lifecycle, 69
integrated into delivery process, 56, 203
shifting left on, 45, 70-72
in US Federal Government, 71
using preapproved tools for, 67, 70
ING Netherlands, 181-194
innovations, 86-87, 205
supporting, 116, 126
integrated environment, 62
delivery performance and, 216
integration time, 215
intellectual stimulation, 117
internal consistency, 226
internal websites, 206
Intuit, 72
inverse Conway Maneuver, 63
investment in DevOps, 98, 122-123, 213, 215, 218
J
job satisfaction, 36, 101, 207
correlated with:
ability to choose tools, 126
delivery performance, 106
Lean management, 217
organizational performance, 108-109
proactive monitoring, 127
trunk-based development, 215
Westrum organizational culture, 218
job stress, 94
job turnover, 94
K
kanban, 77
Kata, 191
Krishnan, Kripa, 125
L
lack of control, 96
latent constructs, 146-155, 225
lead time, 13-17, 37
correlated with:
change approval process, 79
test automation and version control, 213
measuring, 14
in performance analysis, 141
reducing, 116
leadership
coaching, 188
correlated with:
continuous integration, 220
delivery performance, 119, 219
empowered teams, 220
eNPS, 120
experimentation, 220
feedback, 220
loosely coupled architecture, 220
organizational culture, 120, 218
shift left on security, 220
test automation, 220
trunk-based development, 220
working in small batches, 220
high-performance, 179-198
measuring, 118-119
motivating, 115
reducing burnout, 215
servant, 118
supporting continuous improvement, xxii-xxiii
transformational, 115-121, 207
Lean management, 76-81
correlated with:
delivery performance, 98, 217
job satisfaction, 217
organizational culture, 217-218
organizational performance, 181
impacts of, 39, 115-116
reducing burnout, 98, 100, 107, 217
value streams in, 183
Lean manufacturing, 39, 75
Lean product management, 84-86
correlated with:
continuous delivery, 85, 217
generative culture, 87
performance, 84, 217
Westrum organizational culture, 217
reducing burnout, 84, 217
working in small batches in, 16, 84-88
Lean startup, 191
learning, 108, 187, 193, 207
creating environment for, 195
legacy code, xxii, 10, 23
Lietz, Shannon, 72
Likert-type scale, xxvi, 32-35, 133, 151
linear regression, 228
lines of code, optimal amount of, 12
LinkedIn, xxv
loosely coupled architecture, 62-65, 204
correlated with:
continuous delivery, 48, 62, 216
leadership, 220
reducing deployment pain, 91
low performers, 18-24
correlated with:
change approval process, 79
deployment frequency, 65, 216
mainframe systems, 60, 216
performance, 212
software outsourcing, 60, 216
leadership in, 119-120, 219
not correlated with industry characteristics, 221-222
recommending their organization, 103, 219
time spent on:
integration, 215
manual work, 214
new vs. unplanned work/rework, 52, 213
security issues, 72, 215
trading speed for stability, 10
loyalty, 102-104
correlated with organizational performance, 104, 218
M
mainframe systems, 8, 60, 216
Mainstream Media Accountability Survey, 143
managers
addressing employees’ burnout, 95-98
affecting organizational culture, 105-106, 122-123
leading by example, 197
supporting their teams, 123-127
manifest variables, 146
manual work, 214
marker variable test, 224
market share, 24, 181
of high vs. low performers, 212
Maslach, Christina, 95
maturity models, 6-7
McAfee, Andrew, 4
mean time to restore (MTTR), 14, 17, 37
correlated with:
change approval process, 79
monitoring and version control, 213
in performance analysis, 141
mechanistic analysis, 140
medium performers, 18, 23
correlated with:
delivery performance, 212
deployment frequency, 65, 216
leadership in, 219
not correlated with industry characteristics, 221-222
time spent on:
manual work, 214
new vs. unplanned work/rework, 214
menial tasks, 109
microaggressions, 113
microservices architecture, 217
Microsoft
Azure service, 93
continuous delivery at, 90
minimum viable product (MVP), 84
minorities, 112, 220
mission, 30-31
monitoring tools, 76, 206
correlated with MTTR, 213
feedback from, 77
proactive, 109, 127
motivation
increasing, 16
role of leaders in, 115
N
Net Promoter Score (NPS), 145
explained, 104
measuring, 102-104
Netflix, 5
new work, 23, 51, 213
during normal business hours, 98
Nijssen, Mark, 194
noncommercial performance
of high vs. low performers, 212
measuring, 24
predicting, 213
O
Obeya room, 182, 184, 190
Office Space, 163
operating systems, 221
opinions, measuring, 165
organizational culture
changing, 39-40
correlated with:
continuous delivery, 47, 105-106, 218
information flow, 31, 36
investment in DevOps, 218
leadership characteristics, 120, 218
Lean management, 218
Lean product management, 84
organizational performance, 218
retention/turnover, 167
improving, 48, 122
levels of, 29-30
measuring, 32-35, 146-152, 166
modeling, 29-32
negatively correlated with:
burnout, 97
deployment pain, 218
poor, 90, 92
typology of, 29-32
organizational performance
correlated with:
delivery performance, 98
diversity, 113
employee identity, 105, 107-108, 218
employee loyalty, 104, 218
eNPS, 218
job satisfaction, 108-109
Lean management, 77, 181
Lean product management, 84, 217
organizational culture, 218
proactive monitoring, 127
transformational leadership, 120, 122
of high vs. low performers, 212
key capabilities of, 9
measuring, 24-26
negatively correlated with burnout, 215
poor, 90, 92
predicting, 36-37, 87, 212-213
organizational values, 96, 99
organizations
change approval process in, 78-81
goals of, 24, 43, 106, 116, 122, 212
importance of software for, 4
inclusive, 110
overestimating their progress, 5
recommended by peers, 102-103, 219
remaining competitive, 3-4
outcomes, 7-8
outsourcing, 60, 216
overhead, reducing, 16
O’Reilly Data Science Salary Survey, 135
OWASP Top 10, 69
P
pair programming, 79, 205
Pal, Topo, 72
partial least squares regression (PLS), 211, 228
passives, 102
pathological culture, 30, 32
dealing with failures in, 39
leading to burnout, 97
patience, 197
Payment Card Industry Data Security Standard (PCI DSS), 79
Pearson correlations, 138, 226-227
peer review, 79, 205
correlated with delivery performance, 79, 217
perceptions, measuring, 165
performance
analyzing, 141
inspiring, 117
making metrics visible, 122
measuring, 11-27, 154
vs. stability, 17, 20
testing, 160-161
using to make business decisions, 76
performance-oriented culture. See generative culture
Pivotal Cloud Foundry, 71, 93
plan-do-check-act cycle (PDCA), 188
Poppendieck, Mary and Tom, 75
population, in data analysis, 134-135, 172-174
post hoc comparisons, 229
power-oriented culture. See pathological culture
predictive analysis, 140, 211, 227-228
PRINCE2, 75
principal components analysis (PCA), 225
proactive monitoring, 109, 127
problem solving, 191
product development, 83-88
production environment, manual changes to, 93
productivity, 24, 181
displaying metrics of, 76-77
of high vs. low performers, 212
increasing, 64, 116
measuring, 12, 64
profitability, 24, 181
Project Include, 114
Project Management Institute, 75
promoters, 102
Puppet Inc., xxiv, 172, 199
push polls, 143-144
Q
qualitative research, 132-133
quality
acknowledging achievements in, 117
building in, 10, 42
correlated with organizational performance, 24
displaying metrics of, 76-77
focusing on, 43
measuring, 50-52
monitoring, 206
quantitative research, 132-133
quantity of goods, 24
of high vs. low performers, 212
quarantine suites, 54
quick surveys, 145
R
randomized studies, 140
reciprocal model, 87, 106-107
Red Hat OpenShift, 93
referrals, 172
regression testing, 228
release frequency. See deployment frequency
releases, 16
reliability, 34, 151-152, 226
repetitive work, 43
research
cross-sectional, xxi, 169, 171
primary vs. secondary, 131-132
qualitative vs. quantitative, 132-133
in this book, xxii-xxiii, 141
restore time. See mean time to restore
retention, 166-167
return on investment (ROI), 24
rewards, insufficient, 96
rework
decreasing with trunk-based development, 215
of high vs. low performers, 50-51, 213-214
Ries, Eric, 83
right mindset, 195
Rijkhoff, Jan, 191
risk
reducing, 16
taking, 126
Risk Management Framework (RMF), 71
risk management theater, 81
rituals, 30
Rugged DevOps, 72
rule-oriented culture. See bureaucratic culture
S
sample, in data analysis, 135
Scheer, Liedewij van der, 184
Schuyer, Jael, 184
Schwartz, Mark, 35
Scrum rituals, 191
security
built into daily work, 67, 72
shifting left on, 45, 70-72, 91, 203, 220
time spent on remediating of, 215
Seddon, John, 52
segregation of duties, 79
Shook, John, 39
short-lived branches, 44, 56, 215
sick time, 94
simple linear regression, 211
single factors, 224
small batches, 16, 25, 42, 84-88, 205, 220
Smit, Jannes, 181, 187, 192-193
snowball sampling, xxv, 172-174, 224
social norms, 30
software
changes in, 79-80
delivery of. See delivery performance
importance of, for organizations, 4-5
mainframe, 60, 216
outsourcing, 26, 60, 216
perceived quality of, 50
strategic, 26
Spurious Correlations, 137
stability
change approval process and, 79
focusing on, 43
increasing, 64
of high vs. low performers, 10
vs. performance, 17, 20, 213
in performance analysis, 141
trends for, over years, 22
stand-ups, 186-188
startups
culture of, 35
performance of, 221
State of DevOps Report, xxiv, 158, 199
statistical data analysis, 133-135
storyboards, 77
Subversion, 201
suicide, 94
surveys, 33, 143-145
anonymity in, 165
checked for bias, 159
with obvious agenda, 143-144
preparation of, 223
reasons to use, xxv, 157-167
trusting data reported in, 146-155, 162-165
weakness of questions in, 145
system data, 157-158, 160-162
system health monitoring, 152-153, 206
systems of engagement, 8, 60-61
systems of record, 8, 60-61
T
Target, 5
target population, 172
team experimentation, 86-87, 107, 116, 205
teams
choosing their own tools, 48, 66-67, 109, 126, 204, 207
code review in, 79, 205
collaborating, 13, 34, 36, 64, 207
cross-functional, 63, 123-124, 183
demotivating, 107
diversity in, 110-113, 220
having authority to make changes, 62, 78-81, 84
having time for new projects, 98, 106, 123
high-performing, 37-39
leaders of, 98, 115, 220
productivity of, 12-13, 64-65
recommended by peers, 103, 219
size of, 64-65
supporting, 123-127
transforming from within, 197
technical debt, 23, 123
Technology Transformation Service, 5, 26
technology, importance of, 4-5
tempo, 17
increasing, 64
of high vs. low performers, 10
vs. performance, 213
in performance analysis, 141
trends for, over years, 21
Ten Berge, Ingeborg, 184
test automation, 44-45, 53-55, 91, 109, 202
correlated with:
lead time, 213
leadership, 220
test data management, 45, 55, 203
reducing deployment pain, 91
test-driven development (TDD), 41, 54
tests
continuous in-process, 52
integrated environment for, 62
in version control, 44
3M, side projects in, 125
time to restore service. See mean time to restore
tools
chosen by teams, 48, 66-67, 109, 126, 204, 207
preapproved by infosec team, 67, 70
Toyota, 16, 75
training budget, 123, 125
transformational leadership, 115-121, 207
trends, in delivery performance, 20-22
Trump, Donald, 143
trunk-based development, 44-45, 55-56, 202, 215
correlated with leadership, 220
reducing deployment pain, 91
trust, 35-36
between teams, 124
t-tests, 224
Tukey’s test, 229
turnover, 166-167
Twitter, xxiv-xxv
Two Pizza Rule, 183
U
underrepresented minorities, 112, 220
unequal pay, 113
unit tests, 44
unplanned work
capacity to absorb, 13
of high vs. low performers, 23, 50-51, 213-214
US Digital Service, 5, 26
US Federal Government, 5, 26, 35
infosec in, 71
utilization, 13
V
validity, 34, 150-152, 225-226
value streams, 84-86, 183
values, 30
aligning, 99, 106-107, 118
conflicts of, 96
correlated with eNPS, 104
Vanguard Method, 52
velocity, 12-13
vendor reports, 135
version control, 44-45, 201
approving changes in, 80
automated tests in, 44
correlated with:
delivery performance, 162
deployment frequency, lead time, MTTR, 213
keeping application configuration in, 53
measuring capability of, 46
reducing deployment pain, 91
Vigen, Tyler, 137
virtuous cycle. See reciprocal model
visibility, 84
into code deployments, 91
vision, 117
visual displays, 76-77, 182, 184, 186, 206
Vos, Jordi de, 188
W
Wardley mapping method, 26
Wardley, Simon, 26
Weinberg, Jerry, 50
Westrum organizational culture, 29-32
correlated with:
job satisfaction, 218
Lean management, 217
performance, 218
trunk-based development, 215
measuring, 147, 151
negatively correlated with:
burnout, 215
deployment pain, 215, 218
outcomes of, 36-37
Westrum, Ron, 30, 43, 206
Wickett, James, 69, 72
Wijnand, Danny, 190
Wolhoff, Paul, 184
women, 110-111, 113, 220
work
improving, 98
making more sustainable, 49
meaningful, 207
organizing, 77
in small batches, 16, 25, 42, 84, 88, 205, 220
correlated with leadership, 220
work in progress (WIP), 206
limiting, 76-77, 184
work overload, 96
work/life balance, 90
workflow
correlated with eNPS, 104, 219
making visible, 76-77, 84, 204
workplace environment, 96
Y
Yegge, Steve, 66