Disruption is the norm ... and it's accelerating
A conversation with Donald P. Greenberg
An internationally recognized pioneer in computer graphics, Greenberg
has authored hundreds of articles and served as a teacher and mentor
to many prominent computer graphics executives, faculty, artists, and
animators. At Johnson, he teaches a course in Disruptive Technologies.
Here, he shares a few of his thoughts and ideas about the exciting and
disruptive technologies that continue to redefine myriad industries in the digital age.
I really was turned on years ago by Clayton Christensen's "disruptive technology" article, which he published in the Harvard Business Review in 1995. In that article, he talked about the rate of change of technology and how some products or processes that didn't work or couldn't compete at the time would ultimately undermine existing strategies because their rate of performance growth was so steep.
When I read this, I was on sabbatical working with Hewlett-Packard on their first digital camera. I took a look at the technology with respect to the density of transistors on a chip, drawing on Moore's Law (the observation that the number of transistors on a chip doubles roughly every two years, with a corresponding increase in processing power). If Moore's Law held true, digital image sensors would totally disrupt the photography industry. The arguments against it were that current implementations didn't have high enough resolution, cost too much, and couldn't match the performance of film. But I started to plot the trajectories of the technology, and of course now everybody has a digital camera. Only they don't even call it a digital camera; they call it a cell phone.
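To make the "plot the trajectories" exercise concrete, here is a minimal sketch in Python. Every number in it (the film-equivalent benchmark, the 1995 starting resolution, the two-year doubling period) is a hypothetical stand-in, chosen only to show how an exponentially improving technology eventually crosses a fixed incumbent benchmark.

```python
# A back-of-the-envelope sketch of the "plot the trajectories" exercise.
# All starting values are hypothetical, chosen only to illustrate the method:
# an exponentially improving technology eventually crosses a flat incumbent benchmark.

FILM_EQUIVALENT_MEGAPIXELS = 20.0   # assumed fixed benchmark for 35mm film
SENSOR_MEGAPIXELS_1995 = 0.3        # assumed starting point for early digital sensors
DOUBLING_PERIOD_YEARS = 2.0         # Moore's-Law-style doubling assumption

def sensor_resolution(year):
    """Projected sensor resolution in megapixels for a given year."""
    elapsed = year - 1995
    return SENSOR_MEGAPIXELS_1995 * 2 ** (elapsed / DOUBLING_PERIOD_YEARS)

for year in range(1995, 2016):
    projected = sensor_resolution(year)
    note = "  <-- crosses the film benchmark" if projected >= FILM_EQUIVALENT_MEGAPIXELS else ""
    print(f"{year}: {projected:6.1f} MP{note}")
```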
Break down the silos to see connections and possibilities.
In the late 1970s and early 1980s, our only real research funding
for computer-aided design (CAD) came from the manufacturing
industries: Grumman Aircraft, Boeing, General Electric, Ford,
General Motors, Corning, etc. The algorithms developed at that
point in time in many CAD communities were superb. But those
same algorithms could be used for a totally different venue. In a
sense, we were creating techniques in search of a problem.
An algorithm is really just a recipe for how to mathematically construct almost anything. For example, a graphics algorithm expresses how to model a shape or mathematically render a given geometry. I can take a series of points and pass a curved line, called a spline, through all of those points. I can then pass a surface through all of the created splines to represent the complex geometry of a specific car body. But using those same algorithms, I can describe the organs of a human body.
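As a rough illustration of the spline idea, here is a minimal sketch using NumPy and SciPy's cubic-spline interpolation. The sample points are invented; in practice they might come from a scanned car-body profile or a slice of a medical image.

```python
# A minimal sketch of spline interpolation: fit a smooth curve that passes
# exactly through a handful of sample points. The points below are made up;
# in practice they might come from a car-body profile or a CT slice.
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical (x, y) sample points along a profile.
x = np.array([0.0, 1.0, 2.5, 4.0, 5.5, 7.0])
y = np.array([0.0, 0.8, 1.1, 0.9, 1.4, 0.2])

spline = CubicSpline(x, y)            # smooth curve through every point
x_fine = np.linspace(x[0], x[-1], 50)
y_fine = spline(x_fine)               # evaluate the curve between the samples

print(y_fine[:5])
```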
I have had the great good fortune to work with my son, who was an endovascular aortic surgeon at the Cleveland Clinic. We took exactly the same algorithms from the automotive and aerospace industries to model the human aorta, used 3D printing technology, and then he developed personalized stents for individuals who then needed only a one- or two-day stay in the hospital instead of two weeks.
As an architect, I always wanted to use computer-aided design for the creation of the built environment. These same algorithms are also used in architecture, for example by architect Frank Gehry for the design of the Guggenheim Museum in Bilbao and the Disney Concert Hall in Los Angeles. In our current research, we are using the same technology to reduce the computation time for energy simulations for the design of green environments.
These approaches are also used to model characters in computer-animated films. Thus, although the techniques are similar, we have former students who now work in four disparate industries: engineering, architecture, entertainment, and health care. Wouldn't it be nice if people from these different disciplines talked to each other?
In today's digital age, we've reached the stage where the convergence of different disciplines will provide even greater opportunities. These are the most exciting projects. A simple example: I cannot think of working in neuroscience today without collaborating with imaging scientists, computer graphics experts, magnetic resonance technologists, and biologists. So you mix computer scientists with engineers, scientists, business students, and doctors, and then you can go after some really difficult problems. But you can't do this type of research in isolation.
Understand the technologies well enough to comprehend what they can do.
While business leaders don't need to understand all the intricacies of high technology, they do need to understand enough to see what it might be capable of doing. They don't have to program or try to understand the math, but they should be able to read the science section of The New York Times. They should be able to read Bloomberg News. They should be able to read U.S. News and World Report or The Economist, and they should be able to read Scientific American. If they understand that much, and if they surround themselves with the right technical people, they'll know enough to be able to ask the right questions.
Be aware of performance trajectories and systems if you want to anticipate the future.
When I teach business students, I try to convey two important things to be
aware of when looking ahead to the future. One is
to understand the performance trajectories over time of the different
technologies within a particular segment of the business world, so
they can understand where the opportunities are going to be. This
is particularly relevant with the computer industry and its rate of change.
The other is to understand that everything is a system and that the relationships among the necessary components within a system matter. For example, cell phones obviously wouldn't work without computer chips, but they also require wireless technology that can interface with the higher-bandwidth fiber networks that carry the communication.
Last fall, when I gave a lecture at a conference in Cincinnati, I started off by saying: "How do you predict what to do in an industry that, from its start in roughly 1965 to 2025, 60 years later, will have improved its performance by a factor of one trillion?" Yet I've had the opportunity to live through most of this. I'm starting a new course this January in virtual reality. Why? Because virtual reality and augmented reality will disrupt the communications, television, and travel industries by changing the way people communicate, watch live sports events, and visit foreign places. Most people want to say, "Well, it's not going to happen." But I'm used to that. And by the way, I have support from nearly all the major companies involved.
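As a back-of-the-envelope check on that trillion-fold figure, assume performance doubles roughly every 18 months (one common reading of Moore's Law): over 60 years that is about 40 doublings, and 2^40 is roughly 1.1 trillion.

```python
# A rough sanity check on the "factor of one trillion" figure, assuming
# performance doubles roughly every 18 months (a common reading of Moore's Law).
years = 2025 - 1965          # 60 years
doubling_period = 1.5        # years per doubling (assumption)
doublings = years / doubling_period
improvement = 2 ** doublings
print(f"{doublings:.0f} doublings -> improvement factor of about {improvement:.2e}")
# 40 doublings -> improvement factor of about 1.10e+12, i.e. roughly one trillion
```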
New technology raises new concerns and questions — and presents new responsibilities.
I try to raise deep questions for which there are no answers yet. Take
privacy, for example. Until recently, it was perfectly legal for me
to fly a drone over your property and take pictures, as long as the
drone didn't land on your property. So I showed my class a picture of a
drone outside somebody's bedroom.
Another concern is electronic trading. What are the regulations for electronic trading? Is it really fair to place my computer five meters closer to where the signal comes in at the exchange so that I have a trading advantage? Maybe it's only a microsecond or a millisecond — but I can execute a thousand trades in that millisecond, before anybody else gets the same information.
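For a sense of the time scales involved, here is a rough illustrative calculation; the signal speed is an approximation for light traveling in optical fiber, and the trade rate is the figure from the example above.

```python
# An illustrative calculation of why tiny distances and times matter in
# electronic trading. The trade rate is taken from the example above;
# the signal speed is an approximation for light in optical fiber.
SIGNAL_SPEED_M_PER_S = 2.0e8      # roughly two-thirds the speed of light in vacuum
distance_advantage_m = 5.0        # being five meters closer to the exchange feed

time_saved_s = distance_advantage_m / SIGNAL_SPEED_M_PER_S
trades_per_ms = 1000              # "a thousand trades in that millisecond"

print(f"5 m closer saves about {time_saved_s * 1e9:.0f} nanoseconds of signal travel time")
print(f"At {trades_per_ms} trades per millisecond, each trade takes about {1000 / trades_per_ms:.0f} microsecond(s)")
```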
Have the courage to take a risk.
Technology will continue to advance no matter what; understanding the performance trajectories of technologies will instill the courage to take the risks that are necessary and vital to creating disruptive innovations. Failure should not be a stigma! There is big value in failure.