I recently saw a video on Channel 9 that provides a great background on parallel programming.
The amount of information available to developers from Channel 9 is incredible. Naturally there is good coverage of Microsoft technologies, but there is also a lot of material that has nothing to do with Microsoft.
The video is an interview with Burton Smith. I recommend it to anyone.
If you don't want to watch the interview, or don't have time for the whole thing, here is how I would break down the talk:
(0:30) Hardware for mainstream computing is changing in a way that makes ideas that were once only relevant to supercomputers important to the mainstream.
(1:52) Multi-core refers to 8 or fewer processors; many-core refers either to more than 8 processors or to utilizing different types of processors, such as CPUs and Graphics Processing Units (GPUs).
(3:08) In the early days, supercomputers were powerful machines used for general-purpose tasks such as transaction processing.
(4:05) The introduction of vector processing made supercomputing much more specialized. Programs with a lot of branching could not benefit from vector processing, while tasks that performed the same operation across a whole set of data could be written to exploit instruction-level parallelism and saw big gains.
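To make that distinction concrete, here is a small C++ sketch of my own (the function names and data are illustrative, not from the talk). The first loop applies one uniform operation across the data, which is the pattern vector hardware rewards; the second branches on every element, which is the pattern that could not benefit.

```cpp
#include <cstddef>
#include <iostream>

// Same operation applied uniformly to every element: no branching,
// so a compiler can map the loop onto vector (SIMD) instructions.
void scale(float* data, std::size_t n, float factor) {
    for (std::size_t i = 0; i < n; ++i)
        data[i] *= factor;
}

// Data-dependent branching on every element: each iteration may take
// a different path, so the loop resists vectorization.
int branchy(const int* data, std::size_t n) {
    int acc = 0;
    for (std::size_t i = 0; i < n; ++i) {
        if (data[i] % 3 == 0)      acc += data[i];
        else if (data[i] % 5 == 0) acc -= data[i];
        else                       acc ^= data[i];
    }
    return acc;
}

int main() {
    float samples[] = {1.0f, 2.0f, 3.0f, 4.0f};
    scale(samples, 4, 2.0f);
    int values[] = {3, 5, 7, 9, 10};
    std::cout << samples[0] << ' ' << branchy(values, 5) << '\n';
}
```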
(5:37) After vector processing, the next big change was distributing computations among many smaller computers, which is the state of supercomputing today. Distributed parallel computing works by having the different machines send messages to each other, though within each individual machine there may not be any parallel processing going on. In practice this made supercomputing even more specialized than vector processing did.
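That message-passing style is what MPI, the de facto standard library in distributed supercomputing, provides. Below is a minimal sketch of my own (the payload is illustrative; compile with mpic++ and run with mpirun -np 2): two processes that share no memory cooperate purely by sending and receiving messages.

```cpp
#include <mpi.h>
#include <cstdio>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);

    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        int payload = 42;
        // Each process runs its own (possibly sequential) program and
        // cooperates with the others only by exchanging messages.
        MPI_Send(&payload, 1, MPI_INT, /*dest=*/1, /*tag=*/0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        int payload = 0;
        MPI_Recv(&payload, 1, MPI_INT, /*source=*/0, /*tag=*/0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        std::printf("rank 1 received %d\n", payload);
    }

    MPI_Finalize();
    return 0;
}
```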
(7:17) Now that mainstream computers are being built with multiple cores, the goal is to make parallel processing very general.
(8:56) In the 80s people realized that the physical limits of a single core would be reached. Their research didn't lead to success in the marketplace, but we can learn today from the knowledge they gained.
(10:00) It is not only computer architecture you have to worry about; one of the major problems is programming languages.
(10:45) A decision made at the foundation of computing that is inhibiting attempts at parallelization is the idea of a variable.
(11:32) In a computer that executes one instruction at a time, you know the order in which the instructions will be done.
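To illustrate both of those points, here is a small C++ sketch of my own (not from the talk): two threads update one shared variable, and because the instructions no longer execute in a single known order, updates get lost.

```cpp
#include <iostream>
#include <thread>

int counter = 0;  // a shared mutable variable

void bump() {
    for (int i = 0; i < 100000; ++i)
        ++counter;  // read-modify-write; not atomic, so updates can be lost
}

int main() {
    std::thread a(bump);
    std::thread b(bump);
    a.join();
    b.join();
    // Executed one instruction at a time this would always print 200000;
    // run in parallel it typically prints less, and a different value each run.
    std::cout << counter << '\n';
}
```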
(14:28) Smith gives an analogy for parallel computing by describing a development team, and talks about the implications of an office full of programmers working together to write a piece of software. He describes the risks of mutability and the strengths of messaging and immutability.
(16:25) Tweaking C++ a little bit isn't sufficient for solving parallel problems. The transformation needed is much more radical and will change programming styles forever.
(20:33) The dominant programming languages today require the reading and writing of shared data. Programs written in them are challenging to get right and difficult to maintain as a result.
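The conventional fix is to guard the shared data with a lock, as in the sketch below (again my own illustration, not from the talk). It restores correctness, but the rule "take this mutex before touching that variable" lives only in the programmer's head, and every future writer of the data must remember it, which is one source of the maintenance burden described above.

```cpp
#include <iostream>
#include <mutex>
#include <thread>

int counter = 0;
std::mutex counter_mutex;  // every writer must know, by convention, to take this lock

void bump_locked() {
    for (int i = 0; i < 100000; ++i) {
        std::lock_guard<std::mutex> guard(counter_mutex);  // serialize the update
        ++counter;
    }
}

int main() {
    std::thread a(bump_locked);
    std::thread b(bump_locked);
    a.join();
    b.join();
    std::cout << counter << '\n';  // now reliably 200000
}
```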
(21:00) Software Transactional Memory is one of the alternatives being pursued today.
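As a taste of what that looks like, GCC ships an experimental transactional memory extension (compile with g++ -fgnu-tm). The sketch below is my own and the account variables are illustrative.

```cpp
#include <iostream>

int balance_a = 100;
int balance_b = 0;

void transfer(int amount) {
    // Either both updates become visible or neither does; conflicting
    // concurrent transactions are detected and retried by the TM runtime.
    __transaction_atomic {
        balance_a -= amount;
        balance_b += amount;
    }
}

int main() {
    transfer(40);
    std::cout << balance_a << ' ' << balance_b << '\n';  // 60 40
}
```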
(21:46) SQL is a functional programming language that a lot of people are able to use and understand. When people wish to modify data they use transactions.
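The same point in code: a sketch of my own using SQLite's C API (the table and amounts are made up; link with -lsqlite3). The queries are declarative and side-effect free, and the modifications are grouped into one isolated, all-or-nothing transaction.

```cpp
#include <sqlite3.h>

int main() {
    sqlite3* db = nullptr;
    sqlite3_open(":memory:", &db);

    sqlite3_exec(db,
        "CREATE TABLE accounts(name TEXT, balance INTEGER);"
        "INSERT INTO accounts VALUES('a', 100), ('b', 0);",
        nullptr, nullptr, nullptr);

    // All modifications grouped into one isolated, all-or-nothing unit:
    // concurrent readers never see the money in both accounts at once.
    sqlite3_exec(db,
        "BEGIN TRANSACTION;"
        "UPDATE accounts SET balance = balance - 40 WHERE name = 'a';"
        "UPDATE accounts SET balance = balance + 40 WHERE name = 'b';"
        "COMMIT;",
        nullptr, nullptr, nullptr);

    sqlite3_close(db);
    return 0;
}
```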
(22:46) Excel is a functional programming environment. Every cell contains either a value or an expression that is a function of other values.
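A toy rendering of the spreadsheet model in C++ (my own, with made-up cell names): each "cell" is either a value or a pure function of other cells, so any evaluation order consistent with the dependencies gives the same answer.

```cpp
#include <iostream>

int main() {
    double a1 = 2.0, a2 = 3.0;              // value cells
    auto b1 = [&] { return a1 + a2; };      // like =A1+A2
    auto c1 = [&] { return b1() * 10.0; };  // like =B1*10
    std::cout << c1() << '\n';              // 50
}
```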
(26:28) Even in a parallel functional program, when state is modified it is still done sequentially. The difference is the imperative state changes become isolated pieces of a program.
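A C++17 sketch of that shape (my own illustration; with GCC the parallel algorithms also need -ltbb): the bulk of the work is a pure function applied in parallel, and the imperative state change is a small, isolated, sequential step at the end.

```cpp
#include <algorithm>
#include <execution>
#include <iostream>
#include <vector>

int main() {
    std::vector<double> values(1'000'000, 1.5);

    // Pure function applied in parallel: no shared mutable state.
    std::transform(std::execution::par, values.begin(), values.end(),
                   values.begin(), [](double v) { return v * v; });

    // The imperative state change is an isolated sequential piece
    // at the edge of the program.
    double total = 0.0;
    for (double v : values) total += v;
    std::cout << total << '\n';
}
```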
(30:54) Databases are an example of a persistent store that has to be accessible in parallel, with changes done in isolation.
(32:43) Databases combine transactions with functional programming. That is the most successful parallel programming in the world.
(33:03) "There are only 3 ways to do parallel computing: 1. Get an expert. 2. Use functional Language. 3. Use transactions."
(33:20) Based on our experience with databases, transactional memory should be able to scale very well, but just adding it to C++ probably won't be enough.
(33:40) Changes have to be done in isolation, but there are many ways to get that isolation, in either space or time.
-- After that is a long discussion about different levels of programming and making parallelism more accessible. It is a fascinating discussion, but it is tougher to tease out parts of it for this blog.