Subject: Back from the TRUST retreat.
The retreat turned out to be pretty good. Despite some presentations that weren't relevant to me, or were made of meaningless boxes and arrows, or consisted of people just reading their slides (ugh), there were a few genuinely interesting talks, and i got to meet some interesting people.
Talks: Pam Samuelson described California's initiative to conserve energy by developing sensors that give people better feedback on their energy use and cost. That sounded like a neat way to help take better care of the environment, educate people about consumption, develop cool technology, and save people money all at the same time. Emin Gun Sirer mentioned a new operating system he's building called Nexus. There's nothing online yet — i'll have to wait for a paper to come out in SOSP. Cristian Cadar gave an intriguing talk about Execution-Generated Tests, a technique for getting programs to crash themselves by running them in a special way that generates test cases. Dawn Song seems to be attacking the worm problem very ambitiously in about seven different ways at once — automatic exploit detection, dynamic taint analysis, countermeasures for polymorphic worms, automatic signature generation, and a dissemination system for "antibodies" such as worm signatures, and that's just for a start.
People: Fred Schneider asked some good questions, and i later had an interesting conversation with him about whether money was better spent defending against information warfare or chemical/biological attacks. Mike Reiter is working on a system for controlling real-world devices using mobile phones. He can unlock his office door using his mobile phone, which in itself isn't that amazing, but the interesting part is that he can delegate the power to unlock that door to someone else, using his phone, from thousands of miles away. He's looking for user interface expertise. Hakim Weatherspoon and i had a good talk on the way back about event-based architecture and about collaborating with academics.
I suck at keeping my big mouth shut. During the panel on education, i asked how we were going to shift from fixing things that are broken to building things right. As long as a small number of security experts are desperately patching bugs while armies of programmers are out there writing broken software, we'll never get to a world of reliable and safe computers. My remark was later characterized as a suggestion that we throw everything out and start over, but i didn't intend to call for anything so drastic. We've got to do both at once: fix what's broken today while steering toward eventually getting it right.
I interrupted John Mitchell unnecessarily with a comment that "there is no system that doesn't have a user", but he was very gracious about it. The other time i piped up was during a presentation by a Microsoft person. His slides listed lots of security projects that were going on at Microsoft; among the fast-flying bullets was one that said "Behavior Blocking." I hadn't heard this term before, but i had a guess as to what it meant, so i asked, "What is behaviour blocking?"
He explained that it had to do with monitoring the activities of a program to see if it was about to do something potentially dangerous. If it looks like something dangerous is about to happen — something a virus might do — you stop the program.
So then i asked, "If you know the action is dangerous, why grant the power to do that action in the first place?"
He replied, "Talk to Bill Gates."
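For concreteness, the behavior-blocking idea he described (watch what a program tries to do, and stop it when an action looks virus-like) might be sketched as a toy monitor like this; the policy, the action names, and the "program" are all hypothetical illustrations, not anything Microsoft actually described:

```python
# Toy sketch of behavior blocking: a monitor watches each action a
# program attempts and stops the program when an action looks like
# something a virus might do. Everything here is illustrative.

DANGEROUS_ACTIONS = {"mass_mail", "write_boot_sector", "patch_system_dll"}

class BehaviorBlocked(Exception):
    """Raised by the monitor to stop the program."""

class Monitor:
    def __init__(self):
        self.log = []

    def request(self, action):
        # Every action goes through the monitor before it happens.
        if action in DANGEROUS_ACTIONS:
            self.log.append(("blocked", action))
            raise BehaviorBlocked(action)
        self.log.append(("allowed", action))

def suspicious_program(monitor):
    monitor.request("open_document")   # harmless, allowed
    monitor.request("mass_mail")       # virus-like, blocked here
    monitor.request("delete_files")    # never reached

m = Monitor()
try:
    suspicious_program(m)
except BehaviorBlocked as e:
    print("blocked:", e)               # prints "blocked: mass_mail"
print(m.log)
```

Note what the sketch takes for granted: the program already has the power to mass-mail, and the monitor just tries to catch it in the act.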
(There's an entry at Usable Security about the misconception that leads Microsoft folks to forget that privileges can be limited.)
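The alternative implied by my question is least privilege: don't grant the dangerous power in the first place, so there's nothing to block. A minimal hypothetical sketch, using a capability object that can only read one file:

```python
# Hypothetical sketch of the alternative: instead of granting broad
# authority and blocking bad behavior after the fact, hand the program
# only a narrow capability, so dangerous actions are simply impossible.
import os
import tempfile

class FileReader:
    """A capability that can read exactly one file and do nothing else."""
    def __init__(self, path):
        self._path = path

    def read(self):
        with open(self._path) as f:
            return f.read()

def word_count(reader):
    # This code receives only a FileReader; it has no way to write
    # files, send mail, or touch anything beyond that one file.
    return len(reader.read().split())

with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("least privilege in action")
    path = f.name
print(word_count(FileReader(path)))    # prints 4
os.unlink(path)
```

Here there's no need to guess whether `word_count` is "behaving dangerously": it was never given the authority to do anything dangerous.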