This was a big deal in 2002. It got on the front page of Slashdot. I was a student at VUW at the time; someone got a cake with the Slashdot front page printed on icing and Robert Biddle did a magnificent job cutting it up.
> Postmodern computer science holds that no requirements can be both complete and consistent: you have to pick one.
This makes no sense whatsoever. It's up to people to make up their minds about what they want. If they can't, then maybe they don't deserve to have programs written for them?
> Formal analysis can be used to show the absence of bugs, but never to show the correctness of the specification.
If the specification is wrong, that's the business analyst's problem. The programmer may work under the assumption that the specification is correct.
> Programs can exhibit “faults in construction” that would be forbidden by a modernist approach.
> This makes no sense whatsoever. It's up to people to make up their minds about what they want. If they can't, then maybe they don't deserve to have programs written for them?
It makes perfect sense. To account for every edge case or use case of a requirement, you'd have to create many different interfaces/points of contact to suit every need. For example, if I want to create a dating app, the interface and underlying code for a mobile device will break consistency with the interface and underlying code of a web page. You could be consistent with a translation layer, but then the interface or code will be incomplete for that particular use case.
> Formal analysis can be used to show the absence of bugs, but never to show the correctness of the specification.
Working in a horizontally stratified organization where the programmer doesn't learn the business domain is ill-advised. Often the programmer ends up with the most context and understanding of the business processes as they are being formalized into code. Additionally, your understanding of the specification may be flawed, or the specification may simply be written incorrectly, and this can never be formally analyzed.
> Programs can exhibit “faults in construction” that would be forbidden by a modernist approach.
In other words, a program can have a bug that serves as a feature, or behavior that is wonky but tolerated.
> the interface and underlying code for a mobile device will break consistency with the interface and underlying code of a web page.
Are you talking about some notion of “consistency” that has nothing to do with logical consistency?
> Additionally, your understanding of the specification may be flawed, or the specification may simply be written incorrectly, and this can never be formally analyzed.
The whole point of making a formal specification is to reduce the potential for misunderstanding. In fact, the only way to misunderstand a formal specification is to be mathematically incompetent.
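To make "formal specification" concrete, here's a minimal sketch (my own illustrative example, not from the paper): the spec for a sort routine stated as two checkable properties. A disagreement about what "sorted" means then shows up as a failed check, not as an argument over prose.

```python
def satisfies_sort_spec(inp, out):
    """A tiny formal specification of sorting, stated as two
    machine-checkable properties:
      1. the output is ordered, and
      2. the output is a permutation of the input.
    """
    ordered = all(out[i] <= out[i + 1] for i in range(len(out) - 1))
    permutation = sorted(inp) == sorted(out)
    return ordered and permutation

# Any candidate implementation can be judged against the spec:
assert satisfies_sort_spec([3, 1, 2], [1, 2, 3])        # correct output
assert not satisfies_sort_spec([3, 1, 2], [1, 2])       # drops an element
assert not satisfies_sort_spec([3, 1, 2], [2, 1, 3])    # not ordered
```

Note that the spec says nothing about whether sorting was what the customer actually wanted, which is exactly the distinction the quoted line draws between absence of bugs and correctness of the specification.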
---
Reply:
> Yes, the section is called "On Requirements" and the paper is about building programs, not abstract or theoretical computer science/logic.
Yet requirements are logical artifacts.
> do[sic] to a level of randomness and misunderstanding of our natural world,
There's nothing to understand about the natural world. You only need to know what you want the program to do. In other words, you need to consider every possibility, and make up your mind about how you want it to be handled.
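"Consider every possibility" can be read operationally: enumerate the input cases and give each an explicit, decided outcome, so nothing is left unhandled. A hypothetical sketch (all names and thresholds are illustrative, not from the discussion):

```python
def handle_age_input(raw):
    """Decide every input case up front rather than leaving
    'unexpected' inputs to chance. Thresholds are illustrative."""
    try:
        age = int(raw)
    except (TypeError, ValueError):
        return "error: not a number"
    if age < 0:
        return "error: negative age"
    if age > 150:
        return "error: implausible age"
    return f"ok: {age}"

assert handle_age_input("42") == "ok: 42"
assert handle_age_input("abc") == "error: not a number"
assert handle_age_input("-5") == "error: negative age"
```

The point is not the particular rules but that each branch reflects a decision someone made, which is what the comment means by "make up your mind about how you want it to be handled."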
> we simply do not have a finite amount of variables to statically analyze against.
> Are you talking about some notion of “consistency” that has nothing to do with logical consistency?
Yes, the section is called "On Requirements" and the paper is about building programs, not abstract or theoretical computer science/logic.
> The whole point to making a formal specification is reducing the potential for misunderstanding. In fact, the only way to misunderstand a formal specification is to be mathematically incompetent.
I suppose we're talking about the semantics of the word formal, as well as the level of granularity one can hope to achieve over a specific problem domain. While I have no doubt that all processes can and will eventually be reduced into an algebraic function, currently, do to a level of randomness and misunderstanding of our natural world, we simply do not have a finite amount of variables to statically analyze against.
Appalling. Self-indulgent without being funny; barely relevant (note the outdated Devo reference); and absolutely without enlightenment, even considering its publication date. I feel resentful that my time was wasted this way.
All the characterizations backfire in one way or another:
>Appalling
Compared to what? It's a bloody cheeky paper, not a crime against humanity. It's like those people who write "unacceptable" in product reviews over details they don't like, as if they were commenting on peace treaty terms.
>Self-indulgent without being funny
Given that what one considers funny varies widely, this tells us nothing.
>barely relevant (note the outdated Devo reference)
In one sense, Devo hadn't been "relevant" for decades even back in 2002. In another sense, why should one care whether a reference is "relevant" to the fads and pop culture du jour? We are perfectly capable of reading a 1930s text with "not relevant" 1920s cultural references and appreciating it for what it is.
>and absolutely without enlightenment, even considering its publication date
Because 15 years is enough to kill "enlightenment" on such a topic? We have 30- and 50-year-old CS texts that are perfectly fine (including tongue-in-cheek ones, like "Worse Is Better" or hacker folklore).
>I feel resentful that my time was wasted this way.
Nobody put a gun to anybody's head. But apparently somebody liked it enough to keep it going around.
I admittedly skimmed through, but it seems to be more systems theory, as applied to computer science, than a new form of computer science in and of itself.
This is a Poe, right? One of the primary tenets of postmodernism is that nothing can be objectively true or false. While I suspect there may exist certain less-than-serious programming languages which support this and other postmodern features, I don't think postmodernism has anything useful to add to the field of computer science.