UnderGround Forums
 

PhilosophyGround >> contradiction in business philosophy


9/6/05 3:27 PM
hekster

Edited: 06-Sep-05
Member Since: 01/01/2001
Posts: 4649
 
Why is it inefficient to plan for a low probability catastrophe but ok to assume that some technological advancement will save our asses from problems which will certainly get worse with time?
12/17/05 4:27 AM
AlabamaSmooth

Edited: 17-Dec-05
Member Since: 03/20/2002
Posts: 136
Catastrophes are random events with no rational direction or consistency. God does not exist, so there is no divine intelligence or absolute justification that wills their occurrence. Technology, on the other hand, is directed by conscious individuals who will it into being. Humans are constantly curing previously incurable diseases and finding new sources of energy, and these scientific advancements have come very rapidly in the recent past. This accelerated advancement will continue, as each innovation telescopes into further change and innovation.
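You could even model the difference loosely: catastrophes as a memoryless random arrival process, technology as directed, compounding growth. A quick Python sketch, with purely illustrative made-up numbers:

import random

random.seed(42)

YEARS = 100
CAT_RATE = 0.01     # assumed 1% chance of catastrophe per year (illustrative)
GROWTH = 0.03       # assumed 3% compounding capability growth (illustrative)

capability = 1.0
catastrophe_years = []
for year in range(YEARS):
    capability *= 1 + GROWTH           # directed, compounding advancement
    if random.random() < CAT_RATE:     # undirected, memoryless arrival
        catastrophe_years.append(year)

print(f"capability after {YEARS} years: {capability:.1f}x")
print(f"catastrophes struck in years: {catastrophe_years}")

The growth line compounds no matter what; the catastrophe line has no memory and no direction, which is the whole contrast.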
12/17/05 8:42 PM
FudoMyoo

Edited: 17-Dec-05
Member Since: 01/01/2001
Posts: 12758
Good question. I assume you're thinking of peak oil?
12/18/05 8:00 PM
Roly_Poly_Puppy

Edited: 18-Dec-05
Member Since: 01/01/2001
Posts: 911
I think it has to do with a combination of poor forecasting and planning, psychological biases, and a cultural obsession with technology.

When forecasting, planning, or even stuck in the middle of a catastrophe, a group is flooded with information from a complex system. That complex reality offers planners a huge number of possible outcomes; catastrophes offer so many that planners are forced to focus on a narrow sector of reality just to cope. Therein lies one problem with planning: which narrow bit of information do you focus on, and is it the right focus? As you experiment and cull unsuccessful strategies, the group is also constricted by time and resource constraints. There is also the issue of delegating the planning across a wide spectrum, thereby inducing more bureaucratic friction, and the problem of planning as a form of escaping reality, or "paralysis by analysis."

There are also internal cultural issues within business, such as a focus on the short term rather than the long term, and an unwillingness to spread bad news. Only profits and cutting costs matter in the short term; the long term is neglected. And if bad news pops up, you definitely don't want to be the messenger. I think this may stem from our own poorly wired, inattentive, socially driven brains. There is also the problem of the brain's inability to deal with surprise; I can go into more detail on this if you wish.

Finally, there is a cultural (you could consider it memetic, I guess) obsession with technology to solve humanity's problems. I've probably written too much, but I can point you to two books that explain this problem. The first is Neil Postman's "Technopoly." Postman states that technology has trumped culture and that technology, rather than being a branch of science, is in fact a branch of moral philosophy: a form of control and rationality that affects the decisions we make in our lives and that has evolved far beyond simple tools. The historian John Ralston Saul has also written about the technocrats. The second book is "The Dynamics of Military Revolution, 1300-2050" by MacGregor Knox and Williamson Murray. This book is the antithesis of the idea of technological solutions: it states that revolutions in warfare always come about from the context of politics and strategic situations, NOT from new gadgets. Studying the military is an interesting metaphor for the question you have brought up, hekster, because you will not find a better example of generals and politicians who think technology can trump an enemy when what is really needed is some form of social, strategic, or organisational change within their own groups.
12/18/05 11:12 PM
hekster

Edited: 18-Dec-05
Member Since: 01/01/2001
Posts: 4734
I never thought this thread would see the light of day again. Thanks for the input, rolypoly, that is some good shit. Strategic thinking is ignored all too often in political thought. Clausewitz is the only guy political theorists take seriously in academia, but I think some of the decision-making advancements from the contemporary military are quite relevant. I read an article a long time ago about the abductive logic that military and business people use, and why it works so efficiently, but I lost the link. It was the opposite of "analysis paralysis."
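From what I remember, abduction is basically inference to the best explanation: take the incomplete evidence you have, score the candidate explanations, commit to the best one, and act, rather than exhaustively analysing every possibility. A toy Python version, where the scenario and the scoring rule are entirely made up for illustration:

observations = {"sensor_down", "comms_noise"}

# Hypothetical candidate explanations and the evidence each would produce
hypotheses = {
    "jamming":        {"sensor_down", "comms_noise", "gps_drift"},
    "hardware_fault": {"sensor_down"},
    "weather":        {"comms_noise"},
}

def score(explains):
    """Fraction of observations the hypothesis accounts for,
    lightly penalised for things it predicts that we didn't see."""
    hits = len(observations & explains) / len(observations)
    misses = len(explains - observations)
    return hits - 0.1 * misses

best = max(hypotheses, key=lambda h: score(hypotheses[h]))
print("best explanation, act on it:", best)   # -> jamming

The point is you jump straight to a workable explanation and act, instead of waiting for complete evidence that never arrives.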
6/3/07 5:28 PM
Roly_Poly_Puppy

Edited: 03-Jun-07
Member Since: 01/01/2001
Posts: 1204
Hekster, I don't know if you're still interested in this question, but there's a recent book called "The Black Swan" by Nassim Taleb which covers it perfectly. Amazon link: http://www.amazon.com/Black-Swan-Impact-Highly-Improbable/dp/1400063515 Good review of the book: http://arlenegoldbard.com/2007/05/09/my-new-crush/
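To give a flavour of Taleb's point, here's a toy simulation (my own made-up numbers, not from the book): a process that pays small steady gains almost every period but takes a rare huge loss. A forecaster judging only from a quiet stretch of history would badly misjudge it.

import random

random.seed(1)

def step():
    # 0.5% chance per period of a huge loss, otherwise a small gain
    return -300.0 if random.random() < 0.005 else 1.0

trials = 10_000
history = [step() for _ in range(trials)]
blowups = sum(1 for x in history if x < 0)

print(f"periods: {trials}, blowups: {blowups}")
print(f"mean outcome per period: {sum(history) / trials:+.3f}")
# Anyone who sampled a stretch with no blowup would estimate the mean
# as +1.0 and call planning for the loss "inefficient".

The true expected value per period is negative, but almost any short history looks safely positive. That's the black swan.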
9/20/07 11:31 AM
Subadie

Edited: 20-Sep-07
Member Since: 10/09/2004
Posts: 859
A combination of cost-benefit analysis and game theory.
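To spell out the cost-benefit half: with a low annual probability and a typical corporate discount rate, even a cheap one-off mitigation can look "inefficient" on a short planning horizon. A rough Python sketch, all numbers assumed for illustration:

P = 0.01            # assumed annual probability of catastrophe
LOSS = 1_000_000    # assumed damage if it hits
MITIGATION = 50_000 # assumed one-time cost of planning/prevention
R = 0.08            # assumed corporate discount rate

def discounted_expected_loss(horizon):
    """Present value of the expected losses avoided over the horizon."""
    return sum(P * LOSS / (1 + R) ** t for t in range(1, horizon + 1))

for horizon in (3, 10, 30):
    avoided = discounted_expected_loss(horizon)
    verdict = "worth it" if avoided > MITIGATION else "'inefficient'"
    print(f"{horizon:>2}-yr horizon: avoided loss ~{avoided:>9,.0f} "
          f"vs cost {MITIGATION:,} -> {verdict}")

Same catastrophe, same price tag: the 3-year analysis rejects the spend and the 30-year analysis endorses it. The short-termism Roly_Poly described is baked right into the horizon.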
10/14/07 3:08 PM
thesleeper

Edited: 14-Oct-07
Member Since: 08/31/2007
Posts: 67
The statements aren't contradictory, but the assumptions implied are. "Why is it inefficient to plan for a low probability catastrophe but ok to assume that some technological advancement will save our asses from problems which will certainly get worse with time?" The reason people give for why they can't plan for a low-probability catastrophe is that they don't have any evidence that a major catastrophe will happen; there aren't enough resources to commit to every Chicken Little situation, or so the logic goes. Whereas if you assume that technology will save you from a problem which will certainly get worse over time, you don't have any evidence, or necessary connection, that a technological advance for that particular problem will arrive. So on the one hand you are ignoring a problem due to lack of evidence, and on the other hand you are assuming a problem will be solved despite a lack of evidence.
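You can make that asymmetry concrete with a toy expected-value comparison (numbers invented for illustration): refusing to plan and betting on a future fix is only the better move if you already assume the fix is very likely to arrive in time.

LOSS = 1_000_000      # assumed damage from the worsening problem, if unsolved
PLAN_COST = 100_000   # assumed cost of planning/mitigating now

def ev_bet_on_tech(p_fix):
    """Expected outcome of doing nothing and hoping a fix arrives."""
    return -(1 - p_fix) * LOSS

for p_fix in (0.5, 0.7, 0.9, 0.95):
    better = "betting" if ev_bet_on_tech(p_fix) > -PLAN_COST else "planning"
    print(f"P(fix arrives) = {p_fix:.2f}: {better} wins")

With these numbers, betting only beats planning once P(fix) exceeds 0.90. So refusing to plan is itself a strong evidential claim about future technology, which is exactly the hidden assumption.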
