Test Information Space

Journal of Tech, Testing and Trends

Archive for May, 2010

Who Goes There

Posted by cadsmith on May 29, 2010


When comparing models, measurement and ranking may involve complexity, of which there are various types, e.g. software, computational and process. Conceptually, a user may consider the number of models necessary to cover a phenomenon, the length of time required to explain the system, or whether the latter can be captured in an intuitive diagram. During design or build, factors might be dimensionality, manufacturing difficulty, amount of software, difficulty of repair, price, sensitivity to environmental conditions, or time to configure. Usage adds amount and time of calculation (or halting), difficulty of operation, and stability or failure rates. Scaling adds the number and types of components and connections. Automation may involve tradeoffs between factors. A composite score can be derived from multiple methods. See computational complexity for a discussion of metrics.
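To make the idea of a composite concrete, here is a toy sketch in Python; the factor names, scores and weights are invented for illustration and do not come from any particular metric.

```python
# Toy composite complexity score: a weighted average of normalized
# factor scores in [0, 1]. All factor names, scores and weights are
# hypothetical, chosen only to illustrate combining multiple methods.

def composite_complexity(factors, weights):
    """Weighted average of factor scores keyed by factor name."""
    total = sum(weights[name] for name in factors)
    return sum(score * weights[name] for name, score in factors.items()) / total

model_a = {"dimensionality": 0.4, "software_volume": 0.7, "config_time": 0.2}
model_b = {"dimensionality": 0.6, "software_volume": 0.3, "config_time": 0.5}
weights = {"dimensionality": 2.0, "software_volume": 1.0, "config_time": 1.0}

score_a = composite_complexity(model_a, weights)
score_b = composite_complexity(model_b, weights)
print(f"model A: {score_a:.3f}, model B: {score_b:.3f}")
```

Ranking then reduces to sorting models by score, with the caveat that the weights themselves encode design priorities.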

Recent Links (of about 27): visualization: latency heatmaps, semantic web: Saplo, crowdsource: Fluidinfo, local events, robotics: mind over machine, security: CERT fuzzing, marketplace: Facebook AppBistro, art.sy, disaster recovery: oil reporter, finance: bonds, technology: NanoProfessor, STS OCW, future cities, synthetic cell, space: duality of gravity, WISE star formation, tilted orbit, photography: Stock Photos, music: UJAM.

Book Reviews:

The Power of the Semantic Web to Transform Your Business, David Siegel, 2009

The author has addressed a book to an audience of business customers to acquaint them with who’s doing what in semantic web applications. The transition is expected to complete in a decade. Trends in digitization, availability, managing metadata, synchronization, syndication, and scalability lead to online data lockers which replace silos. Users specify a want which filters semantic queries. The data, products and services are then available anywhere, e.g. via cellphone screen login and browser. This leads to a conversion away from current advertising. Realtime pricing is based on metadata. Fair tax legislation is recommended to eliminate income tax and simplify sales tax structures within transactions. Autonomy is expected to automate processes and increase efficiency. Collaborative design spaces replace data repositories, e.g. building information management (BIM) for 3D models, or the science commons for protein designs.
The style is optimistic, visionary and direct, with short paragraphs and lots of bold emphasis. There are three parts for seventeen chapters. A book ontology concept map shows how the pieces fit together. It has plenty of URLs to demonstrate existing businesses. Evri has since acquired Twine. The book may be so wide-ranging that it could use a filter to find material that fits reader requirements.
Other issues would require additional sources, e.g. dominant vendors or monopolies compared to opensource, closed data (Newscorp blocking Google), closed ecosystem (Apple), sustainability, security, curated data (Wolframalpha), or privacy (Facebook).

The Philosophy of Science and Technology Studies, Steve Fuller, 2006

Science’s place in society is not as well understood by the public as other subjects. The author is a social constructivist; humanity is a goal. STS applies theories of the humanities and social sciences to science and technology. This book covers European and American history. The field is descended from positivism and uses field methods of research. Actor-network theory is the main one, e.g. Isabelle Stengers or Bruno Latour. This is a cross-disciplinary academic field sometimes likened to a scaffold for specialties. Technology studies is a subfield added in the mid-80s. The public is taught citizen science and ways to consider technical issues of consequence. Technoscience is an undifferentiated combination of science and technology. Scientific authorities are respected for extreme rationality and have their own interpretations of STS and its research. The Science Wars of the 90s pitted realists and critics of scientific theory against each other. Science has social organization, politics, and control over public relations. Quality is affected by who it is that researchers are accountable to. Institutional science followed a strategy of colonization. Thought collectives were recognized in the 1930s. Kuhn described paradigms in the 70s; the philosophy of rational or irrational became normal or revolutionary phases of science. There is a Heraclitean dualism between constructivism and relativism. Ideological framework presuppositions can be disintegrated and reconsidered in a newer context.
As for style, there are thirty-three chapters in six parts. Technology follows science models here; utilitarianism is not distinguished. The term artifact is used for an effect rather than in a design sense; design is used for theoretical policy and institutions, rather than engineering. The internet and web are sources and not subjects, though there is a recommendation for an online futures market for science-based proposals. The computation trend is not discussed. Where the frameworks are tabular in two dimensions, Eastern formulations might add spirals or helices, e.g. to show processes, scale of abstraction or detail, and relations, e.g. counterclockwise between actor-network theory in constructivism on the left and theory in relativism on the right. Exception cases for each category could be expanded upon. Scifi is not considered a factor, currently subsumed in techno-adventure. There is an extensive bibliography.

The Best Science Fiction and Fantasy of the Year, volume four, ed. Jonathan Strahan, 2010

Scifi anthologies are still published because editors have favorites. Editors still exist because publishers do marketing. In this book, an intro describes the status of the industry. There are twenty-nine stories. Each has its own intro. A section of recommended readings closes the volume. Kij Johnson has two entries. Some of the authors are in other anthologies, such as Dozois 2009, including Stephen Baxter, Michael Swanwick, Sarah Monette and Elizabeth Bear. In terms of perspective, the ratio of third person to first is about 2:1. Cover art is by John Berkey.
Specialized e-readers have changed things a bit, though they may be peaking since tablets are becoming popular. There are other references. Wikipedia has author info. Authors often have their own websites. There is not yet a last.fm equivalent for scifi stories, let alone soundtracks. Some sites have all-time rankings and there are lots of separate blog reviews. There might be a demand for a personalized recommendation system as an improvement over Amazon. Readers can stream actual science reporting and real summaries. Writers can complement metadata with imaginary data which may also find its way into art or other media, e.g. on youtube or spacecollective. DIY symphonic soundtracks may be around the corner, e.g. ujam. Studying or creating space art and matte painting are ways to develop sensitivity for these types of images, e.g. deviantart. It is hard to do something outlandish enough, yet true to the collective unconscious, that a real discovery won’t soon be reported in the news.

The year’s Best Science Fiction, 26th Annual, ed. Gardner Dozois, 2009

Scifi is being used more to get ideas across, scientific or futurist, where nonfiction may not reach the audience. Short stories are less of a stretch for twitter-types than novels. These readers are accustomed to extreme conditions or moods. Education and energy are an invention away. The editor of this book was originally from Massachusetts and the genre was a way to escape the local confines. This text includes a general summation of publishing for the previous year. There are thirty short stories, each having an intro. These are followed by honorable mentions. Paul McAuley and Ian MacDonald each have a couple of entries. Perspective is about even, first person having thirteen stories and third seventeen. Cover art is by Chad Beatty.
Scifi is the fantasy side of technology, where scenarios can be explored that sometimes turn into real movies or games. Of course, we haven’t yet seen a blade writer, but animations are very realistic, and originality may incrementally develop from remixing and adapting existing content, from the likes of the machinima sphere, simplified for human enjoyment. Faulty industrial designs that almost made it to production may be the equivalent of cliffhangers or detective yarns in worlds of ruthless environments, corporations and governments. Globalized sources may reflect the local engineering mythos.

Blogs of interest:

Haque on institutional innovation


Marcus du Sautoy on the mathematics of symmetry

Lee Rainie on internet usage changes

Documents of interest:

Measuring Test Execution Complexity (PDF)
Measuring model complexity with the prior predictive, Vanpaemel (PDF)

Seerability scifi by yours truly


Summer of Parts

Posted by cadsmith on May 24, 2010


Digital consciousness has made strides in creativity and biology. Technology is an enabler, though complexity can render further opportunities invisible. Q&A is becoming easier to find. Theoretical frameworks can aggregate sources as shown in books and documents. A synthetic philosopher has not yet been demonstrated, nor the answer to whether a pair would agree.

Recent links: test: Phoronix benchmark, .NET unit; computer: molecular brain, data format archive; robotics: free Microsoft RDS; email: Google gadgets, wave is public; search: Yandex; social media: Vinehub, cafe in a box; tv: Clicker; video: WebM project; technology: synthetic biology, quantum teleportation 16m, Kostner oil cleanup, China conference livestream; finance: Banksimple; Apple racing for cloud cap.

Blogs of interest:

NYT column What is a philosopher? Plenty of comments follow the post on this new blog “The Stone”. None had said, so far, that philosophy is technology. This seems extreme and falsifiable, but there may be a movement toward its proof, as has happened for science and medicine. If bodies are machines, and knowledgeable wisdom is embodied, then perhaps this too can be practiced. The subtypes might then become those of computation, e.g. cloud, code, robotic, quantum and so on. The technology problem becomes interesting, whether evaluation, determinism, pace, side-effects, human limitations, economics, oppression, etc. Predictions are the product of theories; better questions, along with meaning and morality, of philosophy. The staff of Moses may have new significance.

Book reviews:

Bursts: The Hidden Pattern Behind Everything We Do, Barabasi, 2010.

This breakthrough book delves into the reasons why human actions tend to arrive in sudden clusters. For humans, the answer usually involves priorities and can be described by power laws: long periods of inactivity are followed by brief intensity. Natural phenomena often have yet-to-be-discovered rules or states. The metaphysical mystery is that randomness is required for deterministic prediction in the form of probabilities. This lesson is blended into the story of the 16th-century crusade led by Gyorgy Dozsa Szekely (the last name means frontier guard), who rose from obscurity to final execution. Poisson and Popper are brought in to argue about the possibility of prognostication, while Einstein offers diffusion theory and other mathematicians document collected cases. Twenty-eight chapters punctuate the flurry of facts, illustrated by the historical and scientific illuminations of Botond Reszegh. The style of the narrative is densely detailed and seems to have been written in a nonlinear fashion, the moral of the tale saved for the calculated conclusion.
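The contrast the book draws between Poisson behavior and bursty power-law behavior can be sketched numerically; this simulation is my own illustration, not taken from the book, and the parameters are arbitrary.

```python
# Exponential (Poisson-process) waiting times cluster near their mean;
# Pareto (power-law) waiting times mix long silences with bursts.
import random

random.seed(42)
N = 10_000

poisson_waits = [random.expovariate(1.0) for _ in range(N)]   # mean 1
pareto_waits = [random.paretovariate(1.5) for _ in range(N)]  # heavy tail

def frac_above(waits, threshold):
    """Fraction of waiting times longer than the threshold."""
    return sum(w > threshold for w in waits) / len(waits)

# The heavy tail produces far more long gaps than the exponential.
print(f"P(wait > 5): exponential {frac_above(poisson_waits, 5):.4f}, "
      f"pareto {frac_above(pareto_waits, 5):.4f}")
```

With these parameters the exponential tail puts roughly e^-5, about 0.7 percent, of waits above 5, while the Pareto tail puts about 9 percent there, which is the signature of burstiness.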

Philosophy of Technology and Engineering Sciences, edited by Meijers, 2009.

This is a handbook for the student of technology. It summarizes knowledge and practice, lists questions, and sets a research agenda. The contents are hierarchically organized. Philosophy is the study of aims, methods and assumptions. While that of science was about representing reality, it is applied here to five disciplines: architecture, agriculture, medicine, biology, and information. The original literature search resulted in sixty-five topics which were then constrained by available experts and authors. This led to 41 chapters in 6 parts for the definitions and theories, epistemology and ontology, design, modeling and methodology, norms and values, and issues. Each chapter has introduction, discussion analogous to the overall parts, and conclusion. For example, computer science is described as modeling and designing artifacts. It looks at the nature of information. Computational philosophy superseded linguistics in the 90s and cognitive artifacts now combine hybrid human and computer components. This book itself might be a candidate for ontology extraction by the semantic web. Another example would be synthetic biology where Craig Venter’s views on digitization and the writing of the genetic code are discussed.

The Semantic Web for Knowledge and Data Management: Technologies and Practices, Ma and Wang, 2008.

This book describes semantic web applications in a series of twelve papers and indicates areas for further research. The authors have demonstrated ontology extraction from UML. Semantic mobile is large-scale, but not as well-linked as the rest of the semantic web. E-tourism uses machine-readable RDF repositories and reasoning as knowledge bases instead of databases. XML-based P2P systems utilize semantically-augmented query, semantic relationship matrices, and RDF schema graphs. A semantic web-aided rich mining system imports ontological data from domains. Semantic overlay networks coordinate peer data management systems through semantic routing indices. Semantic annotation automatically extracts entities and relations. Ontologies, topic maps and RDF can be stored in a relational database for easier modification. Fuzzy description logic expressiveness adds complexity. Other examples include probabilistic models, intelligent agents, and contextual concept discovery. Links are shown for sources on the web. The semantic web corpus would seem to be a worthwhile application to present and navigate the concepts, contributors, code and changes. This might be a precursor to tackling the topic of technology beyond linked data mashups.
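As a minimal sketch of the subject-predicate-object model underlying these RDF-based systems, a triple store with wildcard matching can be written in a few lines; the sample facts and names are invented, loosely in the e-tourism flavor of the papers reviewed.

```python
# Minimal in-memory triple store illustrating the RDF-style
# subject-predicate-object model; real repositories add schemas,
# indexing and reasoning on top of this. Sample facts are invented.

triples = [
    ("hotel_alpha", "locatedIn", "Vienna"),
    ("hotel_alpha", "hasRating", "4-star"),
    ("Vienna", "locatedIn", "Austria"),
]

def match(pattern, store):
    """Return triples matching an (s, p, o) pattern; None is a wildcard."""
    return [t for t in store
            if all(p is None or p == v for p, v in zip(pattern, t))]

# Query: everything located in Vienna.
print(match((None, "locatedIn", "Vienna"), triples))
```

A reasoning layer would then derive new triples from stored ones, e.g. concluding that hotel_alpha is transitively located in Austria.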

Creative Space: Models of Creative Processes for the Knowledge Civilization Age, Wierzbicki and Nakamori, 2005.

This is a delightful treatment of preverbal creativity. A knowledge civilization is expected to exist through 2100. Philosophy will be emergent. A new informed systems approach, beyond games of words and involving social interactivity so it is not completely automated, will help solve the pressing and future issues and paradoxes. In the midst of computerized creative environments, a topic outside the scope of the book, there is a concept of a space comprising about 3.5 billion transitions between nodes, each representing a unique concept. A Shinayakana approach, the synthesis of hard and soft systems and a mix of Oriental and Occidental perspectives, e.g. of wisdom and logic, is applied. The authors hail from Japan and Poland. As for style, emphasis is provided by italics for phrases or box outlines for paragraphs. Spirals are used to diagram creation processes, akin to circuits or cells, similar to the book The Grammar of Technology Development. There are three for organizational knowledge, three for normal academic knowledge and another for revolutionary scientific knowledge. They can be juxtaposed or merged, for example into a projected triple helix which functions as a sort of enlightenment press. All are shown together in a tree of types. There could be many more observations after an appropriate gestation period. The reader may wonder whether, if mind is action as the text extols, the authors have also developed a material that can replicate human intuition.


Grace Hopper by Beyer, 2010

Aubrey de Grey at TEDMED 2009

Documents of interest:

A General Perspective on Role of Theory in Qualitative Research, Tavalleai and Talib, 2010 (PDF)

Philosophy of Man and Technology, Verbeek 2009 (PDF)

Has the Philosophy of Technology Arrived?: A State-of-the-Art Review, Ihde, 2003 (PDF)

A Critical Examination of Heidegger’s Thoughts: Technology Places Humanity in Shackles Hindering Our Natural Thinking Process and Our Connection to Being, Renee A. Pistone

The Role of Theory in Research, Kawulich

Logical Structure, Theoretical Frameworks, Cline and Clark, 2000

Toward a Philosophy of Technology, Jonas 1979

The Question Concerning Technology, Heidegger 1954

A Visual Study Guide to Cognitive Biases

Techne: Research in Technology and Philosophy journal

Internet Encyclopedia of Philosophy


Praxis Upheaval

Posted by cadsmith on May 15, 2010


Information layers seem to spiral as data doubles every year or so. This may find its way to storage structures for accessibility and analysis. In the meantime, semantics do heavy lifting, while users choose centricity, e.g. actor, method, device, network, or space, and match measures to media. Any location may resolve to surface, signal, transform, translation, filter or fractal. Recent links:

Visualization, VTK, Protovis, prefuse; 3D, Autodesk Inventor, camera, Gemvision; Robotics, Anybots telepresence; Socnets, OpenID Connect, OneSocialWeb, Udemy online courses; Translation, Google audio; Security, KHOBE shreds Windows walls, Verizon cloud; Quantum, atomic optical computing, location-based cryptography; Energy, EIA Annual Energy Outlook has a national energy modeling system (NEMS) and scenarios to 2035, smart grid again; Space, Moon Zoo, multiverse; Psychology, Analyze Words twitter sentiment, internet improves well-being; Documents of note: The Fate of the Semantic Web, Pew Research, 2010 (pdf); On the Essence of Truth, Martin Heidegger, 1943.

Book reviews:

Re-Designing Learning Contexts, Luckin 2010. This book concentrates on technological literacy for personal and collaborative education. Luckily, it was available for Kindle. Scholars are defining what knowledge is, and it is up to the participant to try to find improvements, learn from failure and synthesize resources. The author, an expert from the UK who will be keynoting the ICICTE conference, highlights interaction and presents the equivalent of a slideshow narrative on the status and needs of the learner. This follows previous work on learning outside of school, holistically within a lifetime. About four hundred publications are organized into nine chapters relating to three parts: background, ecology and future. Context is internalized enough to become independent of elements such as place, culture and technology. For example, Paul Dourish is significant for the notion of embodied interaction. Historical approaches include information processing, behaviorism, constructivism, scaffolding in the form of hints or next steps, zone of collaboration, and cognition. Types of learners are supported by more able partners (MAPs). Many software solutions are discussed which emphasize combinations of discussion, tutorials, videos, multiple representations, visualization and simulation, distributed scaffolding, recognizing learner beliefs, collaboration and props such as toys. An ecology of resources model defines filter elements for knowledge and skills, tools and people, and environment. Case studies are distinguished by creation process (brainstorm, focus of attention, categorization, filters, resources, MAPs), relationships, and scaffolds. Interaction models structure conversation, scaffolding design, learner context, and locale and mobile frameworks. A practice example uses lesson plans, whiteboard, tablets and homework activities. Some of the web2.0 terms are used, such as tagging and crowdsourcing. Links are given for resources, e.g. rixcentre.org, which handles learning disabilities. An attempt to use http://www.autotutor.org appeared inconclusive: it did not seem to understand typed answers on well-known subjects, such as how the internet might be (re)designed, and hand-waving, e.g. drawing gravity wells to show how differently sized objects respond to each other, was not allowed. To be fair, the same word problem was completely incomprehensible to a cloud math engine, so there is a way to go before these approaches are general purpose or compatible with other large-scale efforts such as EarthGame.

The Nature of Technology: What It Is and How It Evolves, Arthur 2009. This book presents a theory of technology which directs human life, is beneficial and results in economic arrangements and activities. It is alive in the sense of a coral reef, yet separate from incremental biology which may yet become technology. It involves combinatorial evolution shaped by demand, modularity, recursive structure, and mechanistic bootstrapping. Structures are deepened by subcomponents for performance, monitoring, adaptation, and reliability. The theory is derived from scientific evolution and self-correcting paradigms. It also drives economics which is non-deterministic. The result is physical, yet digitization is the currency. The needs arise from growth of society, support of tech itself, and fixes. Innovation tends to be nation-centric based on deep craft of local cultures. Technology is a means to fulfill a human purpose, the idea of use or programming of a phenomenon for some purpose and resultant cultural practices, components, devices and engineering principles and architecture. Standard engineering aims to solve problems. Invention occurs by mental association. Concepts are realized in physical form. Clusters of common theory form domains which can be categorized hierarchically. Innovation may involve redomaining, e.g. in economics. Solutions become components for further developments.
The reader can make some criticisms. When it comes to theories, testing and falsification of premises and claims are usually required. This one neither shows the math nor discusses limits, e.g. due to scale or sustainability. It is predictive only in a causal sense, since scientific instruments are a form of technology that discovers new phenomena, which in turn result in new technologies. Human evolution includes all knowledge and activities, so any particular area, e.g. technology, may already be considered part of this. There are discussions elsewhere that global networks are more for communication than economics and may be prone to politicization. The ROI example of Columbus in the New World may actually be cautionary for indigenous natives. There are risks of controlled ownership by big corporations on one side or open-source and commons on the other. Discussion of robotics, or requirements for education or ethics, for example, would require additional sources.

The Grammar of Technology Development, edited by Tsubaki and others, 2008. The theme is methodologies for quantifying technology development. The idea is based on “The Grammar of Science” (1892), using statistics. A grammar here is a description of approaches from which the most effective can be selected. Three parts provide a total of fourteen papers covering systematic modes, design of experiments and statistical methods. Digital engineering makes use of computer-aided engineering, quality control, simulation, verification and validation. The technology development process can be modeled for interactions between virtual and real society. Systems science can also make use of intuition in micro-theories of knowledge creation, which are several orders of magnitude better at information processing than verbalization and include the collective unconscious. Spiral processes in a creative space use dimensions of objectivity, subjectivity, intersubjectivity, justification and reflection. Ecodesign is environmentally conscious and has compatible tools. Social networks show useful gaps in three types of communications: interactive, distributed, and soapbox. Simulation matches theoretical and actual conditions and tests the model; its own issues can be compensated for by calibration in the design of experiments. Measures for performance evaluation include hardware simulation, software simulation, and approximate analysis. Several case studies are shown for an example product, digital factory, web-legacy client-server system, and musical search by humming. Many of the articles introduce keywords for the statistical concepts used in that type of grammar and use figures and tables to illustrate the most significant data. This book was the result of a conference of several different authors, so it is left to the reader to correlate the ideas. The case seems promising and the combined use will likely lead to additional efforts that can be evaluated for improvements or automation.

Technically Speaking: Why All Americans Need to Know More About Technology, edited by Pearson and Young, 2002. Planning for the future involves significant emphasis on society, economics, and environment. These in turn depend upon development of technology to solve problems, lower costs, and use resources more efficiently. Somewhere along the line of familiarity with a single tool, science experience, and engineering, there is a general technology skill. As values and philosophy are revealed, there seem to be no clearcut definitions of exactly what an essential approach would be or how to recognize the better ones from many attempts. A question arises as to how to remedy this. Education is the usual answer and this then raises another issue of how to teach technology. This book centers around the notion of technology literacy, including capabilities, knowledge and ways of doing. The theme is nationalistic, though international attitudes are described. Some of the problems, e.g. lack of understanding of the electrical grid, have since had new solutions such as smart grid to compensate by automating some of the decision-making. This book is clearly written, formatted like a textbook with boxed summaries and reading lists. This is a subject which can be taken for granted only at peril since the ongoing changes in population, demographics, ideologies and climate seem to be increasing in complexity at faster rates over time. Free complete online preview at http://books.nap.edu/catalog.php?record_id=10250.



Posted by cadsmith on May 10, 2010


The byproducts of technology include solutions as well as further problems. A question is how it can better resolve these itself as the speed, scale and subtleties increase. In methodology, there are the exceptions that prove the rule. Testing pursues failure to push advances in design and implementation. Objectivity encourages an overall view that adjusts the structure to best fit values, which become explicit and are themselves selected for adaptation and progress. Computer-aided scifi design may be closer than it appears. Recent links: HTML5 activated on Scribd, deepwater horizon spill, Boston 3D water map, NASA Intelligent Robotics Group, mobile trends, Futurist predictions for 2010 and beyond, Recorded Future temporal analytics engine, and really fast quantum molecular computing. Documents of note are Philosophy of tech, and we recently added Meanability to the creative series. Book reviews:

Python Testing: Beginner’s Guide, Daniel Arbuckle, 2010. Scripts are often used to run tests, e.g. as series of shell commands, user interface protocols, builds, or regression. Python also runs apps and webpages. The author provides examples of how to test scripts themselves. As a type of test-driven development or automation, types include unit, integration, system or website. The method includes visual diagramming, command syntax and api documentation, test wrappers, data files, error injection, code coverage, mock objects, search for test files and version control. Ten chapters each explain the steps, add a “what just happened?” description of the demo results, have a pop quiz and chapter summary. Tools include doctest, unittest, Nose, twill, python mocker, bazaar, mercurial, git, darcs, subversion, and buildbot.
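The stdlib pieces the book builds on can be combined in a few lines; the function and tests below are a generic sketch of the style, not an example taken from the book.

```python
# A doctest embedded in a docstring plus a unittest case, both run
# programmatically; this mirrors the unit-test layer among the book's
# test types. The slugify function is an invented example.
import doctest
import unittest

def slugify(title):
    """Lowercase a title and join its words with hyphens.

    >>> slugify("Python Testing")
    'python-testing'
    """
    return "-".join(title.lower().split())

class SlugifyTest(unittest.TestCase):
    def test_collapses_whitespace(self):
        self.assertEqual(slugify("  Beginner's   Guide "), "beginner's-guide")

# Check the docstring example, then run the test case.
doctest.run_docstring_examples(slugify, {"slugify": slugify})
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SlugifyTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("unit tests passed:", result.wasSuccessful())
```

Tools like Nose and buildbot then discover and schedule such tests automatically rather than running them by hand.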

Socialnomics: How Social Media Transforms The Way We Live and Do Business, Erik Qualman, 2009. Social media supports the social economy. Conclusions may seem familiar, but they are backed up by documentation. This is more statistical than personally anecdotal. Media is the examined life. The social graph is a big referral program. Friend status and word of mouth are global in realtime. Advertising is constantly changing for better targeting; the micropayment model adds up. One can find good sources of free and fast media, e.g. bloggers. News finds its way to readers immediately via aggregated feeds. The author recommends using the best existing social media rather than starting one’s own. A maxim is that using social media increases productivity. It influences election outcomes. It increases efficiency, is mobile, saves time and eliminates redundancies. It has replaced email. Social search may replace the Google type. Collective intelligence will be predictive. Each of eight chapters summarizes key points. The author tends to be optimistic. Other sources would be required to cover issues of privacy, walled gardens, or new technical approaches such as activity stream protocols.

Handbook of Research on Technoethics, Luppicini and Adell, 2008. The name of this field was coined by Mario Bunge in 1977. It is an extension of the arguments of scientific ethics. The anthropological approach considers social values and improvement of humanity through technology. This includes issues such as privacy, race or identity. Factors may vary among cross-cultural networks, so the problem of universalization of interest is examined in the cases of Habermas, and of Rawls’ theory of justice as fairness. The law of technoethics has ethical rights and responsibilities commensurate with social impact and concerns how these are assigned. Applications include education, sport enhancement, biomedical and genetic engineering, nanoscale and military research, AI, healthcare, computers, information and communication, digital citizenship, news media, careers, politics, security, economics, and environment. Regional studies include end-of-life in China, and software piracy in Pakistan and Canada. This two-volume book of 1082 pages originally cost almost five hundred dollars. Four sections contain fifty-three chapters, each usually having an abstract, intro, coverage, future trends, conclusions, references, key terms and readings. There is a chapter on autonomous artificial moral agency, e.g. the work of Luciano Floridi, which may be of interest regarding questions of proxy. Figures and tables include a conceptual map of technoethics, and a belief systems model for complexity-based ethics and nonviolent resolution of ideological battles.

Between Reason and History: Habermas and the Idea of Progress, David S. Owen, 2002. Jürgen Habermas was born in 1929 and grew up in wartime Germany. He developed the theory of communicative rationality, where rationality is found in structures of interpersonal linguistic communication rather than the cosmos or the knowing subject. His concepts include reconstructive science, the public sphere and the idea that critical philosophy is effectively about communication rather than economics as proposed by Marx. The author enumerates the types of discourse as aesthetic, therapeutic and explicative. He observes that Hegel showed the equivalence of philosophy and social practices and institutions. Critical social theory reveals ideology so that consciousness becomes more rational and developmental logic can develop universal moral principles. Progress is made along technical and moral dimensions, similar to Weber. Horkheimer and Piaget are also discussed. This has yielded western technology, but rationalization is not emancipation. The author seeks how progressive social change occurs. Per the theory of social evolution, social networks may accelerate convergence of developmental logic.


Rainbow Positioning System

Posted by cadsmith on May 5, 2010


Following a request for comment, new ideas may provoke defensiveness in some, delighted evaluation in others, and possibly incomprehension, requiring eventual rediscovery, in yet others. Media are emergent though not yet autonomous. It still takes conscious people to make meaning of change, or to weigh relevance and consequence. Automated research may eventually yield familiar forms of presentation, but automated reality may seem comparatively upside-down if miniaturization increases information density. This would be a new context for adaptation. Perhaps philosophy can reinforce and extend the scale of technology and the scope of cultural dependencies, if it can survive the inquiry. Predicting business requires betting on future value.

Recent bookmarks: Futurict promotes a sustainability clearinghouse. Academia.edu has a topic-researcher directory. NASA has an astrobiology site. Japan plans a lunar robot. Acquia discusses its cloud web-service hosting architecture in a video. Amsterdam’s Usabilla supports website testing. Spirent offers cloud testing. Pogoplug adds USB cloud storage. Layar hosts an augmented-reality marketplace. Textie.me does iPad messaging. Dailyplaces produces location-based microblogging. Google Living Stories is now on WordPress.

Book reviews include:

Decoding Reality: The Universe as Quantum Information, Vlatko Vedral, 2010. The news is that a symmetrical pair of processes, the second law of thermodynamics and derived meaning, are enough to generate reality. This bootstraps the existence of information, which outweighs matter and energy, while the universe moves to maximize entropy and disorder, and we embody natural laws. That provides a source of ideas which the scientific method, or its analogs in other disciplines, turns into rules of nature. Quantum physics reveals meaning and the other side of the story of creation. The author synthesizes a coherent framework for quantum information science. Landauer’s principle that information is physical, where entropy is proportional to surface area, inspires a combination with Shannon’s information as inverse probability, Boltzmann’s constant, and qubits, to yield randomness at small scales and determinism at large. Twelve chapters explore perspectives from biology, thermodynamics, economics, computer science, sociology, philosophy and quantum physics. Each chapter mixes explanation, observation, anecdote and humor, and is followed by a summary of the key points. Applications include cryptography, teleportation, climate, diet, segregation and gambling. Literature sources include Popper, Smolin, and Singh. Compare publications by Seth Lloyd, George Johnson, Michael Nielsen and Isaac Chuang, Amir Aczel or Raymond Kurzweil. The remaining challenge is to integrate gravity with quantum physics.
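As a rough sketch, not taken from the book, the two quantities the review leans on, Shannon’s information measure and Landauer’s erasure bound, can be computed directly; the function names here are illustrative:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def landauer_bound(temperature_k):
    """Minimum energy in joules to erase one bit: k_B * T * ln 2."""
    k_b = 1.380649e-23  # Boltzmann constant, J/K (exact since the 2019 SI)
    return k_b * temperature_k * math.log(2)

# A fair coin carries exactly one bit of information.
h = shannon_entropy([0.5, 0.5])  # 1.0 bit
# Erasing that bit at room temperature (300 K) costs at least ~2.87e-21 J,
# illustrating Landauer's claim that information is physical.
e = landauer_bound(300.0)
```

The interplay of these two formulas, statistical information on one side and a thermodynamic cost on the other, is roughly the bridge the book builds between Shannon and Boltzmann.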

Whole Earth Discipline: An Ecopragmatist Manifesto, Stewart Brand, 2009. This book ambitiously attempts to capture trends in climate change, urbanization and biotech. There is a lot of data, and a call for much more. It is wordy and tries to wrap a scientific narrative around things that hold promise for solving these issues. The author is optimistic that new technology will provide alternatives and that human nature will shed romanticism for pragmatism. This is a unique datapoint along scales that, for other authors, would produce do-it-yourself instructions, computation engines, cognitive advances, nanomanufacturing, human genetic engineering, martial law, or extinction. It is instead a wholesome approach that prescribes sober effort to get hard data, especially about the oceans, so that a feasible bearing can be selected amid these currents and changing forecasts. Readers are treated to statistics about the “city planet” and about unlimited growth and economics, including cell phones, electricity, squatters and crime. Nuclear power has become a commodity at the same time that weapons foreshadow the high cost of failure. The upside of genetically modified crops and foods, microbes, metagenomics, and biofuels is examined. Big nations will figure out solutions and curb their toxic tendencies. The author’s roots in the Whole Earth Catalog revisit conservationism, Native Americans, and the oratory of Jerry Brown. This is an anthropocene age demanding new ethics and politics. There is a lot to criticize and be cautious about, which is also the point, since the folklore arguments have become outmoded. The resolve is biased, and it makes demands of near-future generations which may have local dissenters. The human eye discriminates shades best in green, so that label may not automatically be a consensus-builder. As for the presentation, the layout could stand some alterations, e.g. adding graphics, swapping the multi-page bullets for conciseness, and highlighting conclusions amid the alternating pluses and minuses. The author was an originator of notions of planetary consciousness who still knows the players, so this may be a brief breath of fresh air for those bogged down by confusing terminology and contradictory innovations.

The Windup Girl, Paolo Bacigalupi, 2009. The author presents a very detailed rendering of a near-term survivalist future where the biological clock has been conquered and greed drives society. The genre has been termed biopunk, with touches of steampunk. A corrupt, centralized Thai government controls the population. Corporate foodstuffs are genetically modified and licensed, and plague terrorists maintain scarcity to keep up demand and prices. Scientists create synthetic animals, and people as slaves. The characters represent the major functions of each sector, try their best to succeed, and are combined in various ways to heighten the drama. They see each other through blinders that support their own egotistic biases. The author builds up elaborate structures and then destroys them, naturally, in personal spats, and in battle. A belief in reincarnation rationalizes the sacrifices. The story is told in third-person omniscient perspective and, at emotionally intense moments, shifts to brief first-person thoughts in italics. Each of the fifty chapters is a kind of cliffhanger. The cover art captures many of the characteristic elements, including the big four-tusked megadonts and their mahouts, dirigibles, and green methane lamps. Some of the characters are from previous short stories. The titular New Person combines DNA refinement, extraordinary training, robotic obedience, the resignation of a prisoner, jittery movements for identification, and perfect skin with pores too tiny to shed enough heat. Others of her kind are used as soldiers in Vietnam. This is a cynical world where each potential improvement seems to engender constraints that neutralize or outweigh it. Blade Runner had replicants, but they were used off-world and did not survive long. The biology is more normal than Miéville’s. Brand’s nonfiction Whole Earth Discipline covers a symmetrical set of themes, biotech, urbanization and climate, though it is more optimistic. A setting in America might have had less mysticism; one in India, more divergence between best and worst conditions. The earth does not turn into Mars or Venus here, but neither does it seem able to return to any recognized conservationist stage.

The Annotated Turing: A Guided Tour Through Alan Turing’s Historic Paper On Computability and the Turing Machine, Charles Petzold, 2008. Discusses what can happen when a mathematician attempts to design hardware (actually a mechanical process of proof), yielding a universal virtual machine built on notions of finite state and storage, and founding the field of computer science. This is a tour through mathematical history and a demonstration of how thoughts can be clarified, though details of the paper’s existential origins in a time of national crisis may be hidden. The subject, one of the most-cited papers of the century (it impressed Claude Shannon at a meeting prior to his own publication on information theory), had ideas which non-mathematicians also sought to understand for potential uses, and which continue to inspire approaches to logical problem-solving. The book’s style is conversational, addressing the reader directly in the second person. The contents of the original paper are presented intact, with background, biography and blanks filled in enough to translate it into plain English for the casual reader. There is an extensive bibliography on the scientist and the topic.
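To make the machine concrete, here is a minimal illustrative sketch of a one-tape Turing machine simulator; the rule format and names are this post’s invention, not Turing’s original notation:

```python
def run_turing_machine(rules, tape, state="q0", pos=0, max_steps=1000):
    """Run a simple one-tape Turing machine.

    rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left), 0 (stay), or +1 (right), and the
    special state "halt" stops the machine. "_" is the blank symbol.
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        new_symbol, move, state = rules[(state, symbol)]
        cells[pos] = new_symbol
        pos += move
    # Return the visited portion of the tape in order.
    return "".join(cells[i] for i in sorted(cells))

# Example program: flip every bit of the input, halting at the first blank.
flip = {
    ("q0", "0"): ("1", +1, "q0"),
    ("q0", "1"): ("0", +1, "q0"),
    ("q0", "_"): ("_", 0, "halt"),
}

result = run_turing_machine(flip, "1011")  # "0100_" (flipped bits plus the blank)
```

The finite rule table is the “state” and the dictionary of cells is the unbounded “storage”; a universal machine is simply one whose rule table interprets another machine’s rule table encoded on the tape.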
