
Wednesday, September 24, 2008

Professor Clayton Christensen on MIT video

To be honest, I am not a fan of Clayton Christensen. I find he is standing on the shoulders of giants and not adding much more than a clear message. Let's call him the Oprah Winfrey of innovation. Nonetheless, this video provides some very interesting insights into the topic of innovation and where we find ourselves in the oil and gas industry at this time.

Professor Christensen has written many books, all revolving around the theme of disruptive innovation. He cites three ingredients that are needed to develop a disruptive innovation.

Technological Enabler

As I mentioned here a while ago, the real innovation of this software development project is the virtualization of each and every worker in the oil and gas industry. It is impractical and inefficient for individuals, who may work for a variety of JOCs and the companies represented there, to move from job to job during their day. Although most head office staff would only have to go a block or two, people in the field would have much farther to travel. Impractical by any measure. If people had to meet physically to discuss the business of each JOC, the industry would be reduced to a standstill while they reconciled their calendars. The virtualization of the working environment and the introduction of Asynchronous Process Management (APM), a cornerstone of the People, Ideas & Objects technical vision, deal with these two problems.
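To make the idea of Asynchronous Process Management a little more concrete, here is a minimal sketch in Java of how a JOC work item might be posted by one participant and picked up later by another, without anyone reconciling calendars. Every class and method name here is a hypothetical illustration, not something taken from the Draft Specification.

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    // Hypothetical sketch of asynchronous task hand-off between JOC participants.
    public class JocWorkQueue {

        // A unit of work raised against a Joint Operating Committee.
        static class WorkItem {
            final String jocId;
            final String description;
            WorkItem(String jocId, String description) {
                this.jocId = jocId;
                this.description = description;
            }
        }

        private final BlockingQueue<WorkItem> queue = new LinkedBlockingQueue<WorkItem>();

        // A field operator, a partner's accountant, or anyone else posts work
        // when it suits them; nobody waits on anybody's calendar.
        public void post(WorkItem item) {
            queue.offer(item);
        }

        // Another participant picks the work up later, possibly from a
        // different company and a different time zone.
        public WorkItem takeNext() throws InterruptedException {
            return queue.take();
        }

        public static void main(String[] args) throws InterruptedException {
            JocWorkQueue q = new JocWorkQueue();
            q.post(new WorkItem("JOC-1017", "Approve AFE supplement for workover"));
            WorkItem next = q.takeNext();
            System.out.println(next.jocId + ": " + next.description);
        }
    }

The point of the design is only that posting and taking work are decoupled in time; the real specification would of course persist these items and tie them to identity and access control.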

Business Model

It's not known if Christensen is talking about the People, Ideas & Objects business model or one of the companies in the oil and gas industry today. In terms of this project's business model, in comparison to the big SAP and Oracle applications, we provide a comprehensive vision of how we see the oil and gas industry operating. The Draft Specification captures this vision and provides an understanding of how an oil and gas producer's business could operate. Oracle and SAP have made their sales to the oil and gas business and therefore doubt that anyone else would buy an alternative; that, other than the fact that they are not welcomed by the producers, is why they don't show up on the radar screen.

People, Ideas & Objects is building this application on a development cost plus basis. This application is not sold as an application but as a service, the costs of which are allocated throughout the global oil and gas industry. What was a major cost and effort to host and install Oracle or SAP are tasks borne of the bygone (bureaucratic) era. In addition to the software as a service, the community that is involved in building the software application provides the services to the industry in running and managing the oil and gas producers' enterprises, or should I say JOCs.

The costs associated with these activities will be billed by the community members' service based organizations, along with an assessment of a fee for the use of the software. The software will be assessed at x dollars per barrel of oil equivalent of daily production per year. At $10 / boe / day, a firm producing 200,000 boe / day would therefore pay $2 million per year (200,000 x $10) for their use of this system. The rate itself is based on the development requirements as determined by the community.

The oil and gas industry's growth strategy and / or business model for the past 30 years has been to muddle along. That was great when the oil and gas was easy to find and produce. Muddling along, however, doesn't provide a business model that can compete when the sciences and engineering are the key differentiating competitive advantages. The focus needs to be on the science in order to achieve profitable operations; science and an ERP application like People, Ideas & Objects.

Someone tell me how a generation of oil and gas companies, reared on muddling along, are going to change their strategy to one based on the sciences in the time frame in which the market needs more oil and gas? They haven't in the five years that I have been trying, and I'll bet the ever decreasing size of the producers' production will eventually trigger their own Wall Street type of collapse. As Nelson and Winter noted recently, industries evolve; companies come and go.

Commercial System

See Business Model discussion about how this community works.

Christensen goes on to note that companies will either reject or co-opt a new technology. I can assure you the oil and gas industry has rejected this technology. I think I have also provided a strong case as to why their demise should be accelerated. This is the task I am setting for the shareholders of oil and gas companies: withdraw your support of the bureaucracy and fund these software developments to ensure your best interests are managed properly. Otherwise, face an uncertain future that may be as tumultuous as what Wall Street is feeling.

"Facilitated User Networks" is the third type of disruptive business model that Christensen notes. Stating they are lower in costs and more effective in terms of performance. I think this software development project meets that definition and criteria, and the oil and gas industry sure needs lower costs and higher performance. Design of the Draft Specification is clear, clean, crisp and concise. Anyone with 5 - 10 years of oil and gas experience can see the way in which the industry will operate and how their role fits within this community and the larger oil and gas environment.

The investors that have traditionally invested in oil and gas are of course the people who know and understand the business. There are others who have attained larger holdings and may have had some influence on the management in the past. These are the shareholders and investors that I think would have an interest in funding this project. If you know of any, and can send them the URL to this weblog, it would be appreciated. This project is now at a standstill until such time as funding is provided. I have established a modest budget to keep the "doors open", but up to this point I have received no indication of any funding. Please join me here.


Tuesday, September 16, 2008

JavaScript and Java Applets.

I have expressed my concerns about exposing this application's client side to any JavaScript. JavaScript is unable to carry the freight that an application such as this software development project will demand. JavaScript has traditionally been buggy, non-standard and too functional for its own good. I have, however, changed my mind about potentially using JavaScript on the client. The reason for this change is that I now know we may not get there from here, and where "there" is, is accurately captured in this video.

This video discusses the effect of using small amounts of JavaScript in a browser window, or in our case a Java Web Start application. The upgrade is primarily to do with Java and its Applets, not JavaScript. These changes in Java will be what sets off an entirely new revolution of client-side computing. With the application's architecture being Java and JavaFX, the users of this application will be provided with an elaborate interface that will establish new paradigms and methods of user interaction. I think this is just the beginning. Many different directions can be taken on the client side as a result of these technologies, and we will see the robust, stable and secure client side processing that befits the users of this application. It is therefore time to adjust our thinking regarding JavaScript.

Up until now we had incorporated the Google Web Toolkit (GWT) to render the necessary JavaScript code from Java. No actual JavaScript was to be hand written, and that provides an acceptable level of JavaScript-associated risk in the People, Ideas & Objects application. Does leaving the coding of this functionality to GWT still provide us with the types of technologies that are demonstrated in the video? I don't know, as the video was using some technologies that are not generally available today, but the presenter was talking throughout about a Java to JavaScript bridge. Possibly Sun has incorporated many of the same technologies as Google's GWT. Either way, these new ideas employ JavaScript in a very small role with very limited actual work (messaging).
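For readers unfamiliar with GWT, the idea is that developers write ordinary Java and the toolkit compiles it into the JavaScript the browser actually runs. A minimal sketch of what a GWT entry point looks like follows; the class name and label text are illustrative only, not code from this project.

    import com.google.gwt.core.client.EntryPoint;
    import com.google.gwt.user.client.ui.Label;
    import com.google.gwt.user.client.ui.RootPanel;

    // Everything here is written and maintained in Java; the GWT compiler
    // emits the JavaScript that eventually reaches the browser.
    public class PeopleIdeasObjectsEntry implements EntryPoint {

        public void onModuleLoad() {
            // A trivial widget: GWT translates this into DOM manipulation
            // without any hand-written JavaScript on our part.
            Label greeting = new Label("People, Ideas & Objects - client loaded");
            RootPanel.get().add(greeting);
        }
    }

This is the sense in which hand-written JavaScript stays out of the picture: the developer's working language remains Java end to end.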

I feel reasonably safe with these technologies such that I am not willing to give up on any of the upside of the associated benefits.


Thursday, August 07, 2008

Marshall Carter on MIT Video

A new video worth watching from MIT has Marshall Carter, Chairman of the Board of Directors, New York Stock Exchange Group, and Director, NYSE, conducting a case study of the changes he implemented.

Before I get into this video I want to communicate the process this software development project is taking.

  • May 2004: Publication noting that the Joint Operating Committee is the legal, financial, operational decision making and cultural framework of the oil and gas industry, and the means by which the industry would become innovative.
  • May 2004 to December 2007: Research into the validity and requirements of a system to support the innovative oil and gas producer.
  • December 2007 to July 2008: Publication of the Draft Specification.
  • July 2008: Determination that the current management, systems and leadership are failing society's demands for energy.
  • August 2008: Define and develop sources of revenue. Commence development (defined below).

Marshall Carter, in setting out his case, asks the following questions. I have answered these same questions from this software development project's point of view.

1) How do we know when to change?

There has to be a wide consensus that now is the time to change the current management and systems within the oil and gas industry.

2) How do we know when to launch our new strategic direction?

When the problem is evident to every energy consumer and every energy investor that the current course is a dead end.

3) How did we do it?

In a few years we may be able to answer this. I would suppose that the timetable above adds some clarity as to what has been done and where we are going.

4) How did we convince employees?

Most of the users and developers are sourced from the energy companies themselves. This is necessary as they are the ones that know and operate the business. They are also aware that the current situation and direction at the oil and gas companies are futile, and that those companies may not survive the disruptive changes the industry will be going through.

5) How much effort would be needed to ensure the changes stick?

I believe that the Draft Specification answers many of the questions of what fits where. It also answers many of the problems that are systemic in the industry today. This system is the most logical means for a producer to operate. Therefore the natural tendency of users is to default towards the Draft Specification.

Marshall Carter then states that it was necessary to "build a vision from the bottom up". If anything, I think the hostility that management has shown to this project, and the hostility that I have been able to return, prove this is not a "top down solution".

Carter also states, "show those that resist change, that change is irresistible." I think the forward progress of this software development project will soon prove to management that their way is dying. The following eight items are what Carter suggests are necessary for leading successful change. My response to each point is provided.

Leading Successful Change

  • Sense of Urgency

There is no greater sense of urgency than the one that the energy consumer currently faces.

  • Guiding Coalition

The intent within this project is to use collaborative tools and methods to make this project a result of the users and developers who work within oil and gas. What has not been expressed before is an appeal that I think resonates with the users. Users have ideas on how to make things better, but they don't have access to change the Information Technologies that they are required to use. This software development project enables them with a software development capability, a source of revenue and the chance to effect change within their area of work.

  • Vision & Strategy

A vision and strategy that is grounded in the research and academic thinking. A strategy and vision that captures the possibilities of the Information Technologies available to users today.

  • Communicating the Change Vision

Blogs and Knols are powerful tools for reaching out to like-minded groups.

  • Empowering Broad based action.

This is more of a personal decision for the users and developers to make. No company or manager needs to approve their participation here. People with Ideas who need software Objects to help them do their jobs.

  • Short Term Wins.

We can move to provide the short list of development targets (listed below) within a reasonable period of time.

  • Consolidate Gains and Provide More Change.

The development targets should enable the community to move further and faster than they ever believed they could.

  • Anchoring new approaches in the culture.

Using the Joint Operating Committee is enabling the use of the culture of the industry. If this is a requirement of successful change, what does that say about this software development project?

Marshall Carter, towards the end, notes that what gets measured gets done. So I want to set out these short term targets for the community.

  • Establish a user based definition of security and access control requirements.
  • Establish "User Archetypes" that implement the Military Command & Control Metaphor (a sketch of what an archetype-based access check might look like follows this list).
  • Develop and test the Security & Access Control module using Sun's Federated Identity and Project Hydrazine.
  • Go live with the users of this system as soon as possible. Iteratively improve the product's user interface, performance and security to meet and exceed user based standards.
  • Resell the security offering under license to other industries.
  • User based Wiki development towards final specifications.
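As an illustration only, and not a commitment to any design, the "User Archetypes" target might begin as nothing more than a set of ranked roles that the Security & Access Control module checks each request against. All of the names below are hypothetical.

    import java.util.EnumSet;
    import java.util.Set;

    // Hypothetical sketch: user archetypes expressed as ranked roles,
    // echoing the Military Command & Control metaphor.
    public class AccessControlSketch {

        // Archetypes in rough order of authority within a JOC.
        enum Archetype { FIELD_OPERATOR, PRODUCTION_ACCOUNTANT, ENGINEER, JOC_CHAIR }

        // Actions a user might attempt within a Joint Operating Committee.
        enum Action { VIEW_REPORTS, POST_FIELD_DATA, APPROVE_AFE }

        // A very small policy table: which archetypes may perform which actions.
        static Set<Archetype> allowed(Action action) {
            switch (action) {
                case APPROVE_AFE:     return EnumSet.of(Archetype.JOC_CHAIR);
                case POST_FIELD_DATA: return EnumSet.of(Archetype.FIELD_OPERATOR, Archetype.ENGINEER);
                default:              return EnumSet.allOf(Archetype.class);
            }
        }

        static boolean permitted(Archetype user, Action action) {
            return allowed(action).contains(user);
        }

        public static void main(String[] args) {
            System.out.println(permitted(Archetype.FIELD_OPERATOR, Action.APPROVE_AFE)); // false
            System.out.println(permitted(Archetype.JOC_CHAIR, Action.APPROVE_AFE));      // true
        }
    }

The real module would of course sit behind federated identity rather than a hard-coded table; the sketch only shows the shape of an archetype-based check.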

Lastly, Carter notes "Engineering systems at this point is a thinking [and building] process which allows you to identify and solve problems". So let's get to work. Find people to donate and participate in this project, and join me here.


Thursday, July 10, 2008

Google Eclipse Day.

Users who are interested in working with developers may be interested in viewing the state of software development tools available today. Google recently hosted an "Eclipse Day" which showed some of the more interesting developments of that tool.

Eclipse is a free software download. The product was originally donated by IBM. I use NetBeans, which is a competitor to Eclipse and is provided by Sun Microsystems. These tools are competing aggressively and provide an unbelievable level of software development capability, for free.

One area that will be of interest to Users is the collaborative nature of development today. We have all heard of software that is developed by people who have never physically met one another. That will be the case for the users and developers of the People, Ideas & Objects application modules. Both will be able to communicate through the tools; the users will not necessarily have to read or write any code, but they can share many different aspects of their work through the tools themselves.

I prepared a YouTube playlist of the five videos here, each with about 50 minutes of viewing time. Enjoy.




Thursday, October 18, 2007

MIT Energy Council on MIT Video

I am extremely disappointed with the direction of MIT's Energy Council. MIT President Susan Hockfield made a video update; you can view the video here. I originally wrote about what I thought of their focus and direction here. It now appears they have lost that focus and hence are lost on the real issues. Talking more about the concern for CO2 and alternative energies are blind, dark bunny trails for those that don't understand the real point. Coal, oil and gas make up the majority of the sources of energy and will continue to do so. The ability to meet market demand for energy is not sustainable, and world class leadership from the likes of MIT would have made the journey a little easier. It is now clear, in this almost incoherent one and a half hour presentation, that nothing of material value is being done on energy issues at MIT.


Wednesday, September 26, 2007

Jeffrey Immelt on MIT Video

Jeffrey Immelt is the current Chairman of General Electric. His video presentation is from the MIT Energy Conference. This is a very good video, and I would encourage anyone to watch it for the unique perspective and insight that Immelt has. Click on the title for the URL to the video.

Speaking about the difficulties in the energy field, Immelt noted that "market signals don't fit the time horizons", and I have to agree that is certainly the case. The three-month time frame that the capital markets operate within is not enough time for the energy industry to do anything, and the decades-long lead times for satisfying the demand for energy are the other side of this disconnect. With such long lead times necessary to achieve anything in oil and gas, the markets always seem at odds.

He also spoke of the "notion that energy is free". This notion that he speaks of is, I think, the same concept that makes people expect they have a right to energy. I hope that we can continue to experience these rights and entitlements; however, I think that our future holds occasional energy outages and increased costs.

Immelt noted from his personal experience in traveling to India that demand for energy from China and India would not stop growing. In satisfying the needs for energy he states, "This is the time that technology and innovation can have a value". He felt that coal, natural gas and oil were going to be as important as they ever have been. He noted his turbines were operating at 65% efficiency, and indicated that reducing consumption was an area where much innovation and savings would occur.

He finished his presentation with two of what he calls "Immelts".

If you want to do something, you have to do something.
and
You want it bad, you get it bad.


Saturday, June 30, 2007

Stanford University Statistical Course

Google video is hosting a statistics course:

"Statistical Aspects of Data Mining (Stats 202) Day #"
This course is being taught by Professor David Mease and is being broadcast on Google video. Click on the title of this blog entry for the URL of the first lecture.

This course provides a valuable means by which to automate the processing of data. With most software developments, the amount of data that is produced is high. The ability to review it all as text is impossible without the advanced tools available to the developer / advanced user. Logs and stack traces alone will inundate even the most enthusiastic. This statistics course provides hands-on use of the statistical applications Microsoft Excel and particularly "R". Those that are able to install and use "R" will have distinct advantages over those that are unable to review these large and growing files.
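As a trivial illustration of the kind of automation the course points toward, even a few lines of Java can reduce a large log file to a small frequency table before a statistical tool such as "R" is brought to bear. The file name and severity keywords are invented for the example.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.Map;
    import java.util.TreeMap;

    // Counts how often each severity level appears in a (hypothetical) log file,
    // turning an unreadable volume of text into a small table worth analyzing.
    public class LogSeverityTally {
        public static void main(String[] args) throws IOException {
            Map<String, Integer> counts = new TreeMap<String, Integer>();
            BufferedReader in = new BufferedReader(new FileReader("application.log"));
            String line;
            while ((line = in.readLine()) != null) {
                for (String level : new String[] {"INFO", "WARN", "ERROR", "FATAL"}) {
                    if (line.contains(level)) {
                        Integer c = counts.get(level);
                        counts.put(level, c == null ? 1 : c + 1);
                        break;
                    }
                }
            }
            in.close();
            System.out.println(counts); // e.g. {ERROR=42, INFO=9120, WARN=311}
        }
    }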

The professor is young and well organized, using real world data such as Safeway Club card records, with hands-on use of that data within the two applications themselves. He has an entertaining, fast paced style.

The time spent in reviewing these course materials and videos will pay large dividends as we move more and more into the IPv6 enabled world that is a fundamental part of the "Genesys Technical Vision". Finding errors and problems may only be possible through the types of data manipulation discussed in this course. I can assure you that in 5 years these types of skills will be as generic as using a spreadsheet and word processor are today.


Sunday, June 10, 2007

The Software Concurrency Revolution

Another Google video in advanced programming languages. Click on the title to view the video.

Advanced Topic in Programming Languages Series: Effective Static Race Detection
Stanford Professor Alex Aiken

I am certain that I share a concern with many of my potential customers: the concern of using a "web service" that services the commercial business needs of an entire industry, where not all the bugs are worked out and as a result things begin to be recorded incorrectly. One of the unique ways in which these bugs could be exposed is by pushing the science further than it has been pushed before. And, taking into consideration how Moore's law is now being achieved through threading and multi-core processing, "concurrency" becomes an issue.

A Race Condition is defined as:

"The same location may be accessed by different threads simultaneously without holding a common lock. (And at least one access is a write.)"
A race condition is essentially the possibility of one thread's write being overwritten by another, unrelated thread's write, corrupting the data. This has been handled well in the various relational databases with a number of locking scenarios that ensure that inserts, updates and deletes are done correctly. But what about the memory locations in the program itself? With multi-core processors and threading enabling multiple simultaneous reads and writes to those locations, how does the program ensure that the ordering of those operations maintains the programming logic? One particularly nasty point about these bugs is that they can be random; no one is aware of their existence, and sometimes they are unique and therefore not re-creatable as other bugs are. How common are they? Professor Aiken said he found 392 race conditions in the open source Derby database, an open source product originally developed by IBM.
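Here is a minimal Java sketch of the kind of race Professor Aiken describes: two threads incrementing a shared counter without a common lock will intermittently lose updates, while the same work done through an atomic variable never does. This is a generic textbook illustration, not code drawn from the Derby findings.

    import java.util.concurrent.atomic.AtomicLong;

    // Demonstrates a data race on a plain long, and the atomic repair.
    public class RaceConditionDemo {

        static long unsafeCounter = 0;                     // no lock protects this
        static final AtomicLong safeCounter = new AtomicLong();

        public static void main(String[] args) throws InterruptedException {
            Runnable work = new Runnable() {
                public void run() {
                    for (int i = 0; i < 1000000; i++) {
                        unsafeCounter++;                   // read-modify-write, not atomic
                        safeCounter.incrementAndGet();     // atomic, no lost updates
                    }
                }
            };
            Thread a = new Thread(work), b = new Thread(work);
            a.start(); b.start();
            a.join();  b.join();

            // The unsafe total usually falls short of 2,000,000; the atomic total never does.
            System.out.println("unsafe: " + unsafeCounter + "  safe: " + safeCounter.get());
        }
    }

The unsettling part, as the video makes clear, is not the fix (a lock or an atomic variable) but noticing that the race exists at all.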

As we progress down this road and start development, these "race" conditions are of grave concern. These bugs are state of the art in terms of the types of problems that the academic community and the developer of the Java Programming Language (Sun Microsystems) are working on. This video discusses this problem and the difficulty in identifying and resolving "race" conditions. Professor Aiken does an excellent job in making it known that identifying race conditions is the tough part; their resolution is very easy from a programming point of view.

As the tools, policies and techniques continue to develop we will be spending more time on this critical issue. The need for data integrity is first and foremost in the usability of this system. I trust the majority of these issues will be resolved and rectified when we go into commercial use of this needed software.


Wednesday, April 04, 2007

Change artists; Stories from the Real World: CEOs, CIOs and Change

HP, in cooperation with CNN and CIO Magazine, has produced a series of videos focusing on change, and particularly technological change in organizations. Click on the title of this entry; registration is necessary to review the videos, and I highly recommend it.

A particularly interesting video is the Chevron CTO Don Paul talking about his business. If only we had such progressive forward thinking leaders here in Calgary.


Wednesday, March 21, 2007

Hydrogen isn't it.

That is of course just one man's opinion. Thankfully the scientific community has taken up President Bush's challenge on alternative energy sources. This video is also a continuation of the MIT Energy Research Council series announced earlier and is part of the excellent series on the challenges of energy.

The problems are framed from the energy security point of view. This video provides an understanding of the challenges that are faced in using hydrogen as an alternative energy source. Professor Dresselhaus stands out as one of the recognized leaders in physics and electronic engineering and has a great delivery. As with many of the video presenters, she identifies the problems that we face in terms of the global energy challenge from her own particular point of view. Growing populations that are increasing their standard of living, China and India to name a few, are where the demand is increasing. Professor Dresselhaus mentions that this demand is not linear, noting also that U.S. consumption does not necessarily change with respect to GDP increases.

As we are aware, the higher quality fossil fuels are mostly located in the difficult areas of the globe. The use of coal by the U.S. and China is probably going to continue, and may increase as demand for energy increases and the supply becomes more constrained. It is also projected that fossil fuels will supply 30% of the energy in the future, down from 80%; however, I find that to be a surprising reduction. Some interesting points in the discussion include that hydrogen provides twice the "power" of gas.

Professor Dresselhaus's talk is mostly on working with hydrogen, particularly the storage challenge it presents, and how nano structures provide a fresh look at these hydrogen issues. What is required in order to use hydrogen is a variety of catalysts to produce it, to store it and then to use it. Nano structures change the properties of catalysts, which provides three benefits during the catalyst phase: they increase the volume of hydrogen storage, they reduce the storage temperature requirements, and "borazene" gets trapped. Professor Dresselhaus also notes that gold is a catalyst at the nano scale even though gold is not normally a catalyst for hydrogen. She also draws a parallel to the benefits that Moore's law has provided in terms of computer processing capability; this, she notes, is the scope of benefit that is necessary to solve the world's demand for energy in the future.

Hydrogen is a transmission agent for energy, and interest in it was inspired by fuel cells. The use of hydrogen today is specific to its application, and the U.S. produces 9 million tonnes per year. The source of this hydrogen is hydrocarbons, which limits its upward growth opportunities. What is needed is to extract the hydrogen from water, which requires a catalyst and hence is very costly and difficult to do, particularly at what would be expected as commercial volumes.

Many of the areas where progress is beginning to be made seem to be on the energy demand side of the equation. Lighting can be provided for half of the energy demand by using LEDs and photonics. The demand side is the area where most of the scientific advances are of benefit at this point in time. Professor Dresselhaus stresses again that the most critical issue here is the storage of hydrogen, noting that hydrogen needs five times the storage size of gasoline; hydrogen molecules are separated quite far apart. What is needed is a nano particle as a binding agent to reduce the storage requirements. One binding agent, ammonia, was shown to be benign and inert as a storage medium.

One can clearly see the issues that storage of hydrogen poses. The cost of these materials, the energy they consume themselves and their safety have led the science community to focus on these first and foremost. One opportunity Professor Dresselhaus notes is the recent discovery in sunlight conversion multiples: soon the output of solar cells could increase from the current 12 - 13% efficiency, and with nano technologies this could be brought up to 30% efficiency. Professor Dresselhaus noted that an ideal application of solar may be in the production and storage of hydrogen, which would reduce the hydrocarbon footprint of production. And a surprising comment: all of these scientific findings have been discovered since the President's 2003 announcement of a new energy initiative.


Thursday, March 01, 2007

Ray Kurzweil, always a worthwhile speaker.

Ray Kurzweil is back on MIT Video. I wrote about Mr. Kurzweil's last video performance at MIT. A very entertaining video, and I highly recommend everyone go and view it.

Mr. Kurzweil is the author of the best selling book "The Singularity is Near" as well as many others. Mr. Kurzweil debates with Professor David Gelernter of Yale the when, if, what and how of computer processing, and whether it will attain the level of human intelligence. A debate that provides new information regarding the capabilities and the definition of artificial intelligence.

Mr. Kurzweil suggests that the benchmark processing power of machines will emulate the human mind around 2029. He is careful not to suggest that this means a machine takes on a level of consciousness, only that it has attained the same level of performance as the human mind. If I understood him correctly, machines are providing an enhancement to human intelligence today, and that is what he means when he talks about artificial intelligence: an augmentation of capabilities for the human mind, with machines produced in 2029 reaching human-like levels of performance.

Kurzweil's position is a reasonable point of view about when and how machines will achieve human-like intelligence. Professor Gelernter wants the Turing test to be the ultimate test of human-like performance and seems to insist on machines attaining levels of human consciousness, something that he insists, rightly, will probably never happen.

The reference to 2029 by Kurzweil depends on the exponential growth in information, knowledge and processing power. He noted that knowledge is now doubling each year. Accelerating from the point where we are at now, what will be required by 2029 seems impossible; however, the acceleration is exponential, whereas people think of the future only from the point of view of their historical experience, or as Kurzweil puts it, linearly. (Doubling each year for the twenty-two years from 2007 to 2029 is a factor of 2^22, roughly four million.)

To me the debate is of somewhat limited value; Professor Gelernter appears not to be debating something he believes, and hence his arguments fall somewhat flat.

The second video of this MIT series is very interesting, particularly from the historical point of view. Professor Jack Copeland of the University of Canterbury, New Zealand discusses the Turing Test and how Alan Turing broke the German Enigma machine in World War II. He continues on, documenting interesting points of Turing's life and the impact that Turing has had on the computer industry. A very worthwhile set of videos that provide very interesting views of the past and future of the computing industry.


Tuesday, February 13, 2007

Growing pains - transitioning to a sustainable energy economy.

This excellent video is part of the MIT Museum Soap Box series sponsored by the MIT Energy Research Council. I wrote about the first installment of this presentation here, and this video goes off in two completely different directions. These new directions provide prescient discussion on key issues of the day. At one hour and thirty-five minutes it is a worthwhile review. This presentation is primarily with Professor John B. Heywood, who is the Director of the Sloan Automotive Laboratory and Co-Director of the Lab for 21st Century Energy, and Professor Stephen Ansolabehere. John Durant, Director of the MIT Museum, is the moderator of this presentation.

The introduction provides the standard fare comment that greenhouse gas emissions are the major issue of today. My opinion regarding greenhouse gases is based more on my inability to grasp how humans could be responsible for the alleged damage. Raised during the time when the risk was of the ice age returning, I place as much weight on the frantic calls to reduce greenhouse gases at any cost as I do on the ice age returning. If you assigned a square meter to each and every human on the face of the earth and aggregated them, they would fill a square roughly 50 miles on a side (about 6.5 billion square meters, or some 2,500 square miles). Greenhouse gases from this concentration of people is a bit of a stretch for me. However, this video has changed my opinion on the whole global warming issue.

Professor Heywood starts with the desire to change the title of the topic to "Making our energy use less unsustainable." He notes that the discussion in the previous MIT energy related video was about how much energy is produced today, and how the alternatives to coal, oil and gas pale in comparison to our current demands. Our energy use is unsustainable for two reasons: the scale of it, and the inefficient way we use energy. The problem is that the scale and growth in our demand show a further unsustainability of our energy use. Dr. Heywood notes three areas that may provide value in approaching these problems.

  • "Conserve needs to be a good word"
  • "Improve mainstream technology to reduce demand."
  • "Finding new ways to produce and consume energy."

All these points seem to be reasonable approaches to the problem. Professor Heywood then notes that new technology will not "save us", commenting that technology will have a role, but that it is a false wish and a hope to expect that technology will provide a magic bullet. Growth is making the energy problem more difficult each day: growth in demand, growth from economic activity, growth from population and industrialization.

In my mind I have to ask why the Segway has not caught on. The ability to travel 20 km at up to 20 km / hour for a little under $1.00 in electricity is an obvious solution to the problem. When given a hammer, a child will hammer at everything in sight. Why does everything have to be solved through the auto industry? Is the car necessary for all that we do, or could there be alternative means to get around? I sometimes think that the world should have invented the Segway before it developed the car. Nonetheless, the device is fast and efficient and is cheaper than transit; it must be one of those acceptance issues.

Professor Stephen Ansolabehere begins his commentary by noting that the existing known global coal reserves provide energy for 300 to 3,000 years. Coal can also be the worst in terms of CO2 emissions. The issue is that the abundance of coal carries a cost of pollution that is not taxed. What Dr. Ansolabehere means by this is that the cost to produce one unit of energy values coal at $1 per unit, nuclear at $2 per unit, and solar at $5 per unit. A carbon tax would raise coal's costs to be uncompetitive with solar (a tax of more than $4 per unit of coal-fired energy would price it above solar's $5), so that investment in solar can be made to reduce the reliance on coal. This makes sense to me. It does not make sense to attach a carbon tax to the oil and gas industry. These products are less damaging than coal, and the reserve life does not last nearly as long (50 years by most estimates). Oil and gas would also benefit in its development from a carbon tax on coal, in the same manner as solar would.

Professor Ansolabehere then notes the scale at which the public is willing to pay a carbon-like tax, noting that the average home heating bill in the U.S. is $100 per month. He states that his tracking of the U.S. willingness to pay to solve global warming was assessed at $14 / month a number of years ago, and currently this has risen to $21 / month as global warming has become the number one concern. There is a very clear disconnect with people on how serious the global warming issue is. Yes, my grammar is correct: for a carbon tax to effect a change to the alternatives would require the cost of the average home heating bill to skyrocket by several hundred dollars. What the global warming issue needs is more people that don't want it in their back yards.

I am also concerned that this may lead to a carbon tax being assessed on the oil and gas producers. This is a critical time for oil and gas as we bridge the easy and cheap production of the 20th century with the costly and difficult 21st century. An assessment on the industry will only slow down the research, exploration and development; not a choice that anyone wants to truly consider. I believe any assessment cannot be at the producer level. The competitive advantages of a country are dependent on the low costs of energy. The tax should be at the consumer level, which indirectly reduces energy demand.

So how has this video changed my opinion? I would now support a carbon tax on coal users to the point where research and development, and use of alternatives, could be done profitably. If people are willing to pay extra to heat their homes, and coal is the devil in these details, then that is where the solution must lie. To tax the oil and gas industry, as the Canadian government is now suspected to be doing as early as March, will have no effect on the reduction of greenhouse gas emissions, but will have a remarkable effect in making our energy problems worse.


Monday, February 12, 2007

Greg Papadopoulos of Sun Microsystems.

In what is dubbed the "Sun Analysts Summit 2007," Sun Microsystems Chief Technology Officer and Executive Vice President of Research and Development, Dr. Greg Papadopoulos, makes his presentation entitled "Redshift: The Explosion of Massive-Scale Systems." This presentation should be viewed by most users of computers today. An important video that details where the demand for computer processing is coming from, and where the solution to satisfying that demand resides. At 46 minutes it is a worthwhile review. So much of what I expect in this oil and gas software development project needs to be addressed from the hardware side. The demand for processing of an entire segment of the oil and gas industry is not something that can be taken lightly. Recall that we have selected Sun as our key vendor for their support of the Java platform. This extends to Sun's Niagara chip set, their Solaris operating system, their grid computing offering, Crossbow, their network virtualization project, and finally the Java programming language.

Starting off with "Project Blackbox", which is a standard shipping container that provides substantial computing performance in one "black box": two rows of standard 19-inch racks, with each rack capable of housing 42 units of servers, blades and / or storage devices. The cooling of 200 kW of processing is the defining capacity of a Project Blackbox. One should ask what or who would need to use such a large unit? That is the purpose of this talk. Many of these systems will be used by the market, and most importantly this software development project will use Blackboxes in order to host the application for the oil and gas industry. The system we will be using will be owned and operated by Sun Microsystems, and hence provides not only the solid reliability, performance and availability of computing power, but also the security that lets each producer know their data is as secure and as confidential as possible.

"Red Shift" is a leading observation of Sun's marketplace of computing. The costs of computing is halving each year, yet the demands continue to grow. Where is this demand coming from? Core Enterprise demand has been satisfied by Moore's Law for a number of years. Dr. Papadopolous says that Band Width is the key driver to the current and future increases in computer processing demand. Band width has grown exponentially from 56 kilo-bites of analog capacity in 1995, to now 10 Gigabit Ethernet being available today. This band width is fueling an increase in the number of devices that are networked. It is clear that the proliferation of these devices assumes that processing is centralized in one location. This Band Width related demand is consistent with the technical vision I noted here, and the proliferation of IPv6 related devices. I agree with Dr. Papadopolous that the computer demand in the future will be difficult to satisfy.

Bandwidth is driving the increased demand for computing in far greater volumes than what Moore's Law provides. In addition to the conventional business market, the High Performance Computing market makes the demand for processing insatiable. Papadopoulos notes that demand from small and midsized firms using hosted services like Gmail, Salesforce.com and other web applications is a trend that he suspects will soon show up in large firms as well. Running an email server is an arduous task for any and all users. Aggregating the demand for email in the hands of large service providers provides economies of scale and better application functionality over the long term. A variety of customers are beginning to realize Service Oriented Architectures are the most effective and efficient means of managing these services.

Dr. Papadopoulos notes that what he calls "Redshift" is a move to massive scale, where scale and efficiency are available and affordably provided to users when the users need them, wherever they may be. Sun believes Redshift will be redefining for the computing industry. Corporate strategies regarding Redshift fall into two possible scenarios. First, Sun could be disintermediated, such as what Google is doing in building their own servers. Or alternatively, follow the Sun school of thought that high levels of engineering are needed to build systems for today and the long term future. This latter strategy is also where strong integration of both software and hardware engineering is needed. "Efficiency and Predictability at massive scale are as Mission Critical to Redshift as RAS (reliability, availability, serviceability) has been to the core enterprise."

Papadopoulos is keen to differentiate the "commoditization of computing" from the "commoditization of computers." The engineering of complex systems is necessary in this "Redshift" era. The cobbling together of many single core systems will only provide so much value. The approach of providing the City of New York with electrical power generated by a series of portable generators is inefficient, impractical and costly; this is the analogy he draws between what Sun is providing with their services and what many of the smaller service providers are doing.

Speaking on the Sun offering, Papadopoulos notes that computing infrastructure consists of three things, and to Sun's credit they have been able to integrate these components and provide the commoditization of computing in an efficient manner:

  • Core Services and Platforms
  • O/S Instances
  • Base HW Plant (Server, Storage and Switches)

Base Hardware Plant

What happened in the past 20 years to distill the microprocessor down to a single chip is happening today to Symmetrical Multi-Processing (SMP) systems, which are being codified into one chip. What was a full rack of servers in 1997 is contained on one Sun Niagara chip, providing lower costs on almost any metric of computing power.

Taking these concepts further, Neptune, Sun's next processor, will contain a 10 Gigabit Ethernet card embedded in the chip.

Operating System Instances

Solaris, Sun's open source operating system; Crossbow, their network virtualization project; and Java, which is integrated into Solaris. "The Java RTS (Real Time Systems) + Solaris = Real time Application Server." With real time results, this provides solid application system performance that mirrors and exploits the value of their hardware. It is my opinion that both Apple's and Sun's futures are brighter based on their ability to integrate their own operating systems on their own and x86 hardware. Companies such as Dell, IBM and HP are unable to compete in this arena due to their inability to provide integration at this high level.

NetBeans, the open source version of Sun's development tool, is one of the best Integrated Development Environments (IDEs) available today. Blackbox, as mentioned above, defines the shape of Sun's very bright future.

Core Services and Platforms
  • Identity and Security
  • Procedural languages and scripting.
  • Service Oriented Architecture and Web 2.0
  • New Clients.
Finally, Dr. Papadopoulos notes a key component of Sun's open source business model: "open source" does not apply to the binary or run time application. The binary requires a use and service contract, with Sun in this instance. Genesys will be paying for Solaris and Java services, support and use agreements, in addition to the processing power purchased by the hour off the grid. All in all an excellent video, one that provides a vision of the future of computing.


Friday, February 02, 2007

Mondrian Code Review.

Guido van Rossum, who wrote the Python scripting language, is also an employee of Google. Guido has been with Google for a little over a year and is presenting his Mondrian code review project. Mondrian is what he has developed with the 20% time that Google pays him to pursue the things that interest him. Mondrian fills a nice hole in most IDEs (Integrated Development Environments) and is a natural extension of standard version control systems, like Subversion, the choice of this development project. With the variety of projects that Google is involved in, the tool provides real value; a real value for any large software development project.

I have reviewed many of the videos that MIT produces and it is a source that never fails to inform and entertain. Google on Google Video looks to me to be as high-quality a source of useful videos. This review provides the petroleum user with a window on the methods and procedures of how software is built in an open environment.


Wednesday, January 24, 2007

MIT Museum Soap Box

The title of this entry will take you to a video from MIT's Museum "Soap Box" series. The two speakers, Professor Daniel Nocera and Professor Angela Belcher, focus the viewer's attention on the energy challenges facing society. I highly recommend reviewing the video; the hour and a half is time well spent.

I believe that energy is one of the commodities whose significance people fail to understand. Professor Nocera documents how the "alleged" alternatives are woefully inadequate to replace the energy produced by the oil, gas and coal industries. His analysis is accurate as he adds up the implications of each alternative's energy capability and capacity.

There are those that subscribe to the peak oil theory. There are those that say the world has adequate supplies for a long period of time. I fall on the peak oil side, based on my thirty years' experience in the industry, with the caveat that I am unable to discern what good reserves look like; I am not a geologist. Nonetheless, both of these camps say that we are very close to having used up half of the world's known reserves. This is the point at which the peak oil theorists explain that deliverability will begin to decline and be irretrievably lost, which based on their projections is scheduled to occur in 2012. I suspect 2012 will also be the point in time that the innovative oil and gas producer will be in highest demand. This last point is my justification for ensuring that we organize ourselves to meet those needs.

I would also assert that the higher energy prices during this transition have been a result of short-term deliverability problems. Oil and gas has been in a holding and survival strategy since 1986. The producers are unable to ramp up production as quickly as it is being consumed, making for a short-term crisis that will approximate the crisis that we will see in 50 years.

In consideration of the date of 2012, the world has precious little time, around 50 years (to 2057), to come up with the alternatives that will fill the demands that Professor Nocera documents so well.


Friday, December 22, 2006

MIT President Emeritus on MIT Video

Dr. Charles M. Vest provides an interesting discussion regarding the teaching and developmental challenges that the engineering disciplines will go through in the next 14 years.

At around the 35 minute point, Dr. Vest states there is a parallel between the current issues the energy industry faces and the issues the auto industry faced in the 1970's. An interesting and accurate analogy.

During the Q and A, Dr. Vest makes the point that at a dinner with Secretary Rice regarding the changes at the State Department, Newt Gingrich made it very clear that we have something that was built for a different era, and that science and technology in industry have to re-organize to meet the challenges of today.
