Saturday, December 18, 2010

Thoughts on the new shape of social networks -- Delicious vs. Twitter

When I started this blog, back in 2006, I was quite certain that tagging was a powerful concept for building a shared ontology: “I am curious to see how delicious will evolve, and how the data it collects will be used. It may turn out to be the next Google”. At the time, del.icio.us was the innovation and the motor behind tagging.

I was motivated, and, like millions of others, I spent a fair amount of time tagging and commenting on whatever I found of interest on the Web. Consulting my delicious account, I would say that my drive started to fade around 2009, when I catalogued only 11 entries. In May 2009 I entered the bookmark for my Twitter account. In 2010 I stored just 1 entry in delicious. From February 2006 to 2010 I stored 284 bookmarks and used 458 tags. From May 2009 to today I have posted 654 tweets.

I have to say that I basically use Twitter as a stripped-down delicious. I just comment on some site I find of interest and post it with a shortened link. So I mostly use Twitter as bookmarking software. I very seldom use Twitter for personal communication, but I do use the accounts I follow to stay in touch with what interests me. The question is: why did I stop using delicious? I do not have a straight answer, but I guess most of it was due to my use of Twitter, because I could do it quickly. Like me, many Twitter users treat Twitter as bookmarking software. And although Twitter allows #tags, they are not encouraged: they consume part of your 140-character limit, and there is no special support for tagging as in delicious.
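To make the point concrete, here is a small sketch of why #tags compete with content on Twitter: every hashtag counts against the 140-character limit, whereas in delicious tags live in a separate field. The regex below is a simplification of Twitter's actual hashtag rules, and the tweet text is an invented example.

```python
# Sketch: hashtags are just characters inside the tweet body, so they
# eat into the 140-character budget. Simplified pattern, not Twitter's
# exact hashtag grammar.
import re

def hashtags(tweet: str) -> list[str]:
    """Extract #tags from a tweet (simplified pattern)."""
    return re.findall(r"#(\w+)", tweet)

tweet = "Great read on shared ontologies http://bit.ly/xyz #tagging #semweb"
print(hashtags(tweet))   # ['tagging', 'semweb']
print(140 - len(tweet))  # characters left for actual commentary
```

In delicious, by contrast, the same two tags would cost nothing against the note's length, which is one reason its tagging support felt richer.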

Conclusion: I left delicious and am not tagging anymore. I miss that, but maybe I did not see much return, nor a growing enthusiasm, from the tagging spree that took me from 2006 to 2009. Worse, I believe we missed an opportunity to build a great infrastructure for a shared ontology.

So, in my opinion, Twitter contributed in a major way to the decline of delicious, but I believe the important reason is that delicious did not react in time. Why? Maybe some tech historians will write about it. It is sad that Yahoo is thinking of discontinuing delicious. It was, and is, a great idea, but for some reason it got stuck and did not move fast enough. Let's hope that this decision is overturned and that Yahoo cares enough to invest in research and development to revamp the great delicious idea.

Sunday, December 12, 2010

Software Engineering for Helping Hospitals

In 2007, at the Monterey Workshop on Requirements Engineering, I heard a talk by Lori Clarke in which she presented the number of “preventable errors in hospitals” measured in units of Jumbo Jet crashes.

In 2008 she and her co-authors published a paper at ICSE on how Software Engineering technology could help tackle this problem. It is a must-read. We understand that there is still much that can be done from a software engineering perspective to mitigate this catastrophe.

Clarke, in her paper, cites 1999 data, which put the number at around 95K lost lives per year. Googling, I found a figure almost twice the 1999 one: according to this site, the number is 195K! Using the Jumbo metric, that is like saying 40 Jumbo Jets crash per month: (195K/400)/12 ≈ 40!
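The arithmetic behind that figure can be checked in a few lines. The only assumption, implied by the division above, is a fully loaded Jumbo Jet carrying roughly 400 passengers.

```python
# Back-of-the-envelope check of the "Jumbo Jet metric":
# 195,000 preventable hospital deaths per year, expressed as
# equivalent Jumbo Jet crashes per month (~400 passengers per jet).
deaths_per_year = 195_000
passengers_per_jumbo = 400  # assumption: a fully loaded Jumbo Jet
months_per_year = 12

crashes_per_month = deaths_per_year / passengers_per_jumbo / months_per_year
print(round(crashes_per_month, 1))  # 40.6
```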

Amazing.

Wednesday, December 08, 2010

Lula on WikiLeaks

Amazing. Read it.

Saturday, July 03, 2010

Cobol as an Obstacle for the Terminator

Amazing, to say the least!

I do recall that after I watched “The Terminator” in 1984, I commented to friends that it was funny that the code shown in the movie, as the robot's code, was written in Cobol. I recognized Cobol right away, and of course the language was most inappropriate for handling real-time systems. Anyway, it took some time to find confirmation of my memory, but I found it. It is here.

Well, it happens that the California State Payroll system is a legacy system written in Cobol, as I first guessed and then confirmed by googling. See it here. As Ira Baxter, chief scientist at Semantic Designs, pointed out, this crisis started in 2008. See different sources on the same topic: 1, 2, 3, 4.

Now, 26 (twenty-six) years after I watched the movie, I read in the LA Times the following about Controller John Chiang: “The state's wheezing payroll system, he says, cannot easily be reprogrammed to make immediate, large-scale salary adjustments.” It happens that the California State Controller is refusing to follow Gov. Arnold Schwarzenegger's orders to cut state workers' pay to the federal minimum wage.

Software is, slowly, showing its importance to society at large.

Sunday, May 09, 2010

Software Transparency: the Case of Algorithmic Trading

On May 6th, 2010, the Dow Jones industrial average, which had been down about 400 points just before 2:45 p.m., plunged nearly 1,000 points in a matter of minutes.
As of now, no explanation has been provided. There are guesses; even conspiracy theories have been evoked. However, the real issue is briefly explained in this quotation from Robert Reich.

…the nation’s and the world’s capital markets have become a vast out-of-control casino in which fortunes can be made or lost in an instant — which would be fine except for the fact that most of us have put our life savings there. Pension funds, mutual funds, school endowments — the value of all of this depends on a mechanism that can lose a trillion dollars in minutes without anyone having a clear idea why. So much of the market now depends on computer programs and mathematical models that no one fully understands…

Algorithmic trading, flash trading and high-frequency algorithmic trading are practices made possible by the role of software in the stock market. As stock trading and other financial transactions became more and more digitized, a window of opportunity opened for automated strategies in finance.

October 19th, 1987, was the first time the world at large became aware of the effects of program trading, that is, trading performed by software. As a result, the NYSE introduced the circuit breaker, a forced halt intended to break a runaway selling spree.
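The core idea of a circuit breaker can be sketched in a few lines. This is a deliberately simplified, hypothetical version: the real NYSE rules are tiered (different decline thresholds trigger halts of different durations), and the threshold and names below are illustrative, not the exchange's actual parameters.

```python
# Minimal sketch of circuit-breaker logic: halt trading when the index
# falls more than a threshold fraction below its reference level
# (e.g. the previous close). Hypothetical simplification of the real,
# tiered NYSE rules.
def should_halt(previous_close: float, current_level: float,
                threshold: float = 0.10) -> bool:
    """Return True if the decline from previous_close exceeds threshold."""
    decline = (previous_close - current_level) / previous_close
    return decline > threshold

# An 11% drop trips the breaker; a 5% drop does not.
print(should_halt(10_000, 8_900))  # True
print(should_halt(10_000, 9_500))  # False
```

The May 6th incident suggests that a single blunt halt of this kind may not be enough when trades execute in microseconds across multiple venues.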

Since 1987, financial organizations have built teams of gifted young Ph.D.s who specialized in implementing different sorts of strategies, based on mathematics and statistics, using advances in data structures, algorithms and hardware. These people are known as quants. Quants do not necessarily have a degree in Computer Science, but they write algorithms or work closely with people who do.

The use of software over computer networks opened a series of new possibilities in the trading business, where the combination of huge amounts of data and speed played a crucial role, bringing new possibilities in volume and price volatility. Exploiting this is more akin to gaming than to real trading.

However, the crucial point is that mixing different sorts of strategies and relying on software over a complex network is a risky business. Of course those institutions are aware of the risks, but we are not sure how much software engineering knowledge is being applied in these systems to avoid losses.

Notwithstanding, if the market is seen as a game, it is hard to know whether you win by luck (an error by the other trading party) or by a fair strategy.

On the other hand, society must have some way of monitoring the quality of trading to avoid huge mistakes, as in the 1987 crisis and the May 6th incident. It seems that the solution used in 1987, halting the market, did not work in 2010, maybe because the speed is different, or maybe because the transactions provided a way of working around the circuit breaker. The point is that another type of policy is needed to avoid such problems.

We believe that transparency is the best way to do it. In this specific case, the idea of Software Transparency, that is, that software must be transparent to the people who may be affected by it, seems one worth exploring.

Read more about Software Transparency in a recent paper in the BISE journal (here).