
5/31/2013

Encouragement of Pretotyping: An Approach to Human-Centric Development


(This article is a translated version of a Nikkei Sangyou News article published on 5/31/2013)

Just about a year ago, I was in a restaurant in Palo Alto with Professor Toshiro Kita of the Doshisha University Graduate School of Business, who was then staying in San Francisco. We were debating the following questions over a glass of beer: Aren't Japanese companies creating goods and services in the wrong way? Have they stopped providing the "human-centric" products and services, known for meticulous care in every detail, that have long been considered Japanese companies' specialty?

Alberto Savoia, a Silicon Valley entrepreneur, advocates "pretotyping" as an innovative design technique for goods and services. It is not "prototyping," which means building a functional prototype to collect customer feedback before "it" is actually produced. Pretotyping, on the other hand, is about validating customer demand while minimizing production effort as much as possible. According to Alberto, he coined this term by combining "pretend" with "prototyping" in November 2009. Now that the term has become widely known, a search engine no longer suggests "prototyping" as the likely correct spelling when you type in "pretotyping." In this new "pretotyping" approach, you should "make sure you are building the right 'it' before you build it right" by pretending to already have "it," so that you can determine whether customers will want it. Hereinafter in this article, any reference to "it" refers to goods and services.

A couple of successful examples better explain what pretotyping is. Let me share with you a fascinating anecdote about the "human-centric" design of the Palm Pilot in 1996 by Jeff Hawkins, Palm's co-founder. Hawkins' previous attempt to launch a mobile terminal had failed in the late 1980s. The product was technologically original but a commercial failure: it was too big and did not look sophisticated. Several years later, he developed a pretotype of a new mobile terminal. It was a pocket-sized block of wood wrapped with paper on which he had printed images of a calendar and a memo pad. Every day he carried "it" around with him in his pocket, pretending it was working. From time to time, he pulled it out of his pocket and pretended to record some plans on the calendar, checking its button configuration and display. He built a prototype of the terminal only after he had become confident that "it" was going to be an innovative product. As he had predicted, it turned out to be a hit as a cool multi-functional electronic organizer. I believe that smartphones today trace their roots back to this Palm Pilot.

I was so excited about Alberto's pretotyping story that I became a pretotyping evangelist accredited by Alberto. Whenever people come to me with a development proposal, I always ask these three questions: 1. Do they want "it"?  2. Can we build "it"?  3. Can we make money through "it"?

We too often get trapped here by thinking of these questions in reverse order, from 3 to 1. Engineers, including myself, are particularly prone to this type of error. We start making a product or service after thinking most seriously about question 2, while giving some thought to the limitations raised by question 3; the most important question, 1, is usually left on the back burner.

Can anyone expect to innovate "it" only through questionnaires or interviews with hundreds of people? Aren't we "decorating" our products with "differentiating features" by looking only at our competition instead of our customers? My belief is that we can build human-centric "it" only after pondering question 1.

Behind the pretotyping technique lies the serious attitude of developers. Before starting to make a new product or service, developers think through the question: Do customers want "it"? Those in charge of development try to get to the core of the question: How will building "it" change people's lives? What they are doing is exactly pretotyping and serious inquiry. This can be an answer that helps achieve what Professor Kita desires: "human-centric development."


2/11/2013

Cloud Design Pattern


I have finished reading AWS Cloud Design Patterns (CDP).
(JP version is also available here)

The Cloud Design Patterns (CDP) collection shows me that "a programmer gets another power of programming over dynamic system configuration." PaaS has opened a new vista, where the cloud enables programmable data center configuration.
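As a present-day illustration of that "programmable data center" idea, here is a minimal sketch using boto3, the AWS SDK for Python. This is my own example, not something the CDP book prescribes; the region, AMI ID, and tag values are placeholders.

```python
# A minimal sketch of "programmable data center configuration":
# launching a tagged server entirely from code with boto3.
# The AMI ID, region, and tag values are placeholders.
import boto3

ec2 = boto3.resource("ec2", region_name="us-west-2")

instances = ec2.create_instances(
    ImageId="ami-xxxxxxxx",   # placeholder AMI
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Role", "Value": "web"}],
    }],
)
print("launched:", instances[0].id)
```

The point is less the specific call than the fact that system configuration becomes ordinary program text that can be reviewed, versioned, and repeated.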

1/31/2013

Re-engineering backhaul to solve mobile data puzzle

I attended PTC'13 to discuss this topic:

"Is the Future of Telcos Mobile or Untethered?"


Here is a good summary of my perspective on the WiFi and LTE-Advanced backhaul issue. See the article by Petroc Wilton, or the excerpt below (thanks to Petroc Wilton).

A fresh look at mobile backhaul could be key for operators to cost-effectively keep pace with the mobile data boom, according to a panel at PTC’13. But that could mean some very different approaches to engineering – and even a fresh look at relationships with competitors. Industry consultant Norman Fekrat and NTT DoCoMo R&D MD Dr. Minoru Etoh agreed that Wi-Fi offloading, while easing some of the mobile data load, wasn’t a universally optimal solution. Etoh built on his recent arguments in favour of small cell deployments – but such deployments have implications at the backhaul layer. “Your chance exists in backhaul connections, and last-mile connections using WiMax or wireless connections,” he suggested, for the benefits of operators with both fixed and mobile assets. “Mobile broadband needs to be re-engineered for internet-type economics; the mobile platforms out there, the cell sites, the backhaul needs to be just done over the internet,” added Fekrat, who argued that the economics of wireless should be brought as far as possible in line with those of wireline systems.
“There are some initiatives going on in the industry that are pushing the mobile packet core all the way out to an enterprise small cell, where you connect to the small cell through your mobile connection but all the backhaul’s just done over the enterprise LAN and over the internet. So I think you’re going to see some very creative solutions that are going to bridge the gap between traditional mobility and some other Wi-Fi offloading mechanisms today to drive profitability.” “The reason why untethered and Wi-Fi exists is really because backhaul is so expensive,” he continued. “And I know this changes in different geographies, but from my perspective, the reason why you connect on a Wi-Fi network and the carriers want you to is because you’re on the internet; you’re not on the carrier mobile packet core network. So once a mobile network operator can create a small cell, or a cell site, where that backhauls over the internet, then you get to a quality of economics. I would suggest or recommend that should be something that should be looked at from an industry perspective.” “To have small cells, maybe 10 or 50m cell sizes, in very high density areas, backhaul [is becoming] a more and more important area to solve,” agreed Etoh. “So the real winner could be real estate companies!”

1/07/2012

Am I "Care Taker?"

This article, written in Japanese, criticizes how Japanese expatriates are spoiled and unworthy of respect.
I fully agree with this criticism and have to admit that the expatriates have room to improve their way of life.
According to the article, the expatriates are called "Care Takers," meaning "immature kids whom locally hired employees always have to take care of."
A CEO sent from Japan is called the King of the Care Takers, well known for his or her inefficiency: instructions are vague and unprioritized.
The expatriates know very well what is taking place at the Tokyo headquarters, but they don't know what is happening around the world. They can't discuss topics related to history, culture, and politics, but they can enjoy chatting about baseball, golf, and gossip.

It’s a painful truth to overcome.

Here is the original article. Unfortunately, it is written in Japanese; it would be worthwhile to translate it into English.
http://business.nikkeibp.co.jp/article/topics/20111226/225643/

1/04/2012

A New Year's Resolution


During the New Year's vacation I read the article "Managing for Breakthroughs in Productivity" by Allan L. Scherr (2005).
How might we intentionally produce productivity breakthroughs in our existing organizations and culture?
There is a correlation between the magnitude of the possible breakthrough and the size of the gap between the business-as-usual results and the results committed to.
Your commitment to pursue your dream will be your great priority.
A breakdown is the result of the gap between what can be predicted and what is committed; knowing this, you have to come up with an agenda to resolve the breakdown.

That is a part of my new year's resolutions.

11/05/2011

Emergence of Big Data found in Web 2.0

October 13, 2011
translated from my Japanese version in Wireless Wire News

From the author’s understanding, during the past decade, "social media was born through information shared instantly via human networks, and likewise, the era of 'information socialization' has arrived, in which scattered information is extensively collected, assigned value, and provided." When Tim O'Reilly proposed the concept of "What is Web 2.0" in September 2005, his insight was fresh.
One of the concepts is a database that grows in conjunction with users. As the amount of user data increases, services are enhanced, pulling in more user data. When data exceeds critical mass, a service with great value is created, against which other companies cannot compete. Typical examples are various Google services. Data is an asset; the principal asset of competitive power. O'Reilly said that "Data is the Next Intel Inside" and that how to design places where data is generated is important. He showed the direction of Internet services in the Web era.

Around this same time, former Google CEO Eric Schmidt used the word "cloud" to describe a large, global-scale server group; that was about a year later, on August 9, 2006. Two weeks after that, on August 24, the Amazon EC2 service was introduced. This was not simple happenstance. The iPhone was launched in the US the following year, on June 29, 2007, and smartphones emerged that provide services in collaboration with the cloud. The introduction of Android, which followed the iPhone, clarified the function of this class of device: it generates real global data and is a cloud device.
In short, once Web 2.0 showed that data is an important asset for corporate activities, SNS using accumulated data and Internet services such as media accumulation, distribution, and search advanced dramatically with the emergence of the cloud, cloud devices, and large-scale database processing technology. "Information socialization," in which public services are provided on a global scale by linking as many as 100 million computers, created the value-added resource called the "Global Brain," to borrow Tim O'Reilly's term. Examples include Google voice recognition, machine translation, and Facebook and Twitter data-analysis recommendations. The situation in 2011 can be expressed using the following formula.

Professor Maruyama, chairman of the Japan Android Group, described the size of data accumulation and processing happening on a global scale as "Web-Scale" (2009) [1].
What is Web-Scale data? At this time, Web-Scale data includes server logs, sensor information, images/video, SNS data, blogs, and social graphs from Twitter, Facebook, etc. I call those items "Big Data." In general, such data is large-scale, its structure is not constant [2], and a quick response is required. Furthermore, much of the data has historical meaning and thus, in many cases, cannot be thinned out. The challenge is how to process Big Data. There are two aspects to this: algorithms and systems.
[1] If it is not Web-Scale, it is not a cloud. A cloud is a system technology or platform that supports the Web-Scale explosion of information and expansion of users; simply calling a company's data warehouse a "private cloud" misses its true nature.
[2] Because the structure is not constant, one idea is to use NoSQL. However, since data modeling allows data to be handled as structured data, I think it is proper to handle data in SQL. Also, it is necessary to use NoSQL+Hadoop when you want statistical data, and to use SQL, placing importance on consistency, when you want to reproduce the data itself. I think the spread of Hadoop will depend on how popular the statistical use of Big Data becomes.
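To make note [2] concrete, here is a minimal sketch of the Hadoop-style "statistical use" it mentions: a MapReduce word count written as plain Python so it runs locally. The log lines are made up for illustration; the same map/reduce pair is what one would hand to a framework such as Hadoop Streaming.

```python
# A toy MapReduce: the map phase emits (key, 1) pairs, the reduce
# phase aggregates counts per key. Input lines are made-up examples.
from collections import defaultdict

def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield word, 1

def reduce_phase(pairs):
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return counts

logs = ["GET /index.html 200", "GET /photo.png 404", "GET /index.html 200"]
for key, total in sorted(reduce_phase(map_phase(logs)).items()):
    print(key, total)
```

At Web-Scale the same two functions are simply sharded across thousands of machines, which is why the statistical workload fits Hadoop while consistency-critical reproduction of records stays in SQL.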


Designing the "place" for Big Data collection as part of a service
Machine learning technology that automatically learns useful rules and statistical models from data, as well as pattern recognition technology for identifying data from acquired rules or statistical models, have been researched to date. Pattern recognition researchers were interested in methodology and the algorithms themselves, such as how to convert voice data to text, how to automatically enter handwritten text into a computer, and how to automatically track human faces in images. They could easily write research papers if they conceived a good algorithm and conducted experiments with real data.
Until 2005, there was no Big Data. However, after 2006, people began trying to create and improve services by applying machine learning and pattern recognition to Big Data. Look at the success of Google. There have been many examples of "More Data beats Better Algorithms (MDbBA)". Google's autonomous-driving demonstration is one good example: without relying on combinations of complicated algorithms, it showed that a car could drive automatically from San Francisco to Los Angeles using collected map data combined with distance measurements and image sensors.
Automatic driving is an example of a service that exceeds a critical point involving machine learning when there is sufficient data. That being said, there are also many successful case studies of introducing machine learning frameworks. Machine learning is a framework in which the system, given correct data, automatically adjusts itself to obtain the appropriate answers. Therefore, if a good learning algorithm is designed for a given problem class, in other words a service, the loop of data collection and performance improvement runs correctly.
Such machine learning frameworks are embodied in character recognition, voice recognition, machine translation, landmark recognition, and facial recognition, and are provided as Internet services. For example, machine translation has almost reached a practical level between related language families, such as English, French, and Spanish. There is still room for improvement in translation algorithms, and this is a good opportunity for publishing research papers. But the important thing is to actually design a place for Big Data collection in which the machine learning framework is incorporated into part of the service. We must be aware that the pattern recognition research environment has changed significantly in the past decade.
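The MDbBA effect is easy to reproduce in miniature. The sketch below is my own illustration, not from the article: it trains the same off-the-shelf classifier on progressively larger slices of scikit-learn's bundled digits dataset, and accuracy climbs with data volume while the algorithm stays fixed.

```python
# "More Data beats Better Algorithms" in miniature: one unchanged
# classifier, four training-set sizes, rising test accuracy.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

for n in (50, 200, 800, len(X_train)):
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train[:n], y_train[:n])
    print(f"{n:5d} training samples -> accuracy "
          f"{model.score(X_test, y_test):.3f}")
```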


Finding what comes next in facial recognition
In 2001, at a facial recognition conference, Paul Viola and Michael Jones presented an object detector based on boosting. The announcement of this algorithm was the moment when face detection entered the area of "More Data beats Better Algorithms (MDbBA)". Many algorithm improvements have since greatly raised performance, but the face-region tracking used in digital cameras is based on the method presented in that announcement.
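That boosted-cascade detector is now a commodity: OpenCV ships trained Viola-Jones cascades. Here is a minimal sketch, my own example rather than anything from the conference talk, with "photo.jpg" as a placeholder input.

```python
# Face detection with OpenCV's bundled Viola-Jones cascade.
# "photo.jpg" is a placeholder input image.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("photo.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# detectMultiScale scans the image at multiple scales with the
# boosted cascade, the idea presented by Viola and Jones.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("faces.jpg", img)
```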
Researchers are requested to do two things.
1. Develop the next MDbBA area by inventing new algorithms and methods. Become the second Viola and Jones.
2. In the MDbBA area, study platforms for Big Data processing in addition to algorithms. Engineering means designing both the algorithm and the platform.
IT engineers and entrepreneurs are requested to do the following:
Be the first to find out what is ready for commercialization in the MDbBA area and use it as a device for Internet services.
Silicon Valley is not the only outlet for Internet service innovation. The advent of the cloud, database processing technology, and cloud devices has given Internet design opportunities to everyone. In general, it is important to think about devices that collect Big Data, and this is not limited to pattern recognition applications; pattern recognition technologies are scattered throughout Japan. By all means, I want them to adjust their focus.
The following five topics can be highlighted:
1. What is the next area for More Data beats Better Algorithms? After character, voice, and facial recognition, is the next thing food? How about predicting the weather or consumer behavior?
2. To what degree can the latest algorithms, such as Bayesian modeling, scale for Big Data?
3. What are the facts and fictions about Big Data? How effective is it for social networking analysis?
4. How popular can Hadoop and NoSQL become?
5. What will the Global Brain look like in 10 years?
Big Data is used in a wide range of areas, including marketing, financial security, social infrastructure optimization, and medical care. This conference cannot possibly cover them all. However, above all, we would like you to focus on this conference as a place for exchange between pattern recognition researchers and the IT industry.

Minoru Etoh, Ph.D.
Director at NTT DOCOMO, Service & Solution Development Department, and President & CEO of DOCOMO Innovations, Inc. (Palo Alto, CA). Visiting Professor at the Cybermedia Center, Osaka University. He has been engaged in research and development in the multimedia communication and mobile network fields for 25 years. He joined Matsushita Electric Industrial Co., Ltd. (currently Panasonic Corporation) in 1985 and researched moving image coding at the Central Research Laboratory and pattern recognition at ATR. He joined NTT DOCOMO in 2000 and was CEO of DOCOMO USA Labs (Silicon Valley) from 2002 to 2005. Currently, he is in charge of development related to data mining, media understanding, smartphones, home ICT, machine communication, and information search.


1/04/2011

New Year's Thoughts

The next decade will show another landscape, completely different from what the last decade has shown us. During the last decade, we experienced the successful innovation of i-mode, derived in 1999 from the combination of the ACCESS micro-browser and DOCOMO's always-on mobile packet network, which had already existed in 1998, years ahead of GSM GPRS packet network development.
After that success, the rules of the game are changing from local to global, from walled gardens to open markets, and from pay services to fee-on-free services. Telecommunication operators need to act; otherwise they are doomed. Given no status quo, a rolling stone gathers no moss (in the American interpretation).
The i-mode business is now hitting a plateau on the first 'S' curve; thus DOCOMO R&D needs to jump onto a second 'S' curve.


Where will the next 'S' curve be? Here are some hints.

The essence of communication is secure and reliable "redirection," which appears at several levels: packet routing, directory services for session creation, search engine applications, and SNS relations such as Facebook. As we know, those essential redirection functionalities are now moving away from telecommunication operators' monopoly and being integrated into web services. Any Internet company that owns enough data to provide redirection functionality may replace the telecommunication operation with its own. Google Voice, Skype, and Twitter are good examples. Only data with a large customer base for redirection has the power to win the game. Thus, operators' leverage lies in two holdings: 1. scale of data, customer base, delivery systems, and sufficient free cash flow, and 2. trustability, in other words, reliability of redirection.
Redirection may have effects in the following two areas.

  1. Machine Communication. That means "non-cellular-phone communication" in a broader sense. The market expansion of wireless Internet cards shows very positive figures nowadays. Dr. Keiji Tachikawa, the CEO of NTT DOCOMO at that time, predicted in 2000 that around 20 million cats and dogs in Japan would wear tracking devices linked to cellular networks by 2010. Although his prediction has not come true yet, the direction he envisaged is valid enough to explain current non-voice cellular application trends. Seeking non-voice communication services is the key to expanding revenue. The crucial point here is to design our business model carefully so as to grow into a machine communication platform business. At this moment, most "machine communication" businesses have remained at just selling data communication cards. My colleagues at the DOCOMO Service & Solution development department invented a digital photo frame service called Otayori Photo Service™ in 2009. (This system was adopted by Korea Telecom and exported to Korea. See http://www.nttdocomo.com/pr/2010/001492.html.) That is a good example to think about as the next step. Any machine communication platform should have a redirection function empowered by data or reliability.
  2. Federation of Data Monetization. There are things multiple companies can accomplish working together that they couldn't do alone. That's O'Reilly's remark in "What lies ahead: Data" (see http://radar.oreilly.com/2010/12/2011-data.html). Data mining, to which I devote everything from machine learning algorithm development to the deployment of high-performance massively parallel servers at DOCOMO, is a tough process, since without mining the data we cannot find its value; very ironically, data mining involves self-definition. If we are allowed to federate data among data-driven companies, we can reach critical points where innovations emerge. Let us see what will happen. http://strataconf.com/strata2011 may give us some clues about the future. Data is the power. Federation is the key to reaching the critical points.


That's it for S-curve identification. In identifying the next 'S' and increasing its success rate, we need collaboration with external companies. The open innovation concept described by Chesbrough is essentially imperative for generating the next innovation. It includes, in general, licensing out patents, collecting ideas, collaborating with other businesses, external R&D or consultants on development, and so on.

The term "open innovation" is easy to understand; it is not easy to implement, though.
We need a culture transformation. Our way of seeking the next S-curves with an open innovation scheme should move toward a High Performance Culture, which consists of:
        empowered people and cross-functional communication,
        creating focused, collaborative, results-driven teams that energize others,
        integrating existing solutions without the not-invented-here syndrome and adding new value,
        facilitating the creation and communication of a compelling and strategically sound vision,
        changing our mindset from "technology push" to "collaborative innovation" with business departments (ultimately, i.e., our customers), and
        transforming our value from technology consultation to a commitment to any necessary technical support until service launch; that means we need to engage in "concurrent engineering."
Implementation of the open innovation scheme requires culture transformation.

With the above considerations, I hope the year 2011 will give us a good developer experience.

1/01/2011

What Innovation entails, i.e., Neue Kombinationen

Innovation is the source of new business. We need to understand what R&D efforts are needed to generate innovations in this era, in which the ecosystem is undergoing great changes driven by globalization.

Joseph Schumpeter, one of the leading economists of the 20th century, defined innovation as "new combinations" (Neue Kombinationen). He asserted that innovation refers to the new goods, new production methods, new markets, and new organizations that are borne out of these "combinations."
This points to a new way to change society, particularly the process of setting new values and bringing about change by coming up with new combinations of existing elements. I learned from my mentors that development is based on existing technology, i.e., it is made up of validated technologies, and that including unvalidated technologies, i.e., those still in the research stage, is not allowed. An iconic example of using only validated, i.e., dependable, technologies is the Apollo Project, which aimed to land a man on the moon and bring him back to Earth. In that massive system development, only dependable technologies were used; the plan was implemented by combining existing technologies. But research is different. In research, the goal is to come up with innovations based on technological discoveries or inventions. If inventive technologies have foundational versatility, then combinations geared toward practical applications can be made from them later. This is why, in the development of scientific technologies, the wave of invention and discovery of technical elements and basic theories and the wave of systematization, though the latter did not come immediately after, have advanced through mutual interference.

NOTE: The field of communications is still experiencing the systematization wave.

Going back to innovation, the main thing is how to be able to come up with new combinations. As an example, in 1998, i-mode was born out of the combination of DOCOMO’s always-on mobile packet network and microbrowsers.
 You can either seek out combinations around the world, or, more importantly, come up with attractive platforms that the world will seek out. To do this, open innovation and concurrent engineering must be practiced. In open innovation, a new system is designed from a combination of your own company’s and other companies’ technologies. In such cases, even the operation of the system can be delegated to other companies. In concurrent engineering, development of technology is carried out in coordination with the operations department and with a constant evaluation of its relationship with the market and with other companies. In other words, market search and technological development, which includes research, should be done in parallel. These two are indivisible activities and must be linked with investment activities.

When I was fresh out of university and entered the industry, I learned this saying: "The more half-hearted you are as an engineer, the more conservative you become." The average successful engineer would stick to his technology and work style of ten years ago and would not change or challenge himself to learn new technical fields. He was content with the status quo. Since innovation is a process that brings new values and changes to society, it revolutionizes the ecosystem. Around the world, the convergence of terminal platforms and the consolidation of network services on the Internet are advancing. There is no stopping the wave of innovation: there is the emergence of cloud computing, which enables a service delivery platform that can serve from millions to several hundred million clients, as there was the emergence of search services, Internet shopping, electronic publications, social network services, and smartphone application stores. The ecosystems for these innovations did not exist ten years ago. Thus, innovation necessitates the creation of new ecosystems rather than mere adaptation to them. In this era, there is no room for conservative and half-hearted engineers. It would be easy to find comfort in existing ecosystems, but that will not encourage innovation. I hope that we can have the readiness to challenge ourselves to create new environments and take up new technical fields.

To conclude, let me reiterate that what innovation entails is facing the challenge to pursue new combinations and to reconstruct ecosystems.

2/11/2010

Crowd Computing (not Cloud Computing)

This week, I completed a technical report draft, which was submitted to an IEICE MoMuC workshop at Yokosuka on March 4th, 2010.
The panel title is "Real Value of Cloud with Devices," and my document, entitled "Mobile, Cloud, and Crowd Computing," has been uploaded to my home page.

Here is the highlight.
"Cloud Computing" is a marketing buzzword in some communities with no real user experience. Does cloud computing mean gigantic scale server integration with scale-out technologies? The question is hard to respond for people who haven't experienced benefits from cloud applications without knowing those are from clouds. More insights are needed to understand what it really means.

Don't stick to server technologies when trying to understand "cloud computing." We should also pay attention to "cloud devices." Cloud devices with personal data integration will be an extremely fertile incubation environment for new and innovative killer applications.
Let me explain the reason. In the 1990s, we saw the dawn of personal communication, with always-on, anytime-anywhere connection by phone, e-mail, and web browsing. Many web services emerged. The years 2000-2010 are characterized by social network services. The i-phone has become an irreplaceable gadget for Facebook and Twitter in the US and Europe. People now use cloud devices to share (quasi) real-time information with their friends and family. That is a different communication style from the personal communication of the 1990s. In the past five years, the cell-phone has become an information hub in our daily lives. The "always-on" mobile infrastructure has brought a community-based popular communication culture that was never seen before.
The author is using Apple's MobileMe, Evernote, Google's services, SugarSync, and DOCOMO's address book backup service. Those applications represent data storage and integration as essential functions in communication. The core data must be not only personal but also communal.

With the recent popularity of Facebook, Twitter, and similar microblogging systems, we must note that they are increasing "social capital." Twitter is used for (1) daily chatter, (2) conversations, (3) sharing information/URLs, and (4) reporting news. Those usages are shared among people via clouds, specifically among information sources, information seekers, and friends.

Now, personal data is surely being stored in clouds through information hubs (i.e., cell-phones). Not all the data needs to be carried; it is stored in the clouds and invoked over wireless broadband networks when necessary.
The race has just begun to aggregate and integrate data across people so as to promote that data to social capital. The location, identity, schedule, addresses, and SNS connections of people online with whom we have a shared connection will create interpersonal functions, as sketched below. A system based on this integration may foster relationship building by allowing users to interact with other members of their community, and consequently contributes to harnessing collective intelligence.
Let me call it "Crowd Computing."
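As a toy illustration of one such interpersonal function, the sketch below (my own example, with made-up names and a made-up graph) finds the shared connections of two people in an SNS graph aggregated in the cloud; it reduces to a set intersection.

```python
# A toy "interpersonal function" over an aggregated SNS graph:
# who do two users both know? Names and edges are made up.
social_graph = {
    "alice": {"bob", "carol", "dave"},
    "bob":   {"alice", "carol"},
    "carol": {"alice", "bob", "dave"},
}

def shared_connections(graph, a, b):
    """People connected to both a and b."""
    return graph.get(a, set()) & graph.get(b, set())

print(shared_connections(social_graph, "alice", "bob"))  # {'carol'}
```

Trivial on one machine; the point of Crowd Computing is running such functions over the cloud-hosted data of millions of people.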

1/18/2010

My Talk at Japan Android Group

The Japan Android Group invited me to give a talk, given two keywords: cloud and Android.
Here are the abstract and a snapshot of my presentation at the Japan Android Group, Tokyo, January 18th, 2010.


An Operator's View on Cloud Device Era
This talk conveys three major messages: 1. the emergence of cloud devices and its impact on the communication culture among people, 2. the role of broadband wireless, especially 3G Long Term Evolution (3G LTE), in the era of cloud devices, and 3. the operator's imperatives to foster a new ecosystem with innovative content providers (CPs), e.g., Android Market participants.

The concept of cloud computing is not far from Tim O'Reilly's insight, Web 2.0 (see http://oreilly.com/web2/archive/what-is-web-20.html). What has changed from the original Web 2.0 concept is that data aggregation and integration over clouds is in full progress and has reached a critical stage for a communication paradigm shift. You can see examples in Google applications, Twitter, mobile SNSes, etc. (see also my blog article at http://micketoh.blogspot.com/2010/01/my-2010-r-plan-beyond-web-20-and-cloud.html). The 3G LTE technology will support such cloud devices in two years by providing a fat pipe with very low latency, say 10 msec. Lessons learned from AT&T Wireless' case in launching i-phones are reported and discussed so as to emphasize that broadband wireless is inevitable. As for the last message, the operator's imperatives, we point out that an operator's customer base, with payment systems and customers' trusted data, provides leverage for creating a new ecosystem in the cloud device era. Those imperatives are (a) providing a service charging system, (b) aggregating customers' data and ensuring its fair use, and (c) providing network APIs, especially for location, presence, and AAA, to CPs in the new ecosystem. Detailed discussion follows in the talk.



I uploaded several keynote presentations which I made in the past.
If you are interested in mobile multimedia, content delivery over wireless network, and cell-phone sensors, please visit this URL http://micketoh.web.fc2.com/keynote.htm

9/06/2009

No perfect universal cell-phone so far in my own life

Using cell-phones is my profession. To understand how the cutting edge of user interfaces and application technologies is evolving, I use three types of cell-phones: Apple's i-phone, a Google Android phone, and FOMA (i.e., docomo's standard 3G cell-phones, developed by various manufacturers and branded by NTT docomo). Here is my personal summary.

i-phone (left in the picture)
Pros: fancy user interfaces, good integration with i-tunes, synchronization with Apple’s cloud service (http://www.me.com/), maturity of localized open application store
Cons: short battery life (it cannot last a single day, and the battery is not replaceable!), heavy weight (133g), no strap

Google Android phone (HTC’s phone, middle in the picture)
Pros: light weight (123g), a wide variety of applications, battery charging through a standard mini-USB, synchronization with Google's applications (mail, calendar, address book, etc.), good software architecture that allows background processes
Cons: no i-tunes, poor localization, small key-touch screen.

Docomo standard 3G phone (Panasonic phone, right in the picture)
Pros: light weight (122g), NFC (near field communication) for e-commerce, network-integrated services such as the "earthquake and tsunami early warning system," localized contents, a carrier-operated quick mail system (not SMS), digital TV, long battery life
Cons: poor integration with the Internet cloud applications (such as Apple’s mobileme, Google’s apps), small open application market

What I really need are: 1. i-tunes, 2. Gmail, 3. a scheduler and address book synchronized with my assistant's Outlook, 4. networked NFC for e-money, e-ticketing, and security applications, 5. the earthquake warning system, to survive, and 6. a carrier-operated mail system that is not SMS but an Internet-equivalent real-time one.

So far none of them alone gives the perfect answer. Thus I have been carrying both the docomo standard phone and the i-phone.
This week I started to use the Google phone. It is unexpectedly good; I'd rather use it instead of the i-phone. I found http://www.beyondpod.mobi/android/ for podcast services on my Google phone. It works well without connecting to my PC: all the scheduled podcasts (including video) update themselves during the night.

Buddy Runner (http://www.buddyrunner.com/) makes running fun. It provides the capabilities of an expensive GPS-enabled personal trainer on my Android phone. The bad news is that my Android phone is heavier than my iPod+Nike; still, the exact GPS measurement and automatic updates are quite useful. The track record can be exported to Facebook. I'm enjoying the combination of Bluetooth-connected audio and an armband cell-phone holder.

Velox makes workouts fun in another way. See http://www.veloxgps.de/. The experience is marvelous: it shows biking statistics in real time, including altitude, heading, pace over various durations, and a Google map.

Localization (i.e., adaptation to Japanese) of Android phones is premature; English software is a commodity, but the general population in Japan lags behind the rest of the world in using it. Android users at this moment must speak English.
Nevertheless, my Android phone (manufactured by HTC) fits my life.