Web service

A Web service (also Webservice) is defined by the W3C as "a software system designed to support interoperable machine-to-machine interaction over a network. It has an interface described in a machine-processable format (specifically Web Services Description Language, or WSDL). Other systems interact with the Web service in a manner prescribed by its description using SOAP messages, typically conveyed using HTTP with an XML serialization in conjunction with other Web-related standards." Web services are frequently just Internet Application Programming Interfaces (APIs) that can be accessed over a network, such as the Internet, and executed on a remote system hosting the requested services. Other approaches with nearly the same functionality as web services are Object Management Group's (OMG) Common Object Request Broker Architecture (CORBA), Microsoft's Distributed Component Object Model (DCOM) and Sun Microsystems' Java Remote Method Invocation (RMI).

In common usage the term refers to clients and servers that communicate over the Hypertext Transfer Protocol (HTTP) used on the Web. Such services tend to fall into one of two camps: "Big Web Services" and RESTful Web services. Such services are also referred to as web APIs.

"Big Web Services" use Extensible Markup Language (XML) messages that follow the Simple Object Access Protocol (SOAP) standard and have been popular with traditional enterprises. In such systems, there is often a machine-readable description of the operations offered by the service, written in the Web Services Description Language (WSDL). The latter is not a requirement of a SOAP endpoint, but it is a prerequisite for automated client-side code generation in many Java and .NET SOAP frameworks (frameworks such as Spring, Apache Axis2 and Apache CXF being notable exceptions). Some industry organizations, such as the WS-I, mandate both SOAP and WSDL in their definition of a Web service.

More recently, REpresentational State Transfer (RESTful) Web services have been regaining popularity, particularly with Internet companies. By using the PUT, GET and DELETE HTTP methods, alongside POST, these are often better integrated with HTTP and web browsers than SOAP-based services. They do not require XML messages or WSDL service-API definitions.

A highly dynamic and loosely coupled environment increases not only the probability that deviations will occur during the execution of composite services, but also the complexity of exception handling. Because of the distributed nature of SOA and the loose coupling of web services, monitoring and exception handling for web services in an SOA context remain open research issues. When running composite web services, each sub-service can be considered autonomous. The user has no control over these services, and the web services themselves are not guaranteed to be reliable: the service provider may remove, change or update services without giving notice to users, and because reliability and fault tolerance are not well supported, faults may occur during execution.
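Since SOAP messages are just XML documents POSTed over HTTP, a client can build one by hand. The sketch below constructs a minimal SOAP 1.1 envelope for a hypothetical GetQuote operation; the service namespace and operation name are illustrative and belong to no real endpoint:

```python
# Sketch: constructing a minimal SOAP 1.1 envelope for a hypothetical
# "GetQuote" operation. The service namespace and operation name are
# illustrative, not taken from any real service.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/stockquote"  # hypothetical service namespace

def build_soap_request(symbol: str) -> bytes:
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}GetQuote")
    ET.SubElement(op, f"{{{SVC_NS}}}Symbol").text = symbol
    return ET.tostring(envelope, encoding="utf-8", xml_declaration=True)

# The resulting message would typically be POSTed over HTTP with a
# text/xml Content-Type and a SOAPAction header.
print(build_soap_request("ACME").decode())
```

A WSDL-aware framework generates this kind of plumbing automatically, which is exactly why WSDL matters for client-side code generation even though a SOAP endpoint does not strictly require it.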

Creative Challenge: Ode to Edward Weston

Since the inception of the creative challenge, we have never held a challenge which required using a specific subject. This week, we thought we would ask you to utilize all the tips and tricks that you’ve learned in the past and challenge yourself to capture the most creative image of one subject: A bell pepper. It’s cheap, it has character and is common to find worldwide.
Edward Weston photographed the bell pepper a number of times. He wrote in his Daybooks, “It is classic, completely satisfying – a pepper – but more than a pepper: abstract, in that it is completely outside subject matter.” With his own unique vision and with many tries, he was able to finally capture a satisfying image of a pepper that was unique from anyone else’s pepper.
This week, we challenge you to be inspired by Weston and create a pepper that is truly all your own.
Below are some different examples of four bell pepper images that Kevin and Penny created using clamp lights, diffusion material and a lightbox.
The basic rules:

You may:

Use only one bell pepper within your composition. It may be yellow, green or red. However, as you are allowed to submit three entries, you may use a different color pepper per entry if you so choose.

The pepper may be photographed in any style you wish, including macro, wide angle, or zoom.

Use color, black and white, sepia, collage, selective color, make a mosaic…whatever you choose.

The pepper may be cut, left whole, photographed near, far…any angle you desire.

The background may be plain, textured, indoors, outdoors, in water, wherever you think your pepper looks best.

We would encourage you to explore creative lighting to make your pepper look as unique as possible, including, but not limited to, artificial lights, natural light, reflectors, flash…it’s up to you. If you need some inspiration, check out our blog entry about lighting techniques.

Use digital or film, traditional or untraditional photographic methods of creating your piece.

You may use filters (on-camera or digital) if you wish.

You may not:

Use more than one bell pepper in one shot.

It may sound redundant, but to be clear, do not use anything other than a bell pepper. Do not use other peppers such as chili peppers, black pepper or banana peppers.

Do not photograph your pepper with another subject. For example, no photos of kitty snuggling with your pepper.

OFFICIAL ENTRY RULES
To officially enter the Creative Challenge, you must tag your three submissions with creative-challenge-pepper. You can then view them and everyone else’s submissions on the DISPLAY PAGE (it updates about every hour). To find out more about this, read the Using Tags on Creative Challenges post.

Submission Guidelines
1. Our editors look for sharp, clear horizontal images that are at a minimum resolution of 800×600 (submissions can have larger resolutions than this). Currently, vertical images cannot fit in the homepage template. Therefore, we cannot publish vertical or narrow panoramic images.

2. Images with added text or images, including watermarks, logos, copyright symbols, graphic borders, frames and time stamps, will not be considered. Collages, however, are acceptable.

3. Please do not submit pictures that have already been featured on the homepage.

4. The Creative Challenge runs from 12:01 a.m. on August 27 to 3:00 p.m. on September 2 (all times EST).

5. Three pictures per member, tagged exactly (including hyphens) with: creative-challenge-pepper

*Note: Images that don’t meet the exact guidelines can be submitted and viewed for everyone to enjoy, but please understand that we cannot publish images that don’t meet minimum guidelines on the homepage. Also, now that you’re tagging your entries, you don’t need to post links in the comments section anymore.

NEXT WEEK’S THEME: Alphabet in Nature. In the States, it’s back-to-school time. Use your creative eye to find the letters of the alphabet in your environment. Does the curl of a branch look like an “S,” or does the building on your block look like an “A”? If you have children in your life, it could be a fantastic opportunity to include and educate them while they open your eyes to a world of ABC’s.

(EDITOR’S NOTE: Next week’s challenge, “Alphabet in Nature,” will be limited to the English alphabet, but not limited specifically to organic objects found in nature. Yes, buildings and manmade objects are also eligible, but physical signage will not be. In other words, no close-ups of letters printed on other objects. More detailed information will be posted next week after the pepper winners have been announced.)

Build critical mass on your website

With so many websites to join, users must decide where to invest significant time in adding their same connections over and over. For developers, this means it is difficult to build successful web applications that hinge upon a critical mass of users for content and interaction. With the Social Graph API, developers can now utilize public connections their users have already created in other web services. It makes information about public connections between people easily available and useful.

Only public data

The API returns web addresses of public pages and publicly declared connections between them. The API cannot access non-public information, such as private profile pages or websites accessible to a limited group of friends.
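A client of such an API typically receives JSON describing nodes (public pages) and the publicly declared edges between them. The sketch below parses a response of that general shape; the field names used here are hypothetical, loosely modeled on APIs of this kind rather than the exact Social Graph API format:

```python
# Sketch: consuming a social-graph lookup response. The JSON shape is
# hypothetical, not the exact Social Graph API format.
import json

sample_response = """
{
  "nodes": {
    "http://example.com/alice": {
      "nodes_referenced": {
        "http://example.com/bob":   {"types": ["friend"]},
        "http://example.com/carol": {"types": ["contact"]}
      }
    }
  }
}
"""

def public_friends(doc: str, person: str) -> list[str]:
    """Return the URLs a person publicly declares as friends."""
    nodes = json.loads(doc)["nodes"]
    refs = nodes.get(person, {}).get("nodes_referenced", {})
    return [url for url, meta in refs.items()
            if "friend" in meta.get("types", [])]

print(public_friends(sample_response, "http://example.com/alice"))
```

A real client would fetch the JSON document over HTTP instead of using an inline literal, but the parsing logic would be the same.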
The Social Graph

Based on open standards

We currently index the public Web for XHTML Friends Network (XFN), Friend of a Friend (FOAF) markup and other publicly declared connections. By supporting open Web standards for describing connections between people, web sites can add to the social infrastructure of the web.
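As an illustration of the kind of markup such an indexer looks for, the sketch below pulls XFN-style rel values out of a public HTML page using only the Python standard library. The page content is made up:

```python
# Sketch: extracting XFN-style rel values from public HTML, the kind of
# publicly declared connection an indexer would harvest.
from html.parser import HTMLParser

class XFNParser(HTMLParser):
    """Collect (href, rel-values) pairs from <a> tags that carry rel attributes."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        if "href" in d and "rel" in d:
            # XFN allows multiple space-separated values, e.g. rel="friend met"
            self.links.append((d["href"], d["rel"].split()))

page = '''<html><body>
<a href="http://example.org/bob" rel="friend met">Bob</a>
<a href="http://example.org/carol" rel="contact">Carol</a>
<a href="http://example.org/about">About</a>
</body></html>'''

parser = XFNParser()
parser.feed(page)
print(parser.links)
```

Plain links without a rel attribute (like the "About" link) are ignored, which is how declared social connections stay distinct from ordinary navigation.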

Make Friends on Webshots!

Valentine’s Day doesn’t have to be about just romance—after all, many dread the explosion of candy, cards and overly affectionate couples.
Make this Valentine’s Day about finding a new friend on Webshots instead! When you’re browsing the site and you find a member you like, just click on the “make friends” icon and an email to them will pop up. Write a little hello and let them know why you want to be friends.
Click to see the “make friend” icon:

Friends with Benefits

Friendship is a gift in itself, but you get extra benefits with your Webshots friends:
  • Print and make gifts with your friends’ photos and videos!*
  • See the latest photos and videos from your friends on the Webshots homepage.
  • Get cool comments on your photos, albums and message board.
We just updated the Webshots store interface, so now you can choose which friends’ photos to load when ordering prints and gifts, making the process much faster.
Click to see the new interface:
Here’s to a happy Valentine’s Day filled with lots of valentines from all your friends!
Love,
Webshots

Web-based applications and desktops

Ajax has prompted the development of websites that mimic desktop applications, such as word processors, spreadsheets, and slide-show presentations. WYSIWYG wiki sites replicate many features of PC authoring applications. Still other sites perform collaboration and project management functions. In 2006 Google, Inc. acquired one of the best-known sites of this broad class, Writely.
Several browser-based "operating systems" have emerged, including YouOS. Despite the name, many of these services function less like a traditional operating system and more like an application platform. They mimic the user experience of desktop operating systems, offering features and applications similar to a PC environment, with the added ability to run within any modern browser. However, these operating systems do not control the hardware on the client's computer.
Numerous web-based application services appeared during the dot-com bubble of 1997–2001 and then vanished, having failed to gain a critical mass of customers. In 2005, WebEx acquired one of the better-known of these, Intranets.com, for $45 million.

Internet applications

XML and RSS

Advocates of "Web 2.0" may regard syndication of site content as a Web 2.0 feature, involving as it does standardized protocols, which permit end-users to make use of a site's data in another context (such as another website, a browser plugin, or a separate desktop application). Protocols which permit syndication include RSS (Really Simple Syndication, also known as "web syndication"), RDF (as in RSS 1.1), and Atom, all of them XML-based formats. Observers have started to refer to these technologies as "web feeds" as the usability of Web 2.0 evolves and the more user-friendly Feeds icon supplants the RSS icon.
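As a sketch of what a syndication consumer does with such a feed, the following parses a small RSS 2.0 document using only the Python standard library. The feed content is inlined here; a real reader would fetch it over HTTP:

```python
# Sketch: reading items from an RSS 2.0 feed with the standard library.
import xml.etree.ElementTree as ET

rss = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><link>http://example.com/1</link></item>
    <item><title>Second post</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def feed_items(doc: str) -> list[tuple[str, str]]:
    """Return (title, link) pairs for each item in an RSS 2.0 document."""
    root = ET.fromstring(doc)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in feed_items(rss):
    print(title, "->", link)
```

The same loop structure works for Atom feeds, with the difference that Atom elements live in a namespace and entries are called `entry` rather than `item`.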
Specialized protocols
Specialized protocols such as FOAF and XFN (both for social networking) extend the functionality of sites or permit end-users to interact without centralized websites.
Other protocols, such as XMPP (the Extensible Messaging and Presence Protocol), enable services to be delivered to users through instant-messaging clients.

Web APIs

Machine-based interaction, a common feature of Web 2.0 sites, uses two main approaches to web APIs, which allow web-based access to data and functions: REST and SOAP.
  1. REST (Representational State Transfer) web APIs use HTTP alone to interact, with XML (eXtensible Markup Language) or JSON payloads;
  2. SOAP involves POSTing more elaborate XML messages and requests to a server that may contain quite complex, but pre-defined, instructions for the server to follow.
Often servers use proprietary APIs, but standard APIs (for example, for posting to a blog or notifying a blog update) have also come into wide use. Most communications through APIs involve XML or JSON payloads.
Web Services Description Language (WSDL) is the standard way of publishing a SOAP API and there are a range of Web Service specifications.
See also EMML by the Open Mashup Alliance for enterprise mashups.
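To make the contrast with SOAP concrete, here is a sketch of the REST style described above: a resource is read with a plain HTTP GET on a URL, and the payload is JSON. The endpoint and fields are hypothetical, and no network request is actually made:

```python
# Sketch: the REST style for a hypothetical blog API. The endpoint and
# field names are illustrative; no network request is made here.
import json
from urllib.parse import urlencode

BASE = "http://api.example.com/posts"  # hypothetical endpoint

def build_request(post_id: int) -> str:
    """A REST read is just an HTTP GET on a resource URL."""
    return f"{BASE}/{post_id}?{urlencode({'format': 'json'})}"

# A server's JSON payload for that resource might look like:
payload = '{"id": 42, "title": "Hello", "tags": ["web", "api"]}'
post = json.loads(payload)

print(build_request(42))
print(post["title"], post["tags"])
```

Compare this with SOAP, where the same read would be an XML envelope POSTed to a single endpoint; the REST version needs no message schema beyond the URL structure and the JSON field names.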

Criticism

Critics of the term claim that "Web 2.0" does not represent a new version of the World Wide Web at all, but merely continues to use so-called "Web 1.0" technologies and concepts. First, techniques such as AJAX do not replace underlying protocols like HTTP, but add an additional layer of abstraction on top of them. Second, many of the ideas of Web 2.0 had already been featured in implementations on networked systems well before the term "Web 2.0" emerged. Amazon.com, for instance, has allowed users to write reviews and consumer guides since its launch in 1995, in a form of self-publishing. Amazon also opened its API to outside developers in 2002.[35] Previous developments also came from research in computer-supported collaborative learning and computer-supported cooperative work and from established products like Lotus Notes and Lotus Domino, all phenomena which precede Web 2.0.
But perhaps the most common criticism is that the term is unclear or simply a buzzword. For example, in a podcast interview Tim Berners-Lee described the term "Web 2.0" as a "piece of jargon":
"Nobody really knows what it means...If Web 2.0 for you is blogs and wikis, then that is people to people. But that was what the Web was supposed to be all along."
Other critics labeled Web 2.0 “a second bubble” (referring to the Dot-com bubble of circa 1995–2001), suggesting that too many Web 2.0 companies attempt to develop the same product and lack viable business models. The Economist, for example, has dubbed the mid- to late-2000s focus on Web companies "Bubble 2.0". Venture capitalist Josh Kopelman noted that Web 2.0 had excited only 53,651 people (the number of subscribers at that time to TechCrunch, a weblog covering Web 2.0 startups and technology news), too few users to make them an economically viable target for consumer applications. Bruce Sterling reports that he is a fan of Web 2.0, but he thinks it is now dead as a rallying concept.
Critics have cited the language used to describe the hype cycle of Web 2.0 as an example of Techno-utopianist rhetoric.
In terms of Web 2.0's social impact, critics such as Andrew Keen argue that Web 2.0 has created a cult of digital narcissism and amateurism, which undermines the notion of expertise by allowing anybody, anywhere to share - and place undue value upon - their own opinions about any subject and post any kind of content regardless of their particular talents, knowledgeability, credentials, biases or possible hidden agendas. He states that the core assumption of Web 2.0, that all opinions and user-generated content are equally valuable and relevant, is misguided and is instead "creating an endless digital forest of mediocrity: uninformed political commentary, unseemly home videos, embarrassingly amateurish music, unreadable poems, essays and novels," also stating that Wikipedia is full of "mistakes, half truths and misunderstandings".

Technology overview

Web 2.0 draws together the capabilities of client- and server-side software, content syndication and the use of network protocols. Standards-oriented web browsers may use plugins and software extensions to handle the content and the user interactions. Web 2.0 sites provide users with information storage, creation, and dissemination capabilities that were not possible in the environment now known as "Web 1.0".
Web 2.0 websites typically include some of the following features and techniques. Andrew McAfee used the acronym SLATES to refer to them:[20]
Search
Finding information through keyword search.
Links
Guides to other related information.
Authoring
The ability to create and update content leads to the collaborative work of many rather than just a few web authors. In wikis, users may extend, undo and redo each other's work. In blogs, posts and the comments of individuals build up over time.
Tags
Categorization of content by users adding one-word descriptions to facilitate searching, without dependence on pre-made categories. This is referred to as "folksonomy."
Extensions
Software that makes the Web an application platform as well as a document server.
Signals
The use of syndication technology such as RSS to notify users of content changes.

The Web As Platform

Like many important concepts, Web 2.0 doesn't have a hard boundary, but rather, a gravitational core. You can visualize Web 2.0 as a set of principles and practices that tie together a veritable solar system of sites that demonstrate some or all of those principles, at a varying distance from that core.
Web2MemeMap
Figure 1 shows a "meme map" of Web 2.0 that was developed at a brainstorming session during FOO Camp, a conference at O'Reilly Media. It's very much a work in progress, but shows the many ideas that radiate out from the Web 2.0 core.
For example, at the first Web 2.0 conference, in October 2004, John Battelle and I listed a preliminary set of principles in our opening talk. The first of those principles was "The web as platform." Yet that was also a rallying cry of Web 1.0 darling Netscape, which went down in flames after a heated battle with Microsoft. What's more, two of our initial Web 1.0 exemplars, DoubleClick and Akamai, were both pioneers in treating the web as a platform. People don't often think of it as "web services", but in fact, ad serving was the first widely deployed web service, and the first widely deployed "mashup" (to use another term that has gained currency of late). Every banner ad is served as a seamless cooperation between two websites, delivering an integrated page to a reader on yet another computer. Akamai also treats the network as the platform, and at a deeper level of the stack, building a transparent caching and content delivery network that eases bandwidth congestion.
Nonetheless, these pioneers provided useful contrasts because later entrants have taken their solution to the same problem even further, understanding something deeper about the nature of the new platform. Both DoubleClick and Akamai were Web 2.0 pioneers, yet we can also see how it's possible to realize more of the possibilities by embracing additional Web 2.0 design patterns.
Let's drill down for a moment into each of these three cases, teasing out some of the essential elements of difference.

Netscape vs. Google

If Netscape was the standard bearer for Web 1.0, Google is most certainly the standard bearer for Web 2.0, if only because their respective IPOs were defining events for each era. So let's start with a comparison of these two companies and their positioning.
Netscape framed "the web as platform" in terms of the old software paradigm: their flagship product was the web browser, a desktop application, and their strategy was to use their dominance in the browser market to establish a market for high-priced server products. Control over standards for displaying content and applications in the browser would, in theory, give Netscape the kind of market power enjoyed by Microsoft in the PC market. Much like the "horseless carriage" framed the automobile as an extension of the familiar, Netscape promoted a "webtop" to replace the desktop, and planned to populate that webtop with information updates and applets pushed to the webtop by information providers who would purchase Netscape servers.
In the end, both web browsers and web servers turned out to be commodities, and value moved "up the stack" to services delivered over the web platform.
Google, by contrast, began its life as a native web application, never sold or packaged, but delivered as a service, with customers paying, directly or indirectly, for the use of that service. None of the trappings of the old software industry are present. No scheduled software releases, just continuous improvement. No licensing or sale, just usage. No porting to different platforms so that customers can run the software on their own equipment, just a massively scalable collection of commodity PCs running open source operating systems plus homegrown applications and utilities that no one outside the company ever gets to see.
At bottom, Google requires a competency that Netscape never needed: database management. Google isn't just a collection of software tools, it's a specialized database. Without the data, the tools are useless; without the software, the data is unmanageable. Software licensing and control over APIs--the lever of power in the previous era--is irrelevant because the software never need be distributed but only performed, and also because without the ability to collect and manage the data, the software is of little use. In fact, the value of the software is proportional to the scale and dynamism of the data it helps to manage.
Google's service is not a server--though it is delivered by a massive collection of internet servers--nor a browser--though it is experienced by the user within the browser. Nor does its flagship search service even host the content that it enables users to find. Much like a phone call, which happens not just on the phones at either end of the call, but on the network in between, Google happens in the space between browser and search engine and destination content server, as an enabler or middleman between the user and his or her online experience.
While both Netscape and Google could be described as software companies, it's clear that Netscape belonged to the same software world as Lotus, Microsoft, Oracle, SAP, and other companies that got their start in the 1980's software revolution, while Google's fellows are other internet applications like eBay, Amazon, Napster, and yes, DoubleClick and Akamai.

What Is Web 2.0

The bursting of the dot-com bubble in the fall of 2001 marked a turning point for the web. Many people concluded that the web was overhyped, when in fact bubbles and consequent shakeouts appear to be a common feature of all technological revolutions. Shakeouts typically mark the point at which an ascendant technology is ready to take its place at center stage. The pretenders are given the bum's rush, the real success stories show their strength, and there begins to be an understanding of what separates one from the other.
The concept of "Web 2.0" began with a conference brainstorming session between O'Reilly and MediaLive International. Dale Dougherty, web pioneer and O'Reilly VP, noted that far from having "crashed", the web was more important than ever, with exciting new applications and sites popping up with surprising regularity. What's more, the companies that had survived the collapse seemed to have some things in common. Could it be that the dot-com collapse marked some kind of turning point for the web, such that a call to action such as "Web 2.0" might make sense? We agreed that it did, and so the Web 2.0 Conference was born.
In the year and a half since, the term "Web 2.0" has clearly taken hold, with more than 9.5 million citations in Google. But there's still a huge amount of disagreement about just what Web 2.0 means, with some people decrying it as a meaningless marketing buzzword, and others accepting it as the new conventional wisdom.
This article is an attempt to clarify just what we mean by Web 2.0.
In our initial brainstorming, we formulated our sense of Web 2.0 by example:
Web 1.0 --> Web 2.0
DoubleClick --> Google AdSense
Ofoto --> Flickr
Akamai --> BitTorrent
mp3.com --> Napster
Britannica Online --> Wikipedia
personal websites --> blogging
evite --> upcoming.org and EVDB
domain name speculation --> search engine optimization
page views --> cost per click
screen scraping --> web services
publishing --> participation
content management systems --> wikis
directories (taxonomy) --> tagging ("folksonomy")
stickiness --> syndication
The list went on and on. But what was it that made us identify one application or approach as "Web 1.0" and another as "Web 2.0"? (The question is particularly urgent because the Web 2.0 meme has become so widespread that companies are now pasting it on as a marketing buzzword, with no real understanding of just what it means. The question is particularly difficult because many of those buzzword-addicted startups are definitely not Web 2.0, while some of the applications we identified as Web 2.0, like Napster and BitTorrent, are not even properly web applications!) We began trying to tease out the principles that are demonstrated in one way or another by the success stories of web 1.0 and by the most interesting of the new applications.

History: From Web 1.0 to 2.0

The term "Web 2.0" was coined by Darcy DiNucci in 1999. In her article "Fragmented Future," she writes
The Web we know now, which loads into a browser window in essentially static screenfuls, is only an embryo of the Web to come. The first glimmerings of Web 2.0 are beginning to appear, and we are just starting to see how that embryo might develop. The Web will be understood not as screenfuls of text and graphics but as a transport mechanism, the ether through which interactivity happens. It will [...] appear on your computer screen, [...] on your TV set [...] your car dashboard [...] your cell phone [...] hand-held game machines [...] maybe even your microwave oven.
Her use of the term deals mainly with Web design and aesthetics; she argues that the Web is "fragmenting" due to the widespread use of portable Web-ready devices. Her article is aimed at designers, reminding them to code for an ever-increasing variety of hardware. As such, her use of the term hints at - but does not directly relate to - the current uses of the term.
The term did not resurface until 2003, when authors began to focus on the concepts currently associated with it: as Scott Dietzen puts it, "the Web becomes a universal, standards-based integration platform."
In 2004, the term began its rise in popularity when O'Reilly Media and MediaLive hosted the first Web 2.0 conference. In their opening remarks, John Battelle and Tim O'Reilly outlined their definition of the "Web as Platform," where software applications are built upon the Web as opposed to upon the desktop. The unique aspect of this migration, they argued, is that "customers are building your business for you." They argued that the activities of users generating content (in the form of ideas, text, videos, or pictures) could be "harnessed" to create value.
O'Reilly et al contrasted Web 2.0 with what they called "Web 1.0." They associated Web 1.0 with the business models of Netscape and the Encyclopedia Britannica Online. For example,
Netscape framed "the web as platform" in terms of the old software paradigm: their flagship product was the web browser, a desktop application, and their strategy was to use their dominance in the browser market to establish a market for high-priced server products. Control over standards for displaying content and applications in the browser would, in theory, give Netscape the kind of market power enjoyed by Microsoft in the PC market. Much like the "horseless carriage" framed the automobile as an extension of the familiar, Netscape promoted a "webtop" to replace the desktop, and planned to populate that webtop with information updates and applets pushed to the webtop by information providers who would purchase Netscape servers.
In short, Netscape focused on creating software, updating it on occasion, and distributing it to end users. O'Reilly contrasts this with Google, a company that does not focus on producing software such as a browser but instead on providing a service based on data. The data here, of course, are the links Web page authors make between sites. Google exploits this user-generated content to offer Web search based on reputation through its "PageRank" algorithm. Unlike software, which undergoes scheduled releases, a service such as Google is constantly updated, a process called "the perpetual beta."
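The idea behind PageRank can be sketched in a few lines: a page's rank is fed by the ranks of the pages linking to it, with a damping factor modeling a reader who occasionally jumps to a random page. This toy power iteration is only an illustration of the principle, not Google's actual implementation:

```python
# Toy sketch of the idea behind PageRank: a page's score derives from the
# scores of the pages linking to it. Simplified power iteration, not
# Google's actual implementation.
def pagerank(links: dict[str, list[str]], damping=0.85, iters=50) -> dict[str, float]:
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:              # dangling page: spread rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
            else:
                for target in outgoing:
                    new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank

# Three pages where A and C both link to B: B should rank highest.
graph = {"A": ["B"], "B": ["C"], "C": ["B"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # B
```

The point of the example is the data dependence O'Reilly emphasizes: the algorithm is short, but without the link graph that users collectively create, it computes nothing of value.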
A similar difference can be made between the Encyclopedia Britannica Online and Wikipedia: while the Britannica relies upon experts to create articles and releases them periodically in publications, Wikipedia relies on radical trust in anonymous users to constantly and quickly build content. Wikipedia is not based on expertise but rather an adaptation of the open source software adage "given enough eyeballs, all bugs are shallow," and it produces and updates articles constantly.
O'Reilly's Web 2.0 conferences have been held every year since 2004, attracting entrepreneurs, large companies, and technology reporters. Among the lay public, the term Web 2.0 was largely championed by bloggers and by technology journalists, culminating in TIME magazine's 2006 Person of the Year: "You." That is, TIME selected the masses of users who were participating in content creation on social networks, blogs, wikis, and media-sharing sites. Cover-story author Lev Grossman explains:
It's a story about community and collaboration on a scale never seen before. It's about the cosmic compendium of knowledge Wikipedia and the million-channel people's network YouTube and the online metropolis MySpace. It's about the many wresting power from the few and helping one another for nothing and how that will not only change the world, but also change the way the world changes.
Since that time, Web 2.0 has found a place in the lexicon; the Global Language Monitor recently declared it to be the one-millionth English word.

Web 2.0

The term "Web 2.0" is commonly associated with web applications which facilitate interactive information sharing, interoperability, user-centered design and collaboration on the World Wide Web. Examples of Web 2.0 include web-based communities, hosted services, web applications, social-networking sites, video-sharing sites, wikis, blogs, mashups and folksonomies. A Web 2.0 site allows its users to interact with other users or to change website content, in contrast to non-interactive websites where users are limited to the passive viewing of information that is provided to them.
The term is closely associated with Tim O'Reilly because of the O'Reilly Media Web 2.0 conference in 2004. Although the term suggests a new version of the World Wide Web, it does not refer to an update to any technical specifications, but rather to cumulative changes in the ways software developers and end-users use the Web. Whether Web 2.0 is qualitatively different from prior web technologies has been challenged by World Wide Web inventor Tim Berners-Lee, who called the term a "piece of jargon".

Microsoft Expects Web Hosts To Help Its Hyper-V To Hit The VPS Hosting Market

February 21, 2009
Microsoft’s Hyper-V virtualization solution has been welcomed in the VPS hosting industry, mostly because it was expected to save money on virtualization software compared to other Windows virtualization technologies (see the comparison article “Hyper-V vs. VMware ESX“), at least when it comes to a single physical server or small-business VPS networks.
The success of any new product or service depends on consumers’ appreciation. But when it comes to Hyper-V, there is another side to the future success story: it also matters how web hosting providers will accept the new Windows virtualization solution.
Microsoft is betting on web hosts to demonstrate that Hyper-V can perform on large-scale virtual infrastructures. The Windows producer, however, will face tough competition in the virtualization market. Microsoft’s goal is not simply to be a player in the web hosting industry; it wants to conquer this market, and this makes virtualization software producers like VMware and Parallels very careful in watching Microsoft’s steps and in competing with it.
John Zanni, General Manager of Worldwide Hosting for Microsoft, disclosed in an in-house interview how Hyper-V will be included in the company’s Hosting Deployment Accelerator (HDA), a free blueprint for hosting projects. Zanni revealed that 256 web hosting companies worldwide (including industry giants like Rackspace) are in the process of adopting Windows Server 2008, which implies that they will probably decide to offer Hyper-V based VPS hosting.
Microsoft, however, will face tough competition from its current software partner Parallels. Parallels is the producer of the world’s leading OS virtualization solution, Virtuozzo Containers (software that offers both Linux and Windows VPS solutions). It is obvious that the web hosting market welcomes Parallels virtualization products, and it will be very interesting to see how Microsoft competes with them using its “bare-metal hypervisor”.
Microsoft released Hyper-V Hosting Guidance (a licensing guide) named “Using and Licensing Microsoft Server Products in Hyper-V Virtual Hosting Scenarios”. The document is a 28-page guide and covers scenarios such as: unmanaged dedicated server with Hyper-V; virtual dedicated server (VDS) for Web scenarios (using Windows Server guests in anonymous mode); virtual dedicated server for line-of-business (LOB) scenarios (using Windows Server guests in authenticated mode); use of virtualization in shared hosting scenarios; desktops as Hyper-V guests; and end customers running Microsoft products on the guest OS using their own licenses. The document clarifies many of the terms that are part of Microsoft’s general SLA, the “Microsoft Services Provider License Agreement (SPLA)”.

Web Hosting Reviews - Fake or True! Where To Publish Yours.

The experience site owners have with one or another web hosting provider is the most important thing that helps newbies distinguish reliable web hosting providers from those with poor quality, uptime, and customer support. Sharing experience is very important, and that’s why web hosting review sites are considered so valuable.
False Web Hosting Reviews
There are a number of so-called “web hosting review” sites which pretend that the web hosting providers they recommend are worth your attention and the money you would spend with them. Some of those websites are real, but there is a long list of fake “web hosting review” sites.
Here is a short list of web hosting review websites which claim to publish genuine customer reviews but actually don’t (or at least there is no proof that their reviews are original, since there are no forms on their sites for sharing your opinion on anything).
HostingCoupons.org: Although it is something like a “coupons blog”, this website lists the “usual suspects” of the shared hosting market and classifies them as “best” and “top”. There is no chance for consumers to say whether the companies listed there are really good or not. So the judgment is: fake.
WebHostingGeeks.com: This “web hosting directory” pretends to direct consumers’ attention to the “top 10” and “the best” hosting providers in almost any niche of the market. The directory has excellent SEO and is ranked #1 in Google for many web hosting related keywords and phrases. I’ve unsuccessfully tried to submit a review for a web host and to test their “Add Customer Review” form. If you can submit one and then get it published, that would be a big day for democracy in the web hosting industry. So keep trying. The judgment is: very fake, as this one is influential and has led many people to choose whichever company pays well for advertising.
WebHostingChoice.com: This is a web hosting directory I have known for a long time. Honestly, I cannot find any explanation for why it lists the web hosts on its home page the way it does. The “Title” tag of the website says “Top 10 Web Hosting Sites”, a very spammy phrase. Sorry webhostingchoice.com, but you are fake.
Upperhost.com: This one is crappy. The #1 web host is Bluehost, and the website’s logo is Bluehost’s. I’ve met one of Bluehost’s owners, Matt, and he seems like a very nice guy. But this is not about Bluehost. It is about a fake website which calls itself “Top Independent Best Web Site Hosting Reviews”.
Hosting-review.com: Forget about this one as well. It is flashy and has all the popular overselling web hosts featured on its home page, but if you want for some reason to review a company like Verio or SoftLayer, then you have a problem.
Webhostingtoplist.com: No reviews. A false list, which is just marketing.
Firstwebhosting.net: The same as the one above. A closed website, with no chance to review anyone. Even the “Review this host” links next to the selected web hosts don’t work.
True Web Hosting Reviews
Well, this is the shortest paragraph in the article. The reason is that objectivity doesn’t pay as well as it should. But there are still some quality web hosting review sites or web hosting directories where you can get your review published.
WebHostingStuff.com: Long among the top-ranked web hosting resources in all search engines, WebHostingStuff deserves all the money it makes from advertisers. There is democracy in this web hosting directory. Anyone can get listed for free. If a web host wants to be featured among the Top 10 on the home page or in any category, it pays on a pay-per-click model. Users, however, can voice their opinion about the provider.
Webhostingratings.com: An old player, a pioneer among directories in publishing real and objective web hosting reviews. Unfortunately it is more of an archive these days.
WebHostMagazine.com: A quality resource with a competent editorial office. Anyone can contact them and send a review.
HostSearch.com: A reputable directory with real, non-biased web hosting reviews. I strongly suggest you use it.
Try Them
There is also a third category of hosting review sites: those I wasn’t able to verify, but which seem to publish real, user-driven reviews. Here are some of them:
WebHostingReviews.com: Besides the very nice web address for a hosting review website, this site seems to provide objective consumer reviews. The “Submit Reviews” link/button is placed in a visible spot in the main menu. So check the website and submit a review of your web host there. However, be advised that the #1 host on this website is HostGator, the same company that hosts the review website, and the contact details of the owner are Whois-protected. This means the ownership of this hosting review website is not clear, and there is probably a good reason for it to be hidden.
WHReviews.com: A rather strange website, very different from other similar hosting lists. The owner, Dan Lemnaru, is one of the most active members of the Web Hosting Talk community. I cannot estimate what percentage of the ranked web hosts listed in his WHReviews are inspired by customers. Dan says, “You can trust my website and the things I say because I prove that I tell the truth. I do my best not to get too fond of any web hosting company and to keep my objectivity as intact as humanly possible”, and I think he can be trusted. However, no ranking methodology can be found on WHReviews.
Web-Hosting-Top.com: I could swear that I once found a “Submit Review” page on this hosting review site. However, I wasn’t able to find it this time, which means it won’t be easy for you either. Of course, there is nothing bad to say about the website, except that 95% of the web hosts listed there are those you can find on all the false hosting review sites. And the owner’s contacts are hidden.
WebHostingSearch.com: This web hosting directory has a very cool form for rating your web host’s “reliability”, “support knowledge”, “speed”, and “user-friendly control panel”. I’ve never used them, so I cannot say whether they pay attention to users’ reviews or not.
Webhostingjury.com: Give them a try and then contact me to say whether they work. They seem to, which is why I featured them here.

Cloud Consulting Is Becoming Lucrative Business

“HP announced new consulting services to help enterprise customers consider incorporating the cloud as part of their broader IT strategy”; “PEER 1 launched its CloudXcelerator program… which lets its partners develop, test and run their cloud applications on the company’s IT hosting infrastructure”; “The web designers’ guide to cloud hosting” — these are just a few of the news items about Cloud computing consulting and reference services that I found today while browsing for Cloud computing topics.
It is obvious that the ongoing major change in the computer industry is creating a new niche for IT professionals: educating consumers and businesses about Cloud computing and earning money by providing expertise and helping businesses in the process of moving to the clouds.
The truth is that it is very hard for most people to understand the concept of Cloud computing and how it will apply to daily life. We are already using a number of services delivered from computer clouds, but I doubt that we will see cloud computing technologies adopted in our homes soon. Cloud computing will continue to be an enterprise trend in computer technologies. At the same time, end users will continue to buy computer software and hardware to access web services produced from cloud computing platforms.
I think that after 2012–2013 it will become possible and cost-effective for the major hardware vendors to release computer systems and devices that adopt the concept of cloud computing: devices that use computing resources produced by computer systems located in remote data centers. To have this model in practice, we need very advanced and reliable IT networks.
It will take time for telecoms and Internet connectivity providers to build reliable infrastructures; this should happen within the next 4 years. At the same time, cloud computing providers will use that time to produce mature clouds.

Google To Help GoDaddy To Become Number 1 Web Hosting Provider?

Google’s influence on any dot-com market is significant, and web hosting is not an exception. The search engine can turn a small web hosting company into a business with tens of thousands of customers simply by ranking its website on the first search engine results page (SERP). Any hosting service provider that has ranked among the top 10 web hosts within the last 10 years has significantly increased its market share. But what happens if a market leader goes to the top of Google’s SERP?
While talking about Cloud computing and explaining that he sees “5 types of Clouds”, in an interview with Network World about new trends in virtualization technologies and Cloud computing, the CEO of Parallels, Serguei Beloussov, said:
“Google. Today it’s real cloud computing… It’s real applications in someone’s cloud. Honestly, I believe Google is a true evil empire. Google realized that the main impetus for its growth will be anti-trust laws. Because of this, from the beginning Google tried to position itself as a nice company. Google is very closed, and very focused on direct control of the end customer.”
Of course, Google is not an “Evil Empire” in its genesis. The company’s message to everyone says exactly the opposite: “Don’t be evil”. The market dominance of the search giant, however, can boost the growth of any dot-com business, or even kill one.
I used to have a directory that ranked very high in Google for various web hosting related keywords between 2003 and 2005, and of course it brought me a lot of money. But when I stopped paying attention to the development of the website, it went down. A lot of new competitors for the same keywords and the same content area appeared, I lost my ranking, and the business declined. My sales dropped from $3,500/month to $200/month.
Of course, the first thing I did was blame anyone but myself. I needed to find an explanation for what happened, so Google went to the top of my guilt list, followed by the “shameless spammers” (some of them are real spammers) with the “top hosting directories” that took the positions I had held.
But after 2 months of being panic-stricken, with the only words in my head being “You are losing!”, I finally got myself together and managed to figure things out. “It was my mistake and there’s nobody to blame” was the message I sent to my ego, and somehow I managed to persuade it.
Once I realized that I had lost because I didn’t work hard and had simply decided I knew everything I needed about the search engines, I decided to kill the project myself by changing its content and starting from scratch. Somewhere around that time, at the end of 2005, I decided to create a blog which would cover the web hosting industry from a different perspective.
At that time everyone was writing only about offerings, plans, promotions and discounts, or just trying to create the next “Best” or “Top” web hosting list (something which is still trendy), and most authors saw web hosting companies in black and white (very good or very poor quality). So I decided to talk about things in the hosting industry the way any journalist would. I didn’t know whether I would make it or not, but Daw Blog established itself among the leaders in the industry. Of course, I still need to work a lot, to learn every day, and to keep going.
I’ve said the above to illustrate that we should always check whether we made a mistake ourselves before blaming others. But this article is not about me. It is about Google’s dominant position in the search market and its capacity to crown anyone or to “send them into exile”.
But Can Google Still Boost Or Kill Businesses?
GoDaddy, probably the world’s biggest domain registrar, has been widely popular in the U.S. with its Super Bowl ads and sexy girls offering web addresses on TV. Of course, the beautiful girls have nothing to do with web hosting. Honestly, the messages they send to consumers in GoDaddy ads are some of the dumbest I’ve ever seen in web hosting. However, they worked and made GoDaddy the most recognizable name in the domain registration industry.
Of course, I must admit that GoDaddy offers quality services and was one of the first domain registrars to offer an advanced online CRM to domain name owners.
Today, GoDaddy is probably on its way to becoming the world’s biggest web host (at least the biggest one in the retail web hosting market). It could become the most recognizable brand ever in web hosting, the way it did in the domain industry. And Google will be “guilty” of this. Why?
GoDaddy is currently the number 1 ranked website for the “web hosting” search term. Google counts 8,820 links that lead to GoDaddy’s index page. Of course, the number of backlinks is not the most important factor for a high rank in Google. If it were, the first page of “web hosting” search results would be conquered by spammers.
The most powerful search engines have “helped” many in the web hosting business to succeed. The only thing any web host needs to do to start receiving a lot of targeted traffic is to get ranked on the first SERP of the search engine. The new guy on top is GoDaddy. The company also ranks top for other lucrative search terms: “hosting” and “domain hosting”. It is approaching the top 3 spots for “website hosting”.
The first place for “web hosting” in Google had been occupied for almost 2 years by overselling web hosting providers or “top web hosting” directories with an insignificant business record in the web hosting market. Now, by ranking GoDaddy first, the Google algorithm officially crowns a company which consumers have already recognized as one of the very few nobles of the web hosting market.
I think that GoDaddy’s top rank for all important web hosting search terms will further stimulate the world’s number one domain registrar to invest in the web hosting market and to increase its share of the web hosting industry. The company has been criticized for its poor customer support and for many other things, but at the same time it is very good at automating its services and products.
So we can expect to see GoDaddy become a top retail web hosting provider within the next year. And this is even without mentioning another brand tied to the company: Wild West Domains.