Facebook Just Removed The Home Page Ticker, But It Should Be Back Soon


This post is by Josh Constine from TechCrunch



Poof. Vanished. Gone. Many Facebook users are no longer seeing the real-time news “Ticker” on the right side of their home page, but apparently it should reappear soon. While nothing was posted to Facebook’s Known Issues Page, one Twitter user relayed that his friend who works at Facebook said Ticker “will be back online soon”. There’s a chance the removal could be more permanent, though.

The Ticker is essentially Facebook’s firehose, showing abbreviated stories about nearly every action taken by your friends, no matter how insignificant. If Facebook did remove the Ticker, it could make it harder for third-party apps to grow, but give Facebook more prime real estate above the fold to show ads.

Reports of the Ticker’s disappearance started trickling into Twitter just before 9pm PST on June 30th, and we’re waiting to hear back from Facebook for official details.

To some it’s known as the “stalker ticker” because watching it can give the feeling of peering over the shoulder of all your friends as they post comments, like photos, read articles, and more. To others it’s the “Spotify Ticker” since the music app was one of the only apps publishing to the Ticker when it first launched at f8 last year. Without other apps mixing their content in, suddenly everyone on Facebook became intimately aware of every song their high school classmates were listening to.

Over time, as the Open Graph app platform went public and other apps started publishing to the Ticker, it was refined to show fewer stories about distant acquaintances and more about your close friends. For a story to appear in the main news feed instead of the Ticker, it usually has to receive plenty of Likes and comments, come from one of your best friends, or get aggregated with similar stories like “4 of your friends listened to Japandroids.”
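To make that distinction concrete, here is a toy sketch of the routing rule in TypeScript. The thresholds and field names are invented for illustration; Facebook’s actual ranking logic is far more elaborate and not public.

```typescript
// Toy sketch of the feed-vs-ticker routing rule described above.
// Thresholds and field names are invented; this is not Facebook's algorithm.

interface Story {
  likes: number;
  comments: number;
  fromCloseFriend: boolean;   // e.g. someone you interact with constantly
  similarStoryCount: number;  // friends who did the same thing (for aggregation)
}

function placement(story: Story): "news feed" | "ticker" {
  const highEngagement = story.likes + story.comments >= 10; // invented threshold
  const aggregated = story.similarStoryCount >= 4;           // "4 of your friends listened to..."
  return highEngagement || story.fromCloseFriend || aggregated ? "news feed" : "ticker";
}

// A lone song play from a distant acquaintance stays in the Ticker firehose.
console.log(placement({ likes: 0, comments: 1, fromCloseFriend: false, similarStoryCount: 1 }));
```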

Without the Ticker as a distribution method, apps whose stories are less likely to reach the primary feed than status updates and photo uploads would find it harder to gain new users, which could cut down on runaway growth stories like Socialcam. However, Facebook would gain the extra space in the right sidebar, allowing it to show a full seven ads per page more frequently.

Facebook never stops changing things, so the Ticker could suddenly reappear. The fact that the change was implemented on a Saturday evening Pacific time, when most of Facebook’s key markets are away from their desktops, is a likely sign that this could just be maintenance. Ticker entries still appear in the Help Center, too. That doesn’t mean it will be around forever if it returns, though. Facebook may be finding that the Ticker was confusing people, getting few clicks, and taking up too much room on the home page to warrant its existence.

But power users and frequent visitors would mourn its loss. Without Ticker or a separate Twitter-esque firehose tab in the main feed like there was long ago, there might not always be something fresh on the home page, and we could lose touch with more distant friends. For better or worse, Facebook might not give as intimate a look into the lives of our friends without Ticker, so I’ll be happy to see it return.


27-year-old internet community The WELL up for sale


This post is by Aaron Souppouris from The Verge - All Posts



The WELL (Whole Earth ‘Lectronic Link), an internet community that was formed 27 years ago, has seen its staff laid off and is now up for sale. The WELL began its life as a paid Bulletin Board System (BBS) in 1985 and is now a subscription-based forum that also offers email services. After changing hands a few times, it was purchased by Salon in 1999, which has owned it since. After a failed attempt to sell the site in 2005, Salon announced in a filing yesterday that it recently “laid off The WELL staff and began to look for a buyer for the property.”

Although The WELL has never been a particularly high-traffic site, it was described as “the world’s most influential online community” by Wired magazine in 1997 and still has 2,693…


With Tech From Space, Ministry Of Supply Is Building The Next Generation Of Dress Shirts


This post is by Rip Empson from TechCrunch



Nobody likes to admit it, but if you’re a working professional, there’s a good chance you’re familiar with sweat stains. The commute to work, the stress of meeting a deadline, the faulty air conditioning in the boardroom, cotton weaves — all of these things and many more have been known to conspire against you, the working professional. Luckily, Ministry of Supply feels your stinky, stinky pain.

While athletes have Under Armour, business attire has more or less remained the same for the last century. So, armed with some of the same technology NASA uses in its space suits, Ministry of Supply has developed a line of dress shirts — called “Apollo” — that adapt to your body to control perspiration, reduce odor, and make you feel like a million bucks.

Founded in 2010 by MIT grads, Gihan Amarasiriwardena, Aman Advani, Kit Hickey, and Kevin Rustagi, Ministry of Supply launched three limited lines of premium dress shirts back in October. Of course, they quickly found that, in order to continue iterating and sell at scale, they would need funding. They went to venture capitalists for backing, and while there was interest, most wanted to see more proof of concept. So, like many before them, they took to Kickstarter to raise money for their hi-tech dress shirts.

And the working professionals of the world responded. The team set out to raise $30K and within 5 days of launching the campaign, they met their goal. Today, that total is at $123,386, and the excitement continues. The Ministry of Supply founders tell us that, over the last week, they’ve been averaging $8K in donations per day.

But what is it about these dress shirts that has people so excited? The team’s line of dress shirts, called “Apollo,” uses a knit, synthetic (and proprietary) blend of fibers with “Phase Change Materials” that control your body temperature by pulling heat away from your body and storing it in the shirt. Find yourself back in air conditioning and the shirt releases the stored heat to keep you feeling warm — and like a million bucks.

The shirts, like Under Armour, also wick sweat and moisture away from your body and, with an anti-microbial coating, get rid of that pesky bacteria that makes you smell like a barnyard. Not only that, but because the team did strain analysis and designed the shirt with motion in mind, the Apollo line adapts to your movements and stays tucked in and wrinkle-free all the livelong day.

In essence, it’s a magic shirt.

Ministry of Supply also wants to keep jobs in the U.S., so the whole production process — packaging to fabric — is done at home. The funds the startup has raised from Kickstarter will be put towards managing these costs and paying for the production of the proprietary raw materials that go into the Apollo line.

Not every Kickstarter project is lucky enough to reach its initial goal — let alone exceed that by tens of thousands of dollars — so, to keep people engaged, the team has been updating its page with video and has been setting new milestones in addition to the ones put in place at the outset.

At $75K, the team pledged to switch from XS–XXL sizing to standard collar-and-sleeve-length sizing; at $125K, they pledged to add two new colors to the mix; and if they reach $200K, they’ll add patterns, either a thin stripe or a plaid, the founders tell us.

And, if they reach $291,494 and become the highest-funded fashion-related Kickstarter project, the founders tell us that they plan to launch their backers’ names into space on a “ridiculous weather balloon.” To that end, they assured us that they have two team members with aerospace backgrounds, one of whom works for SpaceX, who will help make that happen.

When I asked what contributed to their success thus far, the founders said that it’s been important to them to bring the same intense iterative process to the development of their dress shirts that one sees when designing products for consumer electronics or the consumer Web. They’ve done dozens of iterations of Apollo, A/B testing, you name it.

As the founders themselves boast experience working for IDEO, Apple, Lululemon, and more, the focus on design and iteration isn’t surprising.

As for what’s next, the team is working on finishing a showroom in Boston, which should be completed in the next couple of months, as well as a dedicated ecommerce site. The founders have been inspired by the work of Warby Parker and Indochino, and plan to initially do most of their selling online — and through their showroom in Boston. Just like their shirts — it’s a blended approach.

But with so much interest both at home and abroad, it won’t be long before the team begins to work with retailers to distribute their dress shirts. Right now they’re planning on selling them for about $130 a pop, so their Kickstarter campaign provides a good opportunity to get in early before prices start rising.

For more, find Ministry of Supply at home here.


Pitchfork Music Festival to be streamed live on YouTube July 13-15


This post is by Evan Rodgers from The Verge - All Posts



If you’re one of the many people who can’t find the time to interrupt productive life to pack up a tent, grab a box of incense, and head to a music festival, take heart: Pitchfork has announced that it will be streaming its 2012 Music Festival live on YouTube. The event takes place July 13-15, and just like Bonnaroo earlier this month, the festival’s sets will be streamed to your presumably well-stocked, air-conditioned home free of charge. Pitchfork has released a promotional teaser featuring some of the bands in question, including Hot Chip, Vampire Weekend, and Beach House. The festival kicks off with Feist at 8:30AM CST / 7:30AM EST on Friday morning.



Thanks, Science! New Study Says CrunchBase Is An Information Treasure Trove


This post is by Gregory Ferenstein from TechCrunch



“I believe CrunchBase will gain a lot of attention from the academia soon, which is always eager for high-quality data set,” writes Guang Xiang of Carnegie Mellon University, who found that he could predict mergers and acquisitions much better using the unique business variables available in CrunchBase than the traditional databases used by academics. Thanks, Xiang; flattery will get you everywhere.

“Traditionally, people only used numeric variables/features for M&A prediction, such as ROI, etc. CrunchBase and TechCrunch provided a much richer corpus for the task,” he writes. Specifically, CrunchBase gave him data on a volume of companies roughly 43 times the normal dataset (2,300 vs. 100,000+) and access to valuable variables, such as management structure, financing, and media coverage.

For instance, “Strong financial backing is generally considered critical to the success of a company,” but traditional datasets won’t have detailed information on the management, their experience, and the funding rounds.

Even better, the news coverage itself on TechCrunch could also be a predictor of a merger or acquisition (because, well, duh, if a company’s doing well enough to make the news, there’s a good chance someone is also itching to buy it out).
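As a rough, purely illustrative sketch of how those richer variables could feed a predictor (this is not the model from Xiang’s study, and the weights are invented):

```typescript
// Hypothetical illustration only: a hand-weighted score over CrunchBase-style
// features (financing, management, media coverage). Not the study's model.

interface CompanyFeatures {
  fundingRounds: number;
  totalRaisedUsd: number;
  executiveCount: number;      // a crude proxy for management structure
  techCrunchArticles: number;  // media coverage as a signal
}

function acquisitionScore(c: CompanyFeatures): number {
  const z =
    0.4 * c.fundingRounds +
    0.3 * Math.log10(1 + c.totalRaisedUsd) +
    0.2 * c.executiveCount +
    0.5 * c.techCrunchArticles -
    6.0;
  return 1 / (1 + Math.exp(-z)); // squash to a probability-like score in (0, 1)
}

console.log(acquisitionScore({
  fundingRounds: 3,
  totalRaisedUsd: 25_000_000,
  executiveCount: 5,
  techCrunchArticles: 4,
}).toFixed(2));
```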

But, just when we were starting to blush, Xiang brought out the criticism: “Despite its large magnitude, the CrunchBase corpus is sparse with many missing attributes.” That’s because the community-created database tends to focus on more popular companies and features. That said, even with drawbacks, the researchers still achieved “good performance” with CrunchBase — which, impressively enough, has been managed all these years by superwoman Gene Teare.

M&A activity is just the tip of the iceberg, and there are all sorts of business questions that could be answered using the vast amounts of data provided by CrunchBase. So, statisticians and business analysts, go nuts. And, when you find something cool, let us know first (tips@techcrunch.com).


South Carolina passes bill restricting municipal broadband


This post is by Adi Robertson from The Verge - All Posts



South Carolina has become the latest state to rule against allowing towns to set up their own broadband networks. A bill passed last week makes it much more difficult to set up a local, city-owned ISP that provides service of 190Kbps or greater. Critics say the bill, which is now awaiting signature by the governor, is an effort by carriers like AT&T to lock out competition. In North Carolina, for example, Time Warner Cable launched a year-long legal battle to shut down a municipal network that offered faster speeds than its own Road Runner service. Some have also alleged that the bill was written and lobbied for by ALEC, a conservative think tank of which AT&T is a board member.

ALEC and others, however, say that private companies could…


Steve Would Be Proud: How Apple Won The War Against Flash


This post is by Ryan Lawler from TechCrunch



Late Thursday, an extraordinary thing happened: Adobe announced in a blog post that it would not provide Flash Player support for devices running Android 4.1, and that it would pull the plugin from the Google Play store on August 15. The retreat comes five years after the introduction of the iPhone, the device that thwarted Flash’s mobile ambitions almost before they began.

That Adobe would make such an announcement nearly five years to the day after the first iPhone went on sale is kind of funny. I’d like to think that the Flash team has a sense of humor and was well aware of the timing when it posted the blog entry, but I could also see the entry as unintentionally ironic. Either way, it caps off a five-year battle to win the mobile landscape — a war which for Adobe ended in defeat.

At the time the iPhone was announced, lack of support for Adobe Flash seemed like a glaring omission for a platform that was so hell-bent on being a portable computing device. But it wasn’t until the iPad came out, two-and-a-half years later, that the battle between Apple and Adobe, Flash vs. HTML5, and “open” vs. “proprietary” reached a fever pitch.

The iPad Effect

The iPad was announced in January 2010, but wasn’t available until that April. And when it did finally become available, people began to notice the lack of Flash, which was then the de facto standard for video playback and interactivity on the web. For the iPhone, not having Flash was a minor annoyance — after all, few other smartphones had very good Flash support at the time… But for the iPad, which in many cases was being used as a laptop replacement, at least for consumption of media, it was a big deal.

It wasn’t long before Google latched onto this and began promising an alternative to the “broken” Apple devices that wouldn’t give users access to the full web as publishers intended it to be seen. It’s tough to believe now, but at one point, Flash on mobile devices was actually considered a feature. There was Google’s Andy Rubin in April 2010, announcing that Android would have full Flash support in Froyo, the next version of the operating system to be released.

The Impact Of “Thoughts On Flash”

Battle lines were drawn, and just a few days later, Steve Jobs issued his epic missive “Thoughts on Flash,” which sought to explain, once and for all, why Apple didn’t — and wouldn’t ever — integrate Flash into its mobile and tablet devices. There were numerous reasons, and Jobs debunked the notion that Flash was “open,” as well as the claim that it delivered the full web. He also brought up the security, reliability, performance, and battery life issues that plagued devices using the plugin.

Most importantly, though, Apple didn’t want Adobe developers to create cross-platform apps that didn’t take advantage of the latest features, development libraries, and tools. Jobs wrote:

“Our motivation is simple – we want to provide the most advanced and innovative platform to our developers, and we want them to stand directly on the shoulders of this platform and create the best apps the world has ever seen. We want to continually enhance the platform so developers can create even more amazing, powerful, fun and useful applications. Everyone wins – we sell more devices because we have the best apps, developers reach a wider and wider audience and customer base, and users are continually delighted by the best and broadest selection of apps on any platform.”

It turns out Jobs was right. When Flash finally did ship on Android devices, it didn’t provide users with the full web, as was promised. Android users who wished to watch videos on Hulu through the Flash browser, for instance, were met with a message saying that the content wasn’t available on the mobile web. Same thing for users who tried to access most premium video sites on Google TV, which also supported Flash. More importantly, even when those videos or interactive Flash elements did appear on Android devices, they were often wonky or didn’t perform well, even on high-powered phones.

The end result was that users stopped seeing Flash on mobile devices as a good thing, and developers quit trying to support the framework on those devices.

The Flash Issue Isn’t Just About Mobile

But the impact of that battle goes beyond just how people view content on mobile phones. While pretty much all developers have settled on building native apps or coding for the mobile web when trying to reach those users, the battle has also had an impact on the way that developers think about multi-platform web development. Even when not building for a 4-inch screen, they’re increasingly turning to HTML5 to build new user experiences or render interactive applications, rather than writing for the Flash player.

Video might be the last industry where the Adobe Flash Player continues to have a hold on how content is displayed, but even there, a growing number of sites are moving to HTML5-based video players for delivery. YouTube and Vimeo are leading that charge, displaying their videos in an HTML5 player first, when available, and only falling back to Flash when the player isn’t supported. And many others are following that lead.
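That HTML5-first, Flash-fallback behavior boils down to simple capability detection in the browser. A minimal sketch, where the Flash-embedding helper is a hypothetical placeholder for whatever legacy player code a site actually shipped:

```typescript
// Minimal sketch of the HTML5-first, Flash-fallback pattern.
// embedFlashPlayer is a hypothetical placeholder for a legacy <object>/<embed> SWF.
declare function embedFlashPlayer(container: HTMLElement, src: string): void;

function canPlayHtml5(mimeType: string): boolean {
  const probe = document.createElement("video");
  // canPlayType returns "", "maybe", or "probably"
  return !!probe.canPlayType && probe.canPlayType(mimeType) !== "";
}

function embedVideo(container: HTMLElement, src: string): void {
  if (canPlayHtml5('video/mp4; codecs="avc1.42E01E, mp4a.40.2"')) {
    const video = document.createElement("video"); // native <video> path
    video.src = src;
    video.controls = true;
    container.appendChild(video);
  } else {
    embedFlashPlayer(container, src);              // plugin fallback path
  }
}
```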

Frankly, Flash had never been a huge business for Adobe, even when development of interactive websites using the plugin was in high demand. As time goes on, it will become an even less important part of Adobe’s business, as the company’s development tools — where it makes the bulk of its revenue — focus on catering to a developer base that is increasingly interested in building HTML5-based web applications. As more can be accomplished in-browser without a plugin, that’s good news for users and developers alike.


Microsoft’s chief liaison with third-party manufacturers leaves position


This post is by Aaron Souppouris from The Verge - All Posts



Steven Guggenheimer, Corporate Vice President and head of Microsoft’s OEM division, is leaving his position for another senior role at the company. Guggenheimer has been in charge of liaisons between Microsoft and its OEM partners for many years, and leaves just as the company has announced its first home-grown computer, the Surface. There have been reports that OEMs feel uneasy about Microsoft competing with them in the hardware market, and Acer founder Stan Shih has been quoted as saying Surface is only an effort to encourage manufacturers to produce Windows 8 and RT tablets. So far, Asus has been the only third-party manufacturer to announce a retail-bound Windows RT tablet, and HP recently announced it would not release a Windows RT…


Android 4.1 Jelly Bean ‘Liveness Check’ hopes to stop Face Unlock from being fooled by photos


This post is by Dante D'Orazio from The Verge - All Posts



Face Unlock has been a bit of a novelty ever since Google introduced it in Android 4.0, but any sense of security offered by the feature disappeared once it was revealed that it could be easily tricked by a photograph. For its part, Google has always labeled the option as “low-security” and “experimental,” but it has now taken a step to stop photo-equipped thieves with a new “Liveness Check” that requires a user to blink before granting access to the phone. The search company has taken a page out of Samsung’s book by rolling out the blink-recognition technology, but don’t let the change fool you into trusting Face Unlock just yet. We’ve seen the system let two vaguely similar-looking people through the lock screen before, so you’re…


Latest outage raises more questions about Amazon cloud


This post is by Barb Darrow from GigaOM






Massive thunderstorms notwithstanding, the fact that Amazon’s U.S. East data center went down again Friday night while other cloud services hosted in the same area kept running raises new questions about whether Amazon is suffering architectural glitches that go beyond acts of God. While most Amazon services were back up Saturday morning, the company was still working on provisioning the backlog for its ELB load balancers as of 5:31 p.m. Eastern time, according to the AWS dashboard.

This outage — the second this month — took down Netflix, Instagram, Pinterest, and Heroku, as Om previously reported. The storm was undoubtedly huge, leaving 1.3 million people in the Washington, D.C. area without power as of Saturday afternoon, but Joyent, an Amazon rival, also hosts cloud services from an Ashburn, Va. data center and experienced no outage, something its marketing people were quick to point out.

The implication is that Amazon, with all its talk about redundancy and availability, shouldn’t be having these issues if others are not.

Steve Zivanic, VP of marketing for Nirvanix, another rival cloud provider, said customers should simply stop defaulting to Amazon’s cloud. “It’s becoming rather clear that the answer for [Amazon’s] customers is not to try to master the AWS cloud and learn how to leverage multiple availability zones in an attempt to avoid the next outage but rather to look into a multi-vendor cloud strategy to ensure continuous business operations,” Zivanic said via email. “You can spend days, months and years trying to master AWS or you could simply do what large-scale IT organizations have been doing for decades — rely on more than one vendor.”

The fact that Amazon, like any other data center-dependent business, is not bulletproof also raises questions about why its customers don’t pursue a multi-cloud strategy or, if they’re going to rely solely on Amazon, why they put so much of their workload in one geography — a practice Amazon itself advises against. Of course, it isn’t good practice for any vendor to blame snafus on its customers.
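For a sense of what “rely on more than one vendor” can look like from the application side, here is a rough sketch; the provider endpoints and health-check path are hypothetical, and real deployments would more likely lean on DNS failover or load balancers than an ad-hoc probe like this.

```typescript
// Rough sketch of a multi-geography, multi-vendor fallback check.
// Endpoints and the /healthz path are invented for illustration.

const providers = [
  "https://app.us-east.provider-a.example.com",
  "https://app.us-west.provider-a.example.com", // second geography, same vendor
  "https://app.provider-b.example.com",         // second vendor entirely
];

async function firstHealthyEndpoint(): Promise<string> {
  for (const base of providers) {
    try {
      const res = await fetch(`${base}/healthz`, { signal: AbortSignal.timeout(2000) });
      if (res.ok) return base; // route traffic to the first responsive provider
    } catch {
      // unreachable or timed out; fall through to the next provider
    }
  }
  throw new Error("no healthy provider available");
}

firstHealthyEndpoint().then((endpoint) => console.log("serving from", endpoint));
```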

Presumably the tech folks at Instagram, Heroku, et al. know better. Earlier this month,  I asked Byron Sebastian, the Salesforce.com VP of platforms who runs the Heroku business, if Heroku was actively seeking other cloud platform partners. He said the company is always evaluating its options.

Twitter was awash in comments. Many wondered why Amazon’s data center did not cut over to generator power, while others, like Gartner analyst Lydia Leong, preferred to wait and see what part Amazon’s data center operator played in this mess.

Reached for comment Saturday afternoon, an Amazon spokeswoman reiterated that the storm caused Amazon to lose primary and backup generator power to an availability zone in its east region overnight and that service had been mostly restored. She said the company would share more details in the coming days.






Where Are All The iPad Shopping Apps?


This post is by Justin Kan from TechCrunch



For a tech company founder in San Francisco, I’m a terribly late adopter of new technology. My buddy in med school had a smart phone before I did. The iPhone was out for a year before I bought the 3G. The iPad? I’m embarrassed to admit, I got my first one a month ago.

I held out on the iPad because I didn’t get it. It didn’t have a Retina display, and compared with the iPhone 4’s screen, it just seemed… pixelated. My friends who had the original version bought them as a novelty, which quickly seemed to wear off. I didn’t know what I would do with one once I had one.

So, when I finally buckled and got the iPad 3, I came to the realization that the rest of the world had come to over two years ago: the iPad is an amazing consumption device. You don’t need a keyboard, because if you’re doing any work at all it will be to send iPhone-length one-liner emails. Most of what you’ll be doing on the iPad is playing games, watching videos, and shopping.

There’s a plethora of iPad games, and you can download almost any movie or TV show from iTunes, but the shopping experience leaves a lot to be desired. When I first turned on the iPad, I went through and downloaded all the popular apps I recognized. In the shopping / ecommerce category, this was Gilt and Fab.

Both of these companies have amazing iPad experiences. For a while, I was browsing them every day; not because I actually needed to buy anything, but because I enjoyed the virtual window shopping experience of browsing through amazing photos of cool looking products. As any retailer knows, getting people in the store is half the battle, and pretty soon I was back to buying things off Gilt (when I had previously sworn off of it after their fulfillment sent me the wrong thing on multiple orders).

Inspired to find some shopping apps that weren’t flash-sale sites, I went looking, but I simply couldn’t find any decent ones. All the apps for department stores and brands seemed like screenshots of their websites. In most of them, I couldn’t even purchase anything.

The ecommerce experience for iPad has been dominated by the deals sites because the deals sites are the only retailers heavily innovating on the technology side. That doesn’t have to be the case. The thing that makes a Gilt or Fab iPad app stand out is that they are extremely polished and conducive to casual browsing, which leads to serendipitous discovery and purchase. Also, they have a great excuse to bring you back in their “store” with a push notification every day — they have a new batch of inventory for you to check out.

There are a couple of other reasons iPads are natural platforms for ecommerce. On iPads, shoppers are in a different state of mind (they are relaxing instead of being distracted with work or IM), and are more likely to make impulse purchases. Also, because the cost of opening new tabs in Safari or switching between apps is higher than in a desktop browser, a well-designed app can keep users engaged for much longer than they would be on the web.

I think the next generation of ecommerce apps for iPad will focus less on the discounting and more on creating an amazing curated browsing experience. Recently, I got a preview build of an app called Monogram by founder Leo Chen (whom I’m now advising), which does exactly that: it curates collections of clothing from around the web, bringing the user a personalized boutique that updates every day with new outfit suggestions. Like Gilt, the app’s emphasis is on browsing and discovery. When I’m using apps like Monogram and Gilt, I find myself spending more time and browsing/buying more products than I ever do on the web. Apparently I’m not the only one.

A couple things I think this next generation of apps will have to figure out:

  1. Some way of differentiating their product inventory. Some will be vertically integrated companies that are bringing their own designs to market, like Everlane or Warby Parker. Others will focus on curation of existing products. I personally have been waiting for a store that curates the very best item I can own in every category, and tells me why it is the best.
  2. A great offline experience. Few companies in the ecommerce space have focused on innovating on what happens after you check out with your shopping cart, and they all happen to be owned by Amazon (Amazon, Quidsi, Zappos). I believe there’s a lot of room to innovate in how products are packaged and delivered, and not many people are doing that at the moment.

Right now the iPad is like an entire country of 60 million consumers with only a few stores competing for their purchases. The denizens of iPadlandia are waiting to buy your awesome stuff. Why are you not letting them?


#waywire, a video news network for Gen Y, gets backing from Eric Schmidt and Oprah


This post is by Jennifer Van Grove from VentureBeat






Raised on the Internet and versed in social media, today’s youngsters view the world in a different light than generations past. Here to serve and inspire these influential minds is #waywire, a New York-based startup that’s landed $1.75 million in funding, which includes celebrity support from Oprah, for its mission.

#waywire, which launches in beta later this summer, is a video news network targeted at Generation Y that will feature original, syndicated, and member-created content. The vision of Newark, New Jersey Mayor and tech figurehead Cory Booker, #waywire aims to provide today’s youth with a social platform that informs and encourages participation in positive debates with peers.

First Round Capital and Eric Schmidt’s Innovation Endeavors led the startup’s first round of funding. Oprah Winfrey, LinkedIn CEO Jeff Weiner, and Lady Gaga manager Troy Carter have also ponied up to participate in the round.

“The idea behind #waywire came from Mayor Booker and grew from his desire to shift America’s public conversation away from divisiveness toward a debate focused on achieving solutions. The Mayor plans to contribute original content to the #waywire network where he will discuss America’s most significant challenges with a variety of thought leaders from diverse backgrounds,” according to a press release.

Booker, a co-founder and part owner of #waywire, will serve only as an advisor to the company while he remains in office.



Google Currents to become a preinstalled app in Jelly Bean


This post is by Dante D'Orazio from The Verge - All Posts



It looks like Google Currents is about to get promoted to much-vaunted “bundled app” status. That’s right, the magazine-style Flipboard competitor is joining the ranks of Gmail, YouTube, Talk, and Maps in Android 4.1 Jelly Bean. That means the app will be preinstalled on new devices like the Nexus 7 and Galaxy Nexus, and it will also be included in the OTA Jelly Bean update that’s set to arrive on the Galaxy Nexus, Nexus S, and Motorola Xoom in mid-July. According to Android Central, Google sent out an email to content partners informing them that the service’s user base was set to increase drastically due to the change. If you’re looking for something to do with the soon-to-be-bundled app, you’ll be glad to hear that you can find T…


Careful, Twitter — remember what happened to MySpace and Digg


This post is by Mathew Ingram from GigaOM






Twitter sent some shock waves through the technology community with a blog post on Friday that talked about its plans for the future, and suggested that those plans don’t necessarily involve third-party services and apps. Although the company phrased its statement as a move designed to standardize the experience for Twitter users, developers and others in the broader Twitter ecosystem clearly took the post as a warning shot across the bow — especially since the company simultaneously shut down a cross-posting partnership it had with LinkedIn. It seems clear that Twitter wants to control the network as tightly as possible so that it can monetize it more easily, but doing so also comes with substantial risks.

In his blog post, consumer product manager Michael Sippey talked a lot about the introduction of features such as “expanded tweets,” which show more information from providers like GigaOM and the New York Times when a link is included in a tweet. He said the company wants to broaden that program to more publishers, as well as give them tools to display expanded tweets and other features on their sites — but he also made it obvious that developers who stray outside of the lines are taking a big risk:

[W]e’ve already begun to more thoroughly enforce our Developer Rules of the Road with partners, for example with branding, and in the coming weeks, we will be introducing stricter guidelines around how the Twitter API is used.

Twitter has burned the ecosystem before

These comments set off warning bells for a number of developers, who said they were concerned that Twitter was going to crack down on any third-party app or service. One developer on Hacker News said that in his view, Twitter was trying to shut down third-party services so that it could “inflict a homogenized, boring, monoculture on their user base [that] they can monetize, which will make the experience progressively worse.”

This isn’t the first time that Twitter has upset the developer community by throwing its weight around. In 2011, there was widespread criticism of the service for the way it issued new rules around use of the Twitter API — and also the way it behaved towards those who crossed the line by shutting off their access without even a warning, as it did in the case of entrepreneur Bill Gross and his Ubermedia network. At the time, one critic accused the company of “nuking” the Twitter ecosystem.

The company also came under fire in 2010 for the way it handled relations with third-party developers after it bought an app called Tweetie. Hunch founder Chris Dixon said Twitter was “acting like a drunk guy with an Uzi” by telling developers not to bother developing Twitter apps, and a number of companies and investors that had been putting money and time into the Twitter ecosystem stopped doing so. So some of the negative reaction to Sippey’s post stems from being burned twice already.

Some observers have argued that Twitter is just doing what it has to do in order to control its network and build a sustainable business, and that third-party developers don’t have any right to expect favorable treatment, since they are piggybacking on its API and resources. Longtime Twitter users, however, say the service’s behavior is a betrayal of all of the other services and apps that helped generate most of the goodwill it is now busy monetizing. As John Abell of Reuters pointed out on Friday, much of the value that users find in Twitter comes from the way it connects to other services.

Anti-user moves torpedoed both MySpace and Digg

And there is a very real risk to this kind of aggressive focus on control and monetization, as a commenter on Hacker News pointed out: restricting the ways that users can access and display their tweets, whether through strict API rules or moves like the LinkedIn shutdown, could irritate the user base that Twitter is relying on to click ads and do all the other things it is planning around monetization. Ultimately, the company could ruin the experience that made Twitter so compelling in the first place, in the same way that MySpace and Digg did.

There are plenty of reasons why MySpace failed, including the conflicting desires of a giant corporate owner like News Corp., but it also started to hemorrhage users because it focused more on monetization through ads and other elements than it did on maintaining a good experience for users. Digg did something similar — in an attempt to build a bigger company and leverage its user base for profit, it added a whole range of “services” and features that were designed mainly to appeal to corporate customers and advertisers. The end result was a wholesale desertion of Digg for other communities like Reddit.

Twitter has a tiger by the tail — it has an active user base in the hundreds of millions, it has become an almost indispensable tool for both news junkies and the media (although this carries risks as well) and it is starting to see some favorable responses to its ad model. But it is also a community, where the users provide the vast majority of the content that is being monetized, and while screwing around with that relationship may appear to make short-term financial sense, it could end in disaster.






The enterprise needs a better network to the cloud


This post is by Rick Dodd, Ciena Corp. from GigaOM






While much of the networking industry today is focused on improving speeds and feeds inside the data center, we need to recognize the importance of improving the networks that connect enterprise data centers to each other, and to the public cloud. If the industry can deliver an elastic network with programmable performance, then the walls between data centers could effectively disappear.

Trying to overlay cloud services on the same pipe being used for best-effort internet is going to disappoint users, and limit cloud service adoption. Specifically, we need to add speed and intelligence to these networks, and several factors are driving this requirement. For example:

  • Virtual machine (VM) transfers between data centers are increasingly common
  • Virtual storage is no longer isolated to a single data center
  • An increasing number of mission critical enterprise applications being deployed on VMs are moving to the cloud, driving the need for carrier-class network security and performance for workload balancing and reliability.
  • Adoption of cloud-based infrastructure (IaaS) for workload mobility, collaboration and availability is creating more complicated topology deployments and opportunities for software defined networking.

Let’s imagine a company with a 200 Mbps data connection to the world, which needs to make a server platform change. To do so without shutting down the business in the process, IT staff would like to temporarily move the applications on this server to the public cloud. Let’s assume the total data to transfer would be about 10 terabytes to make this migration happen. However, transferring 10 TB of data over a typical 200 Mbps network connection would take nearly a week, assuming no re-transmissions and 80 percent bandwidth utilization. Clearly this company is not going to be able to run this simple workload job over this network service.

This issue is quite debilitating for IT organizations, and is something service providers like Verizon have been hearing about from their enterprise customers. In fact, the company just opened a new innovation center dedicated to finding solutions that improve the integration of networks and data center infrastructure.

To make this work, enterprises require new cloud network connectivity options for efficient operations—an intelligent Network-as-a-Service model that uses software defined networking to dynamically provide performance as dictated by the application. In the cloud world, demands on capacity and connectivity are fluid — entirely dependent on businesses’ specific requirements at any given time. The network supporting this environment needs to be as elastic, programmable and, in a sense, “virtualized,” as storage and servers are today.

Using the example above for a 10 TB data transfer, an intelligent network could more easily accommodate these workloads by dynamically expanding to 5 Gbps and completing the job in less than 5 hours without requiring the VM applications to go offline. When the job is done, the network would immediately return to standard levels so that the premium bandwidth is billed only as used.
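The back-of-the-envelope math behind both figures is easy to check, assuming 10 TB means 10 × 10^12 bytes and ignoring protocol overhead:

```typescript
// Back-of-the-envelope check of the transfer times above.
// Assumes 10 TB = 10 * 10^12 bytes and ignores protocol overhead.

function transferHours(bytes: number, linkBitsPerSec: number, utilization = 1.0): number {
  return (bytes * 8) / (linkBitsPerSec * utilization) / 3600;
}

const tenTerabytes = 10e12; // bytes

// 200 Mbps at 80% utilization: ~139 hours, i.e. nearly six days ("nearly a week")
console.log(transferHours(tenTerabytes, 200e6, 0.8).toFixed(0));

// A 5 Gbps burst at full rate: ~4.4 hours ("less than 5 hours")
console.log(transferHours(tenTerabytes, 5e9).toFixed(1));
```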

The fictional company finds this use of the premium network service worthwhile, quite simply, because it makes using the provider cloud practical for this particular workload. On-premise data center capital and operational cost becomes avoidable, replaceable with a time-limited — and thus net-smaller — IaaS “rental” expense.

Enterprises and service providers both benefit. Enterprises minimize permanent data center-related costs and reduce return-on-investment risks, while providers attract more workload and demand to their cloud services, which boosts their revenues. More network capacity, delivered more affordably and only when needed, is essential to creating these benefits.

The ability to respond to varying workload demands with performance generated on demand is a key benefit of an intelligent network for the cloud. In addition to dynamic bandwidth, this network must have higher availability, lower latency and greater reliability, as it would be designed for critical infrastructure services. Programmable interfaces into an open cloud networking framework might also be used to adjust for policies, authentication or network events.
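What such a programmable interface might look like to an application is sketched below. The endpoint, request fields, and authentication are entirely hypothetical, since no standard bandwidth-on-demand API is named here; real providers would expose their own schemas.

```typescript
// Hypothetical bandwidth-on-demand request to a Network-as-a-Service API.
// The endpoint, fields, and token are invented for illustration only.

interface BurstRequest {
  circuitId: string;
  bandwidthMbps: number;   // temporary rate, e.g. 5000 for a 5 Gbps burst
  durationMinutes: number; // drop back to the baseline rate afterwards
}

async function requestBurst(req: BurstRequest): Promise<void> {
  const res = await fetch("https://naas.example.net/v1/circuits/burst", {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: "Bearer <token>" },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`burst request failed: ${res.status}`);
}

// Ask for 5 Gbps for five hours to cover the 10 TB migration, billed only as used.
requestBurst({ circuitId: "dc1-to-cloud", bandwidthMbps: 5000, durationMinutes: 300 })
  .catch((err) => console.error(err));
```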

This open, programmable networking model can be implemented as a cloud backbone or as a fully integrated cloud and network operation. A single vendor could provide a dynamic packet transport core and data center endpoints. Or multi-vendor switching and transport equipment can be used in the core with data center connect performance optimization and cloud operations at the end points.

Many vendors are adopting an open network philosophy and looking to implement interoperability by using new open protocol standards and application programming interfaces (APIs) with a virtualized network.

With this sort of network, the IT manager will have the freedom to consider resources outside the physical walls of his or her building to be natural extensions of an owned data center. In effect, that IT manager would now have a “data center without walls” that provides the same user experience as a completely dedicated data center, but on a partially rented, and thus more economical, basis.

Rick Dodd is vice president of marketing at Ciena Corp.





US Cellular to carry Windows Phone 8 devices


This post is by Justin Rubio from The Verge - All Posts



Regional carrier US Cellular will be joining AT&T, Verizon, and T-Mobile as a provider of Windows Phone 8 devices. The carrier told PC Mag that “We believe in giving our customers the latest technology and device options and we are excited about the Windows 8 phone. It will be an important part of our device lineup going forward.” Along with Verizon, US Cellular represents increased support for the platform from CDMA-based carriers, which will hopefully lead to a wider selection of compatible handsets in the future.


NFC Is Great, But Mobile Payments Solve A Problem That Doesn’t Exist


This post is by Jordan Crook from TechCrunch



For the past few years, we’ve been told over and over again that NFC will eventually replace the common wallet. And yes, NFC is a great technology. Parts of Europe and China are using it for public transport transactions, and the sharing of content between devices is incredibly cool (just check out this commercial). And moreover, the ability to ditch all of your loyalty cards and (potentially) combine them in one place, Passbook-style, would be highly convenient. But where mobile payments are concerned, there is no problem to be solved.

Let’s just start with the small stuff. For one, the motion itself should be no different. It’s not like contactless payments via mobile is a more physically efficient form of living and transacting. You grab your credit card out of your wallet in your pocket, and swipe it through the reader (or in some cases tap it, just like the phone). In the case of NFC, you grab your phone out of your pocket, open Google Wallet (or whatever), and tap it to the reader. It’s the same exact motion.

But that doesn’t even matter when we start to consider the real obstacles for NFC mobile payments. There are two issues. The smaller one is that, along with not being any faster or easier physically, no one is actually getting rid of their wallet. For one, everyone needs an ID, and an ID isn’t safe in a pocket or loose in a bag. So, until I can use my phone as a form of identification at the airport, with the police, or at a doctor’s appointment, my wallet will still remain. And it’s fair to assume that at least some people prefer to have a little cash on them, just in case.

I took a quick Twitter poll using PopTip (a newly launched TechStars company), and it turns out that the few respondents I had mostly feel comfortable without any cash. But, I also assume that the majority of my Twitter followers are generally tech-savvy early adopters, so I still stand behind the fact that you’ll continue carrying a wallet, or some other carrier of small, valuable pieces of paper like insurance cards, IDs, etc.

Moreover, all merchants would need to be set up for NFC transactions to allow the consumer to ditch their wallet, not just forward-thinking giants like American Eagle, Macy’s and OfficeMax. It’s not like consumers will stop shopping at non-NFC merchants just because they aren’t set up — paying with a credit card is just as easy, so why even go through the trouble of setting up Google Wallet? Google Offers is a nice incentive, but it isn’t enough to sway all consumers, and it certainly isn’t attractive enough to woo merchants.

In essence, the only true value given to the consumer is the fact that it’s “cool.”

And then the problems intensify when we visit the merchant side of things. There is no benefit to merchants to implement these systems. Sure, Google and Isis can try to convince these SMBs that NFC is the future, but in reality it’s only an added cost to overhaul the system. Even at a minimal cost, the only value is a slight increase in efficiency pushing customers through POS. Companies could potentially market through their POS using NFC, as is the case with Google Offers, though I’m not sure this is welcome on either side. As Mirth so gracefully stated at Disrupt, merchants aren’t quite as enthusiastic about deals services as consumers are.

This comment thread on LoopInsight says it well:

There’s no tangible, proven way to get any return on investment for the implementation. So why do it?

Credit cards are ubiquitous. Credit cards are fast and easy. Almost all merchants have the ability to process payments via credit card. So why? Why are we solving a problem that doesn’t exist?

And even if there is some added benefit, most research predicts that the ubiquity of mobile payments via NFC is between five and ten years away. That’s more than enough time for another disruptive payments solution, likely something that doesn’t require a complete merchant systems overhaul, to supplant NFC before it ever hits its stride.

Again, NFC is an incredibly useful technology. In fact, the social media implications of NFC ubiquity in mobile devices (not at POS) are kind of mind-boggling. Just look at these TagStand figures, and pair them with Google’s recent announcement of 1 million NFC Android devices shipped every week, and then imagine Facebook and Twitter bigger than they’ve ever been before. That is the future of NFC.

Very soon, we’ll be using it in all kinds of interesting and productive ways. I just don’t think mobile payments is one of them.