Jakarta – Yahoo! management announced that the company has acquired AdMovate, a startup in the advertising business.
On the company’s official website on Wednesday, July 17, 2013, AdMovate’s management said, “We started the company last year to make it easier for advertisers to create highly personalized offers via the mobile platform.”
Management said the company has worked to ensure that advertisers can reach consumers through personalized, well-matched ads, and that Yahoo! has a team with a similar vision.
“Yahoo! is willing to invest at massive scale,” management said on the site. AdMovate hopes the deal will help it realize its dream of building an effective mobile advertising service.
In a statement, the management of Yahoo! said, “This acquisition is part of our efforts to further develop our advertising technologies such as Apt, Genome, and Right Media.”
Yahoo! hopes that advertisers will be able to advertise more easily and that advertising agencies will be able to connect more easily. The technology developed by AdMovate is expected to accelerate this development.
This deal adds to the list of acquisitions Yahoo! has made. In May, Yahoo! bought the social networking site Tumblr for US$1.1 billion, which analysts say is the largest acquisition in the company’s history.
Buying Tumblr advances Yahoo! CEO Marissa Mayer’s goal of expanding the company’s share of smartphone and tablet users. Tumblr presents a constantly changing stream of posts, photos, and other digital content contributed by users, who can now connect through its mobile applications. The service is also one of the most popular websites among teenagers and young adults, which makes it a good fit for the direction Mayer has set for Yahoo!
Intel Aims to “Re-Architect” Datacenters to Meet Demand for New Services
- Reveals new details of the forthcoming 22nm Intel® Atom™ processor C2000 product family, enabling the company to target a larger portion of the datacenter market.
- Unveils future roadmap of 14nm datacenter products including a system-on-chip (SoC) that for the first time will incorporate Intel’s next-generation Broadwell architecture to address an even broader range of workloads.
- Rackspace Hosting* announces that it will deploy a new generation of rack designs as part of its hybrid cloud solutions aligned with Intel’s Rack Scale Architecture vision.
As the massive growth of information technology services places increasing demand on the datacenter, Intel Corporation today outlined its strategy to re-architect the underlying infrastructure, allowing companies and end-users to benefit from an increasingly services-oriented, mobile world.
The company also announced additional details about its next-generation Intel® Atom™ processor C2000 product family (codenamed “Avoton” and “Rangeley”), as well as outlined its roadmap of next-generation 14nm products for 2014 and beyond. This robust pipeline of current and future products and technologies will allow Intel to expand into new segments of the datacenter that look to transition from proprietary designs to more open, standards-based compute models.
“Datacenters are entering a new era of rapid service delivery,” said Diane Bryant, senior vice president and general manager of the Datacenter and Connected Systems Group at Intel. “Across network, storage and servers we continue to see significant opportunities for growth. In many cases, it requires a new approach to deliver the scale and efficiency required, and today we are unveiling the near and long-term actions to enable this transformation.”
As more mobile devices connect to the Internet, cloud-based software and applications get smarter by learning from the billions of people and machines using them, resulting in a new era of context-rich experiences and services. This also results in a massive number of network connections and a continuous stream of real-time, unstructured data. New challenges for networks, computing and storage are emerging as the growing volume of data is transported, collected, aggregated and analyzed in datacenters. As a result, datacenters must be more agile and service-driven than ever before, and easier to manage and operate.
The role of information technology has evolved from being a way to reduce costs and increase corporate productivity to becoming the means to deliver new services to businesses and consumers. For example, Disney* recently started providing visitors with wirelessly connected-wristbands to enhance customers’ in-park experience through real-time data analytics. Additionally, a smart traffic safety program from Bocom* in China seeks to identify traffic patterns in a city of ten million people and intelligently offers better routing options for vehicles on the road.
‘Re-Architecting’ Network, Storage and Servers
To help companies prepare for the next generation of datacenters, Intel revealed its plans to virtualize the network, enable smart storage solutions and invest in innovative rack optimized architectures.
Bryant highlighted Intel’s Rack Scale Architecture (RSA), an advanced design that promises to dramatically increase the utilization and flexibility of the datacenter to deliver new services. Rackspace Hosting*, an open cloud company, today announced the deployment of new server racks that are a step toward reaching Intel’s RSA vision, powered by Intel® Xeon® processors and Intel Ethernet controllers with storage accelerated by Intel Solid State Drives. The Rackspace design is the first commercial rack scale implementation.
The networking industry is on the verge of a transition similar to what the server segment experienced years ago. Equipping the network with open, general purpose processing capabilities provides a way to maximize network bandwidth, significantly reduce cost and provide the flexibility to offer new services. For example, with a virtualized software defined network, the time to provision a new service can be reduced to just minutes from two to three weeks with traditional networks. Intel introduced Open Network Platform reference designs to help OEMs build and deploy this new generation of networks.
Data growth is a challenge to all datacenters and transferring this large volume of data for processing within a traditional, rigid storage architecture is costly and time consuming. By implementing intelligent storage technologies and tools, Intel is helping to reduce the amount of data that needs to be stored, and is improving how data is used for new services.
Traditional servers are also evolving. To meet the diverse needs of datacenter operators who deploy everything from compute intensive database applications to consumer facing Web services that benefit from smaller, more energy-efficient processing, Intel outlined its plan to optimize workloads, including customized CPU and SoC configurations.
As part of its strategy, Intel revealed new details of the forthcoming Intel® Atom™ processor C2000 product family aimed at low-energy, high-density microservers and storage (codenamed “Avoton”) and network devices (codenamed “Rangeley”). This second generation of Intel’s 64-bit SoCs is expected to become available later this year and will be based on the company’s 22nm process technology and the innovative Silvermont microarchitecture. It will feature up to eight cores with integrated Ethernet and support for up to 64GB of memory.
The new products are expected to deliver up to four times the energy efficiency and up to seven times the performance of the first-generation Intel Atom processor-based server SoCs introduced in December last year. Intel has been sampling the new Intel Atom processor server product family to customers since April and has already more than doubled the number of system designs compared to the previous generation.
Roadmap for Expansion
The move to services-oriented datacenters presents considerable opportunities for Intel to expand into new segments. To help bolster the underlying technologies that power much of the next generation of datacenters, Intel outlined its roadmap of next-generation products based on its forthcoming 14nm process technology scheduled for 2014 and beyond. These products are aimed at microservers, storage and network devices and will offer an even broader set of low-power, high-density solutions for Web-scale applications and services.
The future products include the next generation of the Intel Xeon processor E3 family (codenamed “Broadwell”) built for processor- and graphics-centric workloads such as online gaming and media transcoding. They also include the next generation of Intel Atom processor SoCs (codenamed “Denverton”) that will enable even higher density deployments for datacenter operators. Intel also disclosed an addition to its future roadmap – a new SoC designed from the ground up for the datacenter based on Intel’s next-generation Broadwell microarchitecture, which follows today’s industry-leading Haswell microarchitecture. This SoC will offer higher levels of performance in high-density, extremely energy-efficient systems that datacenter operators will expect in this increasingly services-oriented, mobile world.
Q&A: Microsoft Talks Changes to SkyDrive in Windows 8.1
When was the last time you had to delete a bunch of photos or apps on your mobile device to clear out space? With the massive amount of data generated every day, it’s easy to exhaust all the available storage on your phone or tablet.
And this problem is only getting worse. Industry trends suggest that device storage capacities are growing at 25 percent per year, but the amount of data being produced is increasing even faster — by around 50 percent a year, according to Microsoft. The software giant is looking to address this problem with SkyDrive, which will be updated in Windows 8.1 with the goal of giving you access to your files at all times, without taking up all your available storage or Internet bandwidth.
The updated service utilizes what Microsoft refers to as “placeholder files,” which look and feel like normal folders and files with one major change — you don’t download the full file until you access it. The placeholder file contains just a thumbnail image and some basic properties, making it significantly smaller than the full file. This means that 100GB of files in SkyDrive will use up less than 5GB of storage on the hard drive of your Windows 8.1 device, according to Microsoft’s Mona Akmal.
“I have a Pictures folder in SkyDrive that’s 5.6GB in size but it’s only taking up 185MB on the local disk,” Akmal wrote.
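To make the idea concrete, here is a minimal Python sketch of how a placeholder file might behave. All names here (`CloudStore`, `Placeholder`, `open`) are illustrative inventions, not Microsoft’s actual implementation: only a thumbnail and metadata live on disk until the file is first opened, at which point the full content is fetched.

```python
class CloudStore:
    """Stands in for the SkyDrive service; maps names to full file contents."""
    def __init__(self, files):
        self._files = files

    def download(self, name):
        return self._files[name]


class Placeholder:
    """A local stub: thumbnail + metadata now, full content on first access."""
    def __init__(self, name, size, thumbnail, store):
        self.name = name
        self.size = size            # real size, shown to the user
        self.thumbnail = thumbnail  # small preview kept on disk
        self._store = store
        self._content = None        # not downloaded yet

    @property
    def local_bytes(self):
        # Before hydration only the thumbnail occupies local storage.
        return len(self.thumbnail) if self._content is None else self.size

    def open(self):
        if self._content is None:   # hydrate on first access
            self._content = self._store.download(self.name)
        return self._content


# Mirrors Akmal's example: a 5.6GB folder taking ~185MB locally.
store = CloudStore({"beach.jpg": b"x" * 5_600_000})
ph = Placeholder("beach.jpg", 5_600_000, b"t" * 185_000, store)
print(ph.local_bytes)   # only the thumbnail is on disk so far
data = ph.open()        # the full download happens here
print(ph.local_bytes)
```

The design choice this illustrates is that listing, searching, and previewing files never require the full payload, which is why the local footprint stays small.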
Another major change to SkyDrive in Windows 8.1 deals with offline access to files. With the SkyDrive app, you’ll now be able to mark any folders or files you want to remain available when you lose Internet connectivity.
Any edits you make to a file while offline will automatically be synced back up to SkyDrive when you regain a connection. For added convenience, all the files you open or edit on your device will automatically be marked for offline access.
As a reminder, new SkyDrive users get 7GB of storage for free. After that, an additional 20GB costs $10 per year, while 50GB will set you back $25 a year, and 100GB costs $50 a year.
We sat down with Angus Logan, group product marketing manager for SkyDrive (pictured below), last week to get the scoop on the most important changes to the online storage service in Windows 8.1.
Facebook Hashtags Not Catching on With Consumers
While using hashtags in Facebook posts might be a fun tactic for brands trying to engage consumers, it doesn’t appear to be paying off, a new study finds.
Research from social media analytics firm Simply Measured revealed that while 20 percent of Facebook posts among top brands now include hashtags (which give users a way to group messages of similar content), there is no evidence that hashtags are influencing engagement.
The study shows that posts with hashtags — a feature added within the last several months — perform no better than those without, suggesting that people are not yet discovering brand posts by their tags.
Overall, the study shows nearly all of the companies in the Interbrand 100 — which ranks businesses based on financial status — now have a Facebook fan page, with 60 percent posting something at least once a day.
The research revealed that visual content is by far the primary driver for engagement on Facebook. Photos posted by top brands average more than 9,400 engagements, which includes likes, comments and shares, per post, while video posts average more than 2,500.
When it comes to text posts, brands must walk a fine line. Analysis of more than 500 status updates from the top brands shows that the longer a status update is, the less engagement it typically receives. However, if a status update is too short — less than 50 characters — it may not be long enough to capture viewers’ attention or provide the necessary context to drive the number of likes, shares and comments a brand would like.
“For most brands, Facebook is no longer just a network; it has become the hub of their social marketing efforts and one of the most effective ways to engage with fans,” said Adam Schoenfeld, CEO of Simply Measured. “This latest research once again proves that knowing your audience, understanding your content assets and measuring your efforts are extremely important to develop the social strategies that will work best for you.”
Businesses that limit Facebook fans from writing on their page might want to reconsider their strategy. The research shows that nearly 30 percent of top brands do not allow users to post on their wall. For those brands, user engagement on their page is limited to likes, comments and shares, resulting in 15 percent less engagement than brands that do allow user posts.
When it comes to drawing the most Facebook fans, no one does it better than Facebook itself. The social media giant claims the top spot with 93 million fans, followed by Coca-Cola and MTV.
Google Selects HE-AAC Multichannel From Fraunhofer IIS For Google Play Movies in Surround Sound
Google and Fraunhofer IIS deliver the first movies with a true 5.1-channel surround sound experience from Google Play. The immersive sound quality consumers have come to expect from TV, Blu-ray disc or DVD is now available with movies streamed or downloaded from Google Play directly to their Android devices running Android 4.1 or later. Google chose HE-AAC Multichannel as Android’s only surround sound codec due to its open-standard nature and excellent bit-rate efficiency.
When connected to a surround sound system and TV with an HDMI cable, Android users will be able to play high quality audio and video from their smartphones and tablets in surround. On the go, Android devices will play movies in great stereo quality and selected Nexus products will also include the Fraunhofer Cingo virtual surround rendering technology, which will play movies in realistic surround sound on earphones or tablet stereo speakers.
Android’s HE-AAC Multichannel implementation includes full support for loudness and downmix metadata commonly known from the broadcast TV world, as well as other features that allow the sound to be tailored for an optimum user experience in any listening mode and environment.
“Google Play movies in 5.1 HE-AAC Multichannel sound are the first realization of our vision of bringing true theatrical surround sound to mobile devices,” said Robert Bleidt, Division General Manager at Fraunhofer USA Digital Media Technologies. “The Google and Fraunhofer partnership creates a tremendous value for consumers by offering one format that delivers a high quality experience both in-home and while mobile. Consumers may experience surround sound over headphones while on their way home from work, and finish the movie in true, exciting surround in their living room,” he added.
HE-AAC Multichannel has been part of the Fraunhofer FDK AAC codec library for Android since version 4.1 and is a required feature of all Android-compatible devices. This software makes open-source Fraunhofer implementations of the MPEG audio codecs AAC, HE-AAC, HE-AACv2, and AAC-ELD available to the Android community.
HE-AAC is today’s most efficient high-quality surround and stereo audio codec deployed in over 5 billion devices and used in TV, radio, and streaming services worldwide. The codec is natively integrated into most operating systems, streaming platforms and consumer electronics devices. In addition to its unique coding efficiency, HE-AAC has the dynamic ability to change audio bit-rates seamlessly in order to adapt to changing network conditions as consumers stream content to a variety of devices. It can be used with any adaptive streaming technology including MPEG-DASH, Apple HLS, Adobe HDS and Microsoft Smooth Streaming.
Microsoft Internet Explorer Pushes Beyond Second Screen To Companion Web
“We’re at a tipping point with connected devices,” a recent blog post from Microsoft’s Internet Explorer team reads. “Every day, 3.6 million mobile devices and tablets are activated worldwide. That’s over five times more than the number of babies born each day!” They’ve got a point, but it is a sad irony for Microsoft that so few of those mobile devices run their software.
But Microsoft has sold more than 70 million Xbox 360s and has a very TV-centric followup, the Xbox One, coming in November. As Forbes.com contributor Tristan Louis points out in today’s post on Smarter TVs, ”the upcoming battle for the living room is a chance to redeem itself and turn its fortune around.” The parody video that Louis refers to shows all of the instances of the words “TV,” “television,” “sports” and “Call of Duty” in the launch announcement. Although the announcement raised the ire of hard core gamers, the emphasis on TV (and perhaps the two things TVs are most used for, watching sports and playing Call of Duty) must have been highly intentional.
Games have been Microsoft’s route into the living room, but that strong association is now an impediment to its more generalized assault on the living room. Non-gamers are probably thinking more about a future Apple TV than about the Xbox as their upgrade path to interactive TV. In response to this perception, Microsoft has launched a new program called “Companion Web.” The idea is to facilitate real-time interactions between different devices. And because Microsoft has no footprint to speak of in the world of mobile, it is now trying to emerge as a unifying force between iOS and Android.
The problem Microsoft is trying to solve (other than the risk of their own irrelevance) is that “the majority of sites on the web are built for only one device at a time.” The user can search for related information to what they are watching on their TV, for instance, but real time it ain’t. And content owners can make second screen experiences, but they have tended to be operating system (and sometimes even device) specific. Microsoft is after a more generalized solution that does not impose an unmanageable burden on developers.
“Regardless of who makes the device or software that powers the device, the Companion Web enables the internet to bridge the gap between these devices,” the IE blog post reads. “For developers, Companion Web represents an opportunity to reuse code that works across multiple scenarios, enabling greater reach and ways to engage an audience. For consumers, Companion Web means you’ll seamlessly move from one device to the next, interacting with your photos, videos, music, movies, television shows, files, and more.”
Companion Web would seem to be a more generalized version of the Xbox SmartGlass, which also allowed you to interact with your TV via Windows devices and select iOS and Android devices, but only on very specific games and content. The promise of the Companion Web is of a much broader range of experiences that the user could have between devices.
So far, Microsoft has released three such “Companion Web experiences” working with outside developers. I became aware of the program through Luke Wroblewski, who has created a version of his Polar app that works in this companion manner with Internet Explorer. As you can see in the video below, Polar uses IE’s snap mode to assign a “sidebar” portion of the screen (in this case a Surface tablet acts as a proxy for a Windows 8/Xbox One enabled TV) to itself while the user uses the balance of the screen to watch Futurama.
Wroblewski demonstrates the ways that you can find polls with Polar about Futurama and watch the results update in real time while you are watching the show. You can imagine something like this being a lot of fun for big live TV events like the Oscars or the Super Bowl, where the amount of real time activity would be high and seeing how other people are reacting becomes part of the entertainment. Similarly, you can make up your own hashtags for polls in Polar so that the reactions you are monitoring are only a select group of people. Either way, mass or niche, the real time linkage with the content on the big screen really extends the idea of the Polar app by making these interactions available to a room full of people—each potentially interacting with their own mobile devices.
And it is important to note (since this is IE, after all, that we are talking about) that this all uses standard open web technology. Specifically, Wroblewski tells me, Companion Web uses web sockets to create the real-time connections between devices. He says, “you can make a connection between pretty much any two ‘modern’ Web browsers regardless of device.” One of the other really interesting things about the Polar demonstration is that, as I described in a recent post, it uses a multi-device web page that enables all kinds of input (touch, mouse and keyboard) depending on device. And in the Companion Web experience, all of these inputs can be used to control the connected screen.
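The post doesn’t include code, but the fan-out pattern Wroblewski describes can be sketched briefly. A real Companion Web setup relays messages over WebSocket connections between browsers; this stdlib-only Python sketch substitutes `asyncio` queues for the sockets to show the same broadcast logic (the `Hub` class and all names here are hypothetical, not part of any Microsoft API).

```python
import asyncio

class Hub:
    """Broadcasts each message from one 'device' to every other device,
    the role a WebSocket relay server plays in a companion experience."""
    def __init__(self):
        self.devices = {}

    def connect(self, name):
        queue = asyncio.Queue()     # stands in for a socket connection
        self.devices[name] = queue
        return queue

    async def publish(self, sender, message):
        for name, queue in self.devices.items():
            if name != sender:      # don't echo back to the sender
                await queue.put((sender, message))


async def main():
    hub = Hub()
    tv = hub.connect("tv")
    phone = hub.connect("phone")
    # The phone casts a poll vote; the TV receives it in real time.
    await hub.publish("phone", {"poll": "halftime ad", "vote": "A"})
    assert phone.empty()            # senders do not receive their own messages
    return await tv.get()

result = asyncio.run(main())
print(result)
```

Because the relay knows nothing about the devices beyond their connections, the same pattern works whether the endpoints are an Xbox, an iPhone, or an Android tablet, which is the cross-platform point the IE team is making.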
What the other “modern” browsers don’t have that Internet Explorer 10 has is this snap mode. If there were one thing that iOS 7 should have copied from Windows (instead of all that flatness stuff), it would have been snap mode. So these Companion Web experiences will work across virtually all devices (because they use standard web tech), but the Xbox One will retain the advantage of being the only way to use these “companions” on the screen simultaneously with other activities. And Polar, I think, has shown how this could become a really powerful feature.
The other two Companion Web experiments released so far do not make use of this snap mode feature. DailyBurn, see video below, uses a smartphone or tablet to get real time data related to workouts you view on your TV. This app is clearly trying to appeal to users who may need some constructive excuse to get an Xbox One.
Mix Party, introduced in the (purposely?) obnoxious video below, allows people at a party to create real-time, collaborative playlists with their phones. As with Polar, the real-time aspect of this is part of the entertainment value. I’m not sure whether DailyBurn is intended as a solo experience or whether multiple people could monitor their own individual performance of a shared video workout, but Mix Party and Polar clearly have real-time, face-to-face interactions in mind.
What is interesting to me about this strategy is that there are some extra capabilities that Microsoft has built into IE 10/Xbox One (and likely will build more) that will give it an advantage as an app enabled web TV platform, but the apps developers write will also work well on all devices. This strategy of “progressive enhancement” is a comfortable one to developers because it keeps their options open. Allowing for these entropic possibilities is a smart way to get developers on board, which, in turn, could be the means to Microsoft’s resurgence through the big screen.
Apple Developer Website Online Again Following Cyberattack
Apple’s developer website was back online Friday, more than a week after it was targeted by a hacker who reportedly attempted to steal personal information, various media outlets have reported.
According to Bloomberg’s Jordan Robertson, the website used by engineers who write Mac and iOS device applications was said to be active as of 5pm Pacific time on July 26. The attack had forced it offline for a total of eight days.
“Developers use the site for software downloads, documentation and engineering information,” Robertson said. “The maker of iPhones and iPads said this week that it’s ‘completely overhauling’ its developer systems to prevent a security threat from happening again. While some of the website’s information was encrypted, Apple said it hadn’t been able to rule out whether names, mailing addresses or e-mail addresses may have been accessed.”
The website targeted by the cyberattack is used by the Cupertino, California-based tech giant to communicate with its community of nearly six million software developers, according to Reuters reporter Aman Shah. A UK-based Turkish researcher named Ibrahim Balic has claimed responsibility for the attack, which he says was not malicious in nature, but not everyone buys his story.
“Balic, who describes himself as a security consultant, claimed on Sunday that he had discovered a number of weaknesses in the site at developer.apple.com which allowed him to grab email addresses of registered developers,” Guardian reporter Charles Arthur explained. “In all, Balic said he had been able to grab the details of 100,000 people registered on the site, and that he included 73 of them in a bug report to Apple.”
Arthur said the Guardian attempted to contact 29 people whose emails were allegedly extracted by Balic during the hacking, but seven of those emails bounced and none of the remaining 22 responded to requests to state whether or not they are registered Apple developers. Furthermore, none of the names or email addresses could be located online, which the Guardian notes would be unusual for active software developers.
“Many of the names and email addresses either don’t look like they would belong to Apple developers, or appear to have left no footprints anywhere else on the net,” added independent security consultant Graham Cluley. In addition, in reference to ten emails featured in a YouTube video created by the alleged hacker, Cluley said, “It’s almost as though these are long-discarded ghost email addresses from years ago or have been used by Balic in his video for reasons best known to himself.”
In related news, a new phishing scam involving Apple has arisen on the heels of the developer website hacking, according to CNET’s Charlie Osborne. Attempting to capitalize on security concerns raised by the cyberattack, the new phishing scam warns users to click on a link in order to change their passwords.
While the email is short, it may appear legitimate to some users, Osborne said. However, it includes a grammar mistake in the title, fails to capitalize Apple in several places and includes a link that clearly does not lead to a domain registered or owned by the tech giant.
“Users have taken to Twitter to warn others of the phishing attacks, and security firm Kaspersky Lab has found that Apple-related phishing scams have skyrocketed in the last six months, with scammers focused on stealing login credentials and financial data,” the CNET writer added.
In 7 Months, Google Clears 100 Million Pages of Pirated Content
CALIFORNIA – July is almost over, and Google has removed more than 100 million web pages containing links to copyright-infringing content. Copyright holders hope that removing these links will keep consumers away from pirated-content sites.
As reported by Softpedia on Friday (July 27, 2013), citing TorrentFreak, the number of links Google has removed so far this year already exceeds the total for 2012. Google has also decided to be transparent about all requests to remove pirated content from its search results.
Since January this year, Google has been asked to remove more than 105.3 million web pages that allegedly contain pirated content. The sites claimed to host the most pirated content have also been reported.
The file search engine FilesTube tops the list with more than 5.8 million links, followed by Rapidgator.net and Torrentz.eu, each with more than 2 million links.
Meanwhile, The Pirate Bay, once the most heavily targeted site, is no longer in the top 20. This is because the torrent site has been changing domains and now accounts for only about two million links.
Facebook speeds PHP by crafting a PHP virtual machine
Social networking giant Facebook has taken another step at making the PHP Web programming language run more quickly. The company has developed a PHP Virtual Machine that it says can execute the language as much as nine times as quickly as running PHP natively on large systems.
“Our goal is to make PHP run really, really quickly,” said Joel Pobar, a Facebook engineering manager. Facebook has been using the virtual machine, called the HipHop Virtual Machine (HHVM), across all of its servers since earlier this year.
Pobar discussed the virtual machine at the O’Reilly Open Source Conference (OSCON) being held this week in Portland, Oregon.
Shares its development tools
HHVM is not Facebook’s first foray into customizing PHP for faster execution. PHP is an interpreted language, meaning the source code is executed by an interpreter at runtime rather than compiled ahead of time. Generally speaking, programs written in interpreted languages such as PHP tend not to run as quickly as programs written in languages such as C or C++, which are compiled beforehand into machine code. Facebook has remained loyal to PHP because it is widely understood by many of the Web programmers who work for the company.
To keep up with insatiable user demand, however, Facebook originally devised a compiler, called HipHop, that would translate PHP code into C++, so that it could then be compiled ahead of time for faster performance.
While Facebook enjoyed considerable performance gains from this first version of HipHop for several years, it sought other ways to speed the delivery of dynamically created Web pages to its billion or so users. “Our performance strategy for that was going to tap out,” Pobar admitted.
HHVM is the next step for Facebook. Under development for about three years, HHVM actually works on the same principle as the Java Virtual Machine (JVM). HHVM has a just-in-time (JIT) compiler that converts the human readable source code into machine-readable byte code when it is needed. (The previous HipHop, renamed HPHPc, has now been retired within Facebook.)
This JIT approach allows the virtual machine to “make smarter decisions at runtime,” Pobar said. For instance, if a call is made to the MySQL database to read a row of data, the HHVM can, on the fly, figure out what type of data it is, such as an integer or a string. It then can generate or call code on the fly that would be best suited for handling this particular type of data.
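A toy Python version of that idea: a call site that observes the runtime type of its argument and caches a type-specialized handler, loosely in the spirit of a JIT compiler’s inline caches. This is an illustration of the concept only, not how HHVM is actually implemented, and all names here are invented.

```python
def handle_int(x):
    return x * 2              # fast path specialized for integers

def handle_str(x):
    return x + x              # fast path specialized for strings


class CallSite:
    """Caches a handler for the last-seen argument type, so repeated
    calls with the same type skip the dispatch decision."""
    def __init__(self):
        self.cached_type = None
        self.cached_handler = None

    def run(self, x):
        t = type(x)
        if t is not self.cached_type:       # first visit, or type changed
            self.cached_type = t
            self.cached_handler = handle_int if t is int else handle_str
        return self.cached_handler(x)


site = CallSite()
print(site.run(21))     # specializes the site for int
print(site.run("ab"))   # re-specializes for str
```

The payoff in a real JIT is that once a site is specialized, the generated machine code can skip type checks entirely on the hot path, which an ahead-of-time compiler for a dynamic language cannot safely do.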
With the old HipHop, “the best it can do is analyze the entire Facebook codebase, reason about it and then specialize code based on its reasoning. But it can’t get all of the reasoning right. There are parts of the code base that you can not simply infer about or reason about,” Pobar said.
Virtual system speedier
Pobar estimated that HHVM is about twice as fast as HPHPc was, and about nine times as fast as running straight PHP.
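As a quick sanity check on those figures: if HHVM runs about nine times as fast as plain PHP and about twice as fast as HPHPc, then HPHPc must have been roughly 4.5 times as fast as plain PHP.

```python
# Relative speedups quoted by Pobar.
hhvm_vs_php = 9.0      # HHVM vs. the standard PHP interpreter
hhvm_vs_hphpc = 2.0    # HHVM vs. the old HipHop (HPHPc) compiler

# Implied speedup of the old compiler over plain PHP.
hphpc_vs_php = hhvm_vs_php / hhvm_vs_hphpc
print(hphpc_vs_php)    # → 4.5
```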
Facebook has posted the code for HHVM on GitHub, with the hopes that others will use it to speed their PHP websites as well.
HHVM is optimized for handling very large, and heavily used, PHP codebases. Pobar reckoned that using HHVM for standard sized websites, such as one hosting a WordPress blog, would gain only about a fivefold performance improvement.
“If you take some PHP and run it on HipHop, the CPU execution time [may] not be the limiting factor for performance. Chances are [the system is] spending too much time talking to the database or spending too much time talking to [the] memcache” caching layer, Pobar said.