Hands down, a WordPress website is one of the best investments a local company can make in its online lead-generation efforts. Admittedly, it can also be one of the biggest. My Local Leads, a Maine-based marketing firm, designs, develops, and maintains affordable WordPress websites for local businesses across the US. Some decisions made during the initial planning and development stages can greatly impact the site's marketing effectiveness, while other factors require continued, near-constant observation and work. The following techniques are important to making a WordPress website part of a successful local search marketing campaign.

Choosing the right domain name can be pivotal in increasing a website's reach, especially in the early stages. It was once popular to stuff a domain with keywords, but recent search engine algorithm updates have lessened the value of that practice. Branding matters for the domain name: it needs to be simple enough for people to remember. If a keyword fits naturally, it can and should be added; if it makes the domain too odd, too long, or too hard to remember, it is most likely not worth it.

It is also important to turn on WordPress's built-in SEO tools when putting up a site for a local business. A setting under a WordPress site's Privacy options will hide the site from search engines, and this is certainly not what anyone building a site for search engine optimization wants enabled. The WordPress site should also have its permalinks set to search-friendly URLs, which helps the site rank.
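When that "discourage search engines" setting is on, WordPress typically emits a robots noindex meta tag on every page. As a quick sanity check after launch, a small script can scan a page's HTML for that tag (a standard-library sketch; the function and class names are my own):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.robots_directives.append(attrs.get("content", "").lower())

def is_hidden_from_search(html: str) -> bool:
    """True if the page asks search engines not to index it."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.robots_directives)
```

Feeding the homepage HTML to `is_hidden_from_search` before a site goes live catches the forgotten privacy setting early.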

There are important add-ons that any WordPress website should have, including sitemap and SEO tools. Meta titles and descriptions can be added easily, with warnings and suggestions, using a tool like Yoast. The sitemap is a factor in overall page ranking and can be regenerated automatically by a plugin whenever new pages or posts are added.
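Although a plugin handles this automatically, the sitemap itself is just a small XML file listing the site's URLs. A minimal sketch of what such a plugin produces (illustrative only; real plugins add priorities, change frequencies, and index files):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal XML sitemap from (url, lastmod) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        entry = SubElement(urlset, "url")
        SubElement(entry, "loc").text = loc
        SubElement(entry, "lastmod").text = lastmod
    return tostring(urlset, encoding="unicode")
```

Regenerating and re-submitting this file whenever a page or post is published is exactly the chore the plugin automates.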

Another factor that business owners will want to keep an eye on with their WordPress site is the actual functionality. The site should load quickly, have no dead links, and have clean code. Ugly and broken code can harm the rankings as well as user experience. Slow sites or sites that go down frequently can also be detrimental to both customer experience and search engine rankings.
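Dead-link checks in particular are easy to automate. The sketch below (my own illustration, not a specific plugin) extracts anchor targets from each page's HTML and flags internal links that point at pages the site does not have:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def find_dead_internal_links(pages):
    """pages maps a site path (e.g. "/about") to its HTML.
    Returns (page, target) pairs where an internal link points
    at a path that does not exist on the site."""
    dead = []
    for path, html in pages.items():
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            if href.startswith("/") and href not in pages:
                dead.append((path, href))
    return dead
```

External links would need an HTTP request per target, but the same extraction step feeds that check too.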

Jakarta – A week after launching the updated Google Maps for Android users, Google is now bringing the new mapping experience to the iPad. Google Maps 2.0 has been available globally since Tuesday, July 16, 2013.
Designed for the device's larger screen, the new version features clearer imagery and softer colors.
Just like the Android version, the iOS app includes real-time traffic information. Google Maps 2.0 also comes with Foursquare integration and navigation that includes information about a variety of places, including restaurants and shopping centers.
The application is not yet available for download from the App Store in its home country, the United States, although it can already be downloaded by users in Asia.
Google Maps was previously available to iPhone users. The updated application follows complaints from iPhone users, particularly iPhone 5 owners, about Apple Maps. Users complained above all about Apple Maps' lack of accuracy, and Apple's management apologized in writing to iPhone and iPad users.
The mapping service offers real-time traffic information, navigation, transit directions, satellite and street views, indoor photos, restaurant reviews, and Google profile integration. Unlike some of Google's other apps, this one was built specifically for iOS. Its main advantages are speed and a small file size, so it does not take up much memory.

  • Reveals new details of the forthcoming 22nm Intel® Atom™ processors C2000 product family, enabling the company to target a larger portion of the datacenter market.
  • Unveils future roadmap of 14nm datacenter products including a system-on-chip (SoC) that for the first time will incorporate Intel’s next-generation Broadwell architecture to address an even broader range of workloads.
  • Rackspace Hosting* announces that it will deploy a new generation of rack designs as part of its hybrid cloud solutions aligned with Intel’s Rack Scale Architecture vision.

As the massive growth of information technology services places increasing demand on the datacenter, Intel Corporation today outlined its strategy to re-architect the underlying infrastructure, allowing companies and end-users to benefit from an increasingly services-oriented, mobile world.

The company also announced additional details about its next-generation Intel® Atom™ processor C2000 product family (codenamed “Avoton” and “Rangeley”), as well as outlined its roadmap of next-generation 14nm products for 2014 and beyond. This robust pipeline of current and future products and technologies will allow Intel to expand into new segments of the datacenter that look to transition from proprietary designs to more open, standards-based compute models.

“Datacenters are entering a new era of rapid service delivery,” said Diane Bryant, senior vice president and general manager of the Datacenter and Connected Systems Group at Intel. “Across network, storage and servers we continue to see significant opportunities for growth. In many cases, it requires a new approach to deliver the scale and efficiency required, and today we are unveiling the near and long-term actions to enable this transformation.”

As more mobile devices connect to the Internet, cloud-based software and applications get smarter by learning from the billions of people and machines using them, resulting in a new era of context-rich experiences and services. This also results in a massive number of network connections and a continuous stream of real-time, unstructured data. New challenges for networks, computing and storage are emerging as the growing volume of data is transported, collected, aggregated and analyzed in datacenters. As a result, datacenters must be more agile and service-driven than ever before, and easier to manage and operate.

The role of information technology has evolved from being a way to reduce costs and increase corporate productivity to becoming the means to deliver new services to businesses and consumers. For example, Disney* recently started providing visitors with wirelessly connected-wristbands to enhance customers’ in-park experience through real-time data analytics. Additionally, a smart traffic safety program from Bocom* in China seeks to identify traffic patterns in a city of ten million people and intelligently offers better routing options for vehicles on the road.

‘Re-Architecting’ Network, Storage and Servers

To help companies prepare for the next generation of datacenters, Intel revealed its plans to virtualize the network, enable smart storage solutions and invest in innovative rack optimized architectures.

Bryant highlighted Intel’s Rack Scale Architecture (RSA), an advanced design that promises to dramatically increase the utilization and flexibility of the datacenter to deliver new services. Rackspace Hosting*, an open cloud company, today announced the deployment of new server racks that are a step toward Intel’s RSA vision, powered by Intel® Xeon® processors and Intel Ethernet controllers, with storage accelerated by Intel Solid-State Drives. The Rackspace design is the first commercial rack-scale implementation.

The networking industry is on the verge of a transition similar to what the server segment experienced years ago. Equipping the network with open, general purpose processing capabilities provides a way to maximize network bandwidth, significantly reduce cost and provide the flexibility to offer new services. For example, with a virtualized software defined network, the time to provision a new service can be reduced to just minutes from two to three weeks with traditional networks. Intel introduced Open Network Platform reference designs to help OEMs build and deploy this new generation of networks.

Data growth is a challenge to all datacenters and transferring this large volume of data for processing within a traditional, rigid storage architecture is costly and time consuming. By implementing intelligent storage technologies and tools, Intel is helping to reduce the amount of data that needs to be stored, and is improving how data is used for new services.

Traditional servers are also evolving. To meet the diverse needs of datacenter operators who deploy everything from compute intensive database applications to consumer facing Web services that benefit from smaller, more energy-efficient processing, Intel outlined its plan to optimize workloads, including customized CPU and SoC configurations.

As part of its strategy, Intel revealed new details for the forthcoming Intel® Atom™ processors C2000 product family aimed for low-energy, high-density microservers and storage (codenamed “Avoton”), and network devices (codenamed “Rangeley”). This second generation of Intel’s 64-bit SoCs is expected to become available later this year and will be based on the company’s 22nm process technology and the innovative Silvermont microarchitecture. It will feature up to eight cores with integrated Ethernet and support for up to 64GB of memory.

The new products are expected to deliver up to four times the energy efficiency and up to seven times more performance than the first-generation Intel Atom processor-based server SoCs introduced in December last year. Intel has been sampling the new Intel Atom processor server product family to customers since April and has already more than doubled the number of system designs compared to the previous generation.

Roadmap for Expansion

The move to services-oriented datacenters presents considerable opportunities for Intel to expand into new segments. To help bolster the underlying technologies that power much of the next generation of datacenters, Intel outlined its roadmap of next-generation products based on its forthcoming 14nm process technology scheduled for 2014 and beyond. These products are aimed at microservers, storage and network devices and will offer an even broader set of low-power, high-density solutions for Web-scale applications and services.

The future products include the next generation of Intel Xeon processors E3 family (codenamed “Broadwell”) built for processor and graphic-centric workloads such as online gaming and media transcoding. It also includes the next generation of Intel Atom processor SoCs (codenamed “Denverton”) that will enable even higher density deployments for datacenter operators. Intel also disclosed an addition to its future roadmap – a new SoC designed from the ground up for the datacenter based on Intel’s next-generation Broadwell microarchitecture that follows today’s industry leading Haswell microarchitecture. This SoC will offer higher levels of performance in high density, extreme energy efficient systems that datacenter operators will expect in this increasingly services-oriented, mobile world.

Jakarta – Giant technology companies have formed an alliance to demand transparency from the U.S. government over the National Security Agency (NSA) surveillance program known as PRISM.
The alliance includes companies such as Apple, Google, Facebook, and Microsoft, along with a number of civic groups. In a letter to be released within hours, the alliance of 63 companies, investors, and non-profit organizations asks for clearer rules on such disclosures.
"Basic information about how the government enforces these laws has historically been published without interfering with investigations," reads one of the points in the letter, obtained by AllThingsD on Wednesday, July 17, 2013.
The alliance asks that its members be allowed to publish the number of government requests for data about users of their services, as well as the number of users, accounts, and devices covered by those requests.
The alliance also argues that basic information about the volume of requests for content and user data could be made public as part of its transparency efforts.
At the same time, the alliance asks the government itself to publish the number of requests it makes and the number of individuals whose data is requested from technology companies.
The letter will be addressed to President Barack Obama and congressional leaders. Among the companies and associations involved in the alliance are:
Yahoo, AOL, Apple, Digg, Dropbox, Evoca, Facebook, Google, HeyZap, LinkedIn, Meetup, Microsoft, Mozilla, Reddit, salesforce.com, Tumblr, and Twitter.
Also included are YouNow, Union Square Ventures, Y Combinator, New Atlantic Ventures, the Electronic Frontier Foundation, Human Rights Watch, the American Civil Liberties Union, and the Center for Democracy & Technology.
In addition, there are press and advocacy organizations such as the Reporters Committee for Freedom of the Press, Public Knowledge, the Computer & Communications Industry Association, Reporters Without Borders, and the Wikimedia Foundation.

California – Google Translate now comes with a new feature that allows users to translate handwriting: handwriting input for Google Translate. The feature supports handwriting in 45 languages.
Of the 45 available languages, Google offers Chinese as an example. To find the meaning of an unfamiliar Chinese character, the user selects Chinese in the Google Translate language menu and taps the pencil-shaped icon to activate handwriting input. All the user then needs to do is draw the character in the handwriting panel; Google Translate does the rest.
Google previously brought handwriting input to Google Translate on Android devices in December 2012. Then, in early 2013, the company updated Google Input Tools for desktop browsers, adding a new virtual keyboard, input-method editing, and handwriting translation on the web. Google announced the latest feature on Wednesday, July 24, 2013, as reported by The Next Web.
It is not surprising that Google first brought the new Google Translate feature to Android, since handwriting input is most useful for mobile users on the go.

Google on Tuesday released Chrome 28, the first polished version of the browser to use the company’s home-grown “Blink” rendering engine. On Windows, the upgrade also sported Google’s new notification service that lets developers of Chrome apps and add-ons display messages and alerts outside the browser window.

The upgrade was the first since May 21, when Google shipped Chrome 27 and touted some minor performance improvements.


Google announced in April that it was dropping the open-source WebKit browser engine — at the time also used only by Apple’s Safari — and was instead launching Blink, a WebKit variant, to power Chrome. Since then, Opera Software’s Opera has also adopted WebKit as an interim step before it eventually moves to Blink.

Google cited difficulties in adapting WebKit to Chrome, and in the first weeks after the announcement, stripped copious amounts of unnecessary-for-Chrome code from the fork that became Blink. Previously, only the rougher “Dev” and “Beta” builds of Chrome relied on the Blink engine. Users can verify that Blink is present by typing chrome://version/ in the Chrome address-search bar, dubbed the “Omnibox.”

Also included in Chrome 28 is new support for more sophisticated notifications that appear outside the browser pane and display even when the browser’s not running. “Packaged apps” — über-Web apps that look and behave like “native” code written specifically for the underlying OS — and add-ons can push brief messages and alerts to Chrome users after their developers have enabled the feature.

Only the Windows version of Chrome 28 currently supports these next-generation notifications, but Google promised that the feature would soon make its way to OS X and Linux. On a Mac, Chrome notifications are not integrated with OS X Mountain Lion’s Notification Center.

Along with the debut of Blink and notifications, Chrome 28 contained patches for 15 security vulnerabilities, one of them rated “critical,” Google’s most serious threat ranking. According to Google’s terse security advisory, that flaw was a memory management bug — dubbed a “use-after-free” vulnerability — in the browser’s network sockets code.

But while Colin Payne, who reported the bug, received an impressive reward of $6,267.40, another researcher was handed triple that. Andrey Labunets was paid a record $21,500 for filing several vulnerability reports, including two in the Google synchronization service and an unknown number of others that Google said were “…since-fixed server-side bugs.”

That last phrase and the amount paid were clues that Labunets discovered one or more flaws in a core Google service. In April, Google boosted bounties for vulnerability reports in its core websites, services and online apps, resetting the top reward to $20,000 for remote code executable bugs, those that attackers could use to slip malicious code onto a server or into an app or site.

Labunets is no stranger to large bug bounties. Earlier this year, after reporting a string of weaknesses in Facebook’s authentication protocol, Labunets was awarded $9,500 by the social networking giant.

Altogether, Google this week paid bounties totaling $34,901 to six researchers, including Payne and Labunets, for reporting eight different bugs. Through Tuesday, the Mountain View, Calif., company has awarded nearly $250,000 thus far this year in bounties or hacking contest prizes.

Users can download Chrome 28 from Google’s website. Active users can simply let the automatic updater retrieve the new version.

Did you pour your heart out on a MySpace blog and make hourly checks on your Friends total? Now the social network has been accused of erasing the personal histories of its dedicated members after a $20 million relaunch designed to bury its past and attract a new teenage audience.

The music-centred platform, which helped launch Lily Allen to fame and attracted 100 million users at its 2007 peak, is seeking to climb out of the social network “graveyard” after years of being a source of digital derision.

The site, lacking innovation and overtaken by Facebook, shed users and was abandoned by Rupert Murdoch, whose News Corporation had bought the company in a disastrous $580 million deal.

Backed by new investors, including singer Justin Timberlake, Myspace (after dropping the capital ‘S’) has been rebranded as a music-streaming service, with a new sleek interface, and an iPhone app for radio play and animated GIF creation.

The new Myspace has shown signs of life, attracting 31 million unique visitors and one million app downloads since a high-profile relaunch last month.

However its owners do not appear to want those loyal users, who stuck around even when MySpace became a tarnished brand, to spoil the party for its new target audience of young “millennials”.

Furious users complained that Myspace had erased all of their blogs, private messages, videos, comments and posts when they tried to log in to the new site.

Myspace veterans, whose lives have been marked out by the blogs and photos posted daily over nearly a decade, are threatening a class-action lawsuit over what they see as the destruction of their personal histories.

One disgruntled member wrote: “I was a loyal user who never deserted Myspace. I used it almost every day since 2006. I wrote hundreds of blogs that, to my horror, were simply gone as of last night with no prior warning given. That is no way to treat us. Please give us a chance to recover old blogs. This is like losing family photographs, and it is really horrible.”

Another posted on the site’s forum: “This is no different than losing one’s writing or photographs in a house fire, and I am feeling awful right now.” “You in essence just stole our blogs without permission to delete them. How dare you!” complained another user.

Myspace, bought for just $35 million in 2011 by Timberlake and the Specific Media Group, told users that it had made changes to create a “better experience.” The company said: “That means you won’t see a few products on the new site. We know that this is upsetting to some but it gives us a chance to really concentrate on creating a new experience for discovery and expression.”

The “year zero” approach extends to stars who once built huge followings through MySpace. Britney Spears, who enjoyed 1.5 million Friends on the old MySpace, found her new “Connections” count set to 0.

Myspace won’t mind irritating 30-somethings who enjoyed sharing family photos if the network maintains a positive buzz from younger users since the relaunch, which featured the rapper Pharrell in a major advertising campaign.

The new app is ranked among the Top 20 social networking apps and the site makeover received positive feedback on Twitter.

Yet Myspace has been forced to respond to the backlash from its loyal users.  “Change isn’t easy and there has been a lot going on lately,” the company said. It told angry members: “We understand that this (blog) information is very important to you. Please understand that your blogs have not been deleted. Your content is safe and we have been discussing the best ways possible to provide you your blogs.” Pictures and music playlists can be located and transferred over to the new Myspace, the company added.

Founded in 2003 by a team of California web pioneers led by Tom Anderson and Chris DeWolfe, MySpace generated $800 million in revenue by 2008. Arctic Monkeys were among the bands who used the network’s music-sharing feature as a springboard for success.

When was the last time you had to delete a bunch of photos or apps on your mobile device to clear out space? With the massive amount of data generated every day, it’s easy to exhaust all the available storage on your phone or tablet.

And this problem is only getting worse. Industry trends suggest that device storage capacities are growing at 25 percent per year, but the amount of data being produced is increasing even faster — by around 50 percent a year, according to Microsoft. The software giant is looking to address this problem with SkyDrive, which will be updated in Windows 8.1 with the goal of giving you access to your files at all times, without taking up all your available storage or Internet bandwidth.
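Those two compound growth rates diverge quickly. A small back-of-the-envelope calculation (my own illustration of the figures Microsoft cites, starting from parity) shows how fast data outruns device storage:

```python
def years_until_ratio(storage_growth=0.25, data_growth=0.50, ratio=4.0):
    """Years until data volume exceeds device storage by `ratio`,
    starting from parity, under compound annual growth."""
    years = 0
    storage, data = 1.0, 1.0
    while data / storage < ratio:
        storage *= 1 + storage_growth
        data *= 1 + data_growth
        years += 1
    return years
```

At 25 percent versus 50 percent annual growth, data doubles relative to storage in about four years and outgrows it fourfold in about eight, which is why offloading to the cloud becomes attractive.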

The updated service utilizes what Microsoft refers to as “placeholder files,” which look and feel like normal folders and files with one major change — you don’t download the full file until you access it. The placeholder file contains just a thumbnail image and some basic properties, making it significantly smaller than its actual size. This means that 100GB of files in SkyDrive will use up less than 5GB of storage on the hard drive of your Windows 8.1 device, according to Microsoft’s Mona Akmal.

“I have a Pictures folder in SkyDrive that’s 5.6GB in size but it’s only taking up 185MB on the local disk,” Akmal wrote.
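The placeholder idea can be sketched as a tiny data structure (purely illustrative, not Microsoft's actual implementation): a local stub holds only metadata and a thumbnail until the file is first opened, at which point it is "hydrated" with the full contents.

```python
from dataclasses import dataclass

@dataclass
class PlaceholderFile:
    """Local stub for a cloud file: metadata plus thumbnail only."""
    name: str
    full_size: int        # bytes stored in the cloud
    thumbnail_size: int   # bytes kept on the local disk
    hydrated: bool = False

    @property
    def local_footprint(self) -> int:
        """Disk space this file actually consumes locally."""
        return self.full_size if self.hydrated else self.thumbnail_size

    def open(self) -> None:
        """Accessing the file triggers the full download ('hydration')."""
        self.hydrated = True
```

Until a photo is opened, only its thumbnail-sized stub counts against local storage, which is how a 5.6GB folder can occupy 185MB on disk.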

Another major change to SkyDrive in Windows 8.1 deals with offline access to files. With the SkyDrive app, you’ll now be able to mark any folders or files you want to remain available when you lose Internet connectivity.

Any edits you make to a file while offline will automatically be synced back up to SkyDrive when you regain a connection. For added convenience, all the files you open or edit on your device will automatically be marked for offline access.

As a reminder, new SkyDrive users get 7GB of storage for free. After that, an additional 20GB costs $10 per year, while 50GB will set you back $25 a year, and 100GB costs $50 a year.
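Worked out per gigabyte (my own arithmetic on the prices above), the paid tiers are priced identically:

```python
# SkyDrive paid tiers (extra storage beyond the free 7GB): GB -> USD per year
tiers = {20: 10, 50: 25, 100: 50}

cost_per_gb_year = {gb: usd / gb for gb, usd in tiers.items()}
```

Every tier works out to a flat $0.50 per GB per year, so the larger plans buy capacity rather than a volume discount.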

We sat down with Angus Logan, group product marketing manager for SkyDrive, last week to get the scoop on the most important changes to the online storage service in Windows 8.1.

CALIFORNIA – Google is bringing a handwriting feature to Google Translate. The feature allows users to write out text in scripts that are hard to type, such as Japanese, Chinese, and Arabic, and have their devices translate it via Google Translate.

Google says the feature is intended for travelers visiting countries whose languages do not use the Latin alphabet or have complicated scripts.

Imagine a British tourist walking through China and seeing street names in writing he cannot understand. With Google Translate's handwriting input, he can simply draw the unfamiliar characters on his smartphone or tablet and have them translated into the desired language.

"The handwriting feature lets you translate a phrase even if you don't know how to type the characters," said Google product manager Xiangye Xiao, as reported by CNet.

"For example, you may see text in Chinese characters but not know how to type it. With the handwriting feature, you just need to mimic the shape of the characters and then translate them," she added.

The handwriting feature is available in 45 languages whose scripts are considered difficult to type, among them Arabic, Chinese, Japanese, Lao, and Greek.

The Apache Software Foundation (ASF), the all-volunteer developers, stewards, and incubators of nearly 150 Open Source projects and initiatives, announced today that Apache Mesos has graduated from the Apache Incubator to become a Top-Level Project (TLP), signifying that the project’s community and products have been well-governed under the ASF’s meritocratic process and principles.

Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications, or frameworks. It can run multiple frameworks, including Apache Hadoop, MPI, Hypertable, Jenkins, Storm, and Spark, as well as other applications and custom frameworks.

“It was our goal all along to see Mesos become a kernel of the infrastructure stack of the future,” said Benjamin Hindman, Vice President of Apache Mesos. “The project’s graduation from the Apache Incubator is recognition that the software is mature and has brought together a diverse community to sustain it in the future.”

Initially created at the University of California at Berkeley’s AMPLab (the research center also responsible for the original development of Apache Spark) to manage resource sharing and isolation in data centers, Mesos acts as a layer of abstraction between applications and pools of servers. Mesos avoids the need to create separate clusters for individual frameworks, instead making it possible to optimize how jobs are executed across shared machines.
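The sharing model can be caricatured in a few lines (a conceptual sketch of Mesos-style two-level scheduling, not the real Mesos API): the master offers each machine's spare resources to the registered frameworks, and each framework, not the master, decides which offers to accept for its tasks.

```python
class Framework:
    """A toy framework that accepts CPU offers for fixed-size tasks."""
    def __init__(self, name, cpus_per_task):
        self.name = name
        self.cpus_per_task = cpus_per_task
        self.launched = []   # (node, number of tasks) pairs

    def consider(self, node, free_cpus):
        """Accept as many tasks as fit in the offered CPUs;
        return what is left for the next framework."""
        accepted = 0
        while free_cpus >= self.cpus_per_task:
            free_cpus -= self.cpus_per_task
            accepted += 1
        if accepted:
            self.launched.append((node, accepted))
        return free_cpus

def offer_round(nodes, frameworks):
    """One allocation round: offer each node's spare CPUs to the
    frameworks in turn, so one pool of machines serves them all."""
    for node, cpus in nodes.items():
        for fw in frameworks:
            cpus = fw.consider(node, cpus)
```

Because leftover capacity on each node flows to the next framework, a Hadoop job and a Storm topology can share the same machines instead of each needing a dedicated cluster.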

Whilst in the Apache Incubator, Mesos had four releases and established an Open Source community according to The Apache Way of governance. Additional improvements to the project include its flexibility to support several application framework languages, and scalability that has been production-tested to thousands of nodes and simulated to tens of thousands of nodes and hundreds of frameworks.

Apache Mesos has proven to be reliable for use in production, and has already been adopted by several organizations for cluster management.

“Mesos is the cornerstone of our elastic compute infrastructure,” explained Chris Fry, Senior Vice President of Engineering at Twitter. “It’s how we build all our new services and is critical for Twitter’s continued success at scale … one of the primary keys to our data infrastructure efficiency.”

“We’re using Mesos to manage cluster resources for most of our data infrastructure,” said Brenden Matthews, Engineer at Airbnb and Apache Mesos Committer. “We run Chronos, Storm, and Hadoop on top of Mesos in order to process petabytes of data.” (Chronos is an Airbnb-developed Mesos framework as a replacement for cron, and an example of how custom frameworks can be developed on Mesos to leverage its resource sharing).

“Community support for Apache Mesos is encouraging, particularly as more companies assess how they manage their clusters and look for more efficiency,” added Hindman. “Now that we’ve graduated, we look forward to continuing to grow the number of Mesos adopters and fostering an ecosystem around the project.”

Availability and Oversight
As with all Apache products, Apache Mesos software is released under the Apache License v2.0, and is overseen by a self-selected team of active contributors to the project. A Project Management Committee (PMC) guides the Project’s day-to-day operations, including community development and product releases.