Planet Sysadmin

          blogs for sysadmins, chosen by sysadmins...

May 03, 2016

Everything Sysadmin

I am Satoshi Nakamoto, inventor of Bitcoin

There is a long and fraught history in Bitcoin of claims and counterclaims about who Satoshi is. I might as well confess that he is me.

I come forward at this time because Craig Wright claims to be Satoshi and I can't stand such intentional scammery.

If you read any of my pre-Bitcoin books, you'll see there are many pages where the first letters of the lines spell out "I am Satoshi Nakamoto" and "Someday I will invent Bitcoin". If you can't find the page that contains this, buy more copies of the books. You just haven't found the right one. Please use this link, since it includes my Amazon Associates code. Buy additional copies since each one might be slightly different.

Further proof: take the first letter of each chapter title of Time Management for System Administrators and it spells "tfrttttpseeda". I mean, what hacker doesn't know what that means?

Oh, I guess you aren't elite. My bad.

Now let me address my critics: Some say that this is just Tom trying to promote his books. Well, if that's what I was doing, do you think I'd write this on my book-promotion blog?

By the way... you can get a sneak preview of my next book. First, the new issue of ACM Queue magazine has the complete text of chapter 2 (free to ACM members; everyone else pays a small fee). Alternatively, you can see the latest complete draft on Safari Books Online, which you probably already have a subscription to.

That's all I have to say on this matter.

by Tom Limoncelli at May 03, 2016 09:20 PM

Rands in Repose

Medium or WordPress?

As you may have noticed, I’ve been posting work to Medium for several months now. This started out as an experiment to see the magnitude of the reaction to successful pieces I’ve already written here.

The results? There are a lot of humans out there, and many of those traipsing around Medium had never read these pieces. In general, an article that performed well here will play well on Medium, provided it hasn’t been posted here recently.

Folks have asked. No, I’m not done posting here. My policy is to continue to post all new content here and occasionally post pieces to Medium.

Now you know.

#

by rands at May 03, 2016 05:43 PM

Ubuntu Geek

Upgrade Ubuntu 15.10 to Ubuntu 16.04

The Ubuntu team is very pleased to announce our sixth long-term support release, Ubuntu 16.04 LTS for Desktop, Server, Cloud, and Core.

Codenamed "Xenial Xerus", 16.04 LTS continues Ubuntu’s proud tradition of integrating the latest and greatest open source technologies into a high-quality, easy-to-use Linux distribution. The team has been hard at work through this cycle, introducing new features and fixing bugs.
(...)
Read the rest of Upgrade Ubuntu 15.10 to Ubuntu 16.04 (103 words)
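
The remainder of the post is behind the link, but for reference the usual command-line path for this upgrade is short (a sketch, assuming the update-manager-core package is installed; the full post may include extra steps):

# bring the current 15.10 install fully up to date first
sudo apt-get update && sudo apt-get dist-upgrade

# then launch the release upgrade tool
sudo do-release-upgrade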



by ruchi at May 03, 2016 02:07 PM

apt-get

My Free Software Activities in April 2016

My monthly report covers a large part of what I have been doing in the free software world. I write it for my donors (thanks to them!) but also for the wider Debian community because it can give ideas to newcomers and it’s one of the best ways to find volunteers to work with me on projects that matter to me.

Debian LTS

I handled a new LTS sponsor that wanted to see wheezy keep supporting armel and armhf. This was not part of our initial plans (set during the last DebConf), so I mailed all the teams that would be impacted so that we could collectively decide whether it was OK to support those architectures. While I was hoping to get a clear answer rather quickly, it turns out that we never managed to get an answer from all parties. Instead the discussion drifted onto the more general topic of how we handle sponsorship/funding in the LTS project.

Fortunately, the buildd maintainers said they were OK with this and the ftpmasters had no objections, and they both implicitly enacted the decision: Ansgar Burchardt kept the armel/armhf architectures in the wheezy/updates suite when he handled the switch to the LTS team, and Aurélien Jarno also configured wanna-build to keep building armel/armhf for the suite. The DSA team did not confirm whether this change interfered with any of their plans to decommission some hardware, but build daemons are a shared resource anyway and a single server is likely to handle builds for multiple releases.

DebConf 16

This month I registered for DebConf 16 and submitted multiple talk/BoF proposals:

  • Kali Linux’s Experience of a Debian Derivative Based on Testing (Talk)
  • 2 Years of Work of Paid Contributors in the Debian LTS Project (Talk)
  • Using Debian Money to Fund Debian Projects (BoF)

I want to share the setup we use in Kali as it can be useful for other derivatives and also for Debian itself to help smooth the relationship with derivatives.

I also want to open again the debate on the usage of money within Debian. It’s a hard topic but we should really strive to take some official position on what’s possible and what’s not possible. With Debian LTS and its sponsorship we have seen that we can use money to some extent without hurting the Debian project as a whole. Can this be transposed to other teams or projects? What are the limits? Can we define a framework and clear rules? I expect the discussion to be very interesting in the BoF. Mehdi Dogguy has agreed to handle this BoF with me.

Packaging

Django. I uploaded 1.8.12 to jessie-backports and 1.9.5 to unstable. I filed two upstream bugs (26473 and 26474) for two problems spotted by lintian.

Unfortunately, when I wanted to upload it to unstable, the test suite did not run. I pinned this down to a sqlite regression. Chris Lamb filed #820225 and I contacted the SQLite and Django upstream developers by email to point them to this issue. I helped the SQLite upstream author (Richard Hipp) to reproduce the issue and he was quick to provide a patch which landed in 3.12.1.

Later in the month I made another upload to fix an upgrade bug (#821789).

GNOME 3.20. As with each new version, I updated gnome-shell-timer to ensure it works with the new GNOME. This time I spent a bit more time fixing a regression (805347) that dates back a while and would never be fixed otherwise, since the upstream author orphaned this extension (he no longer uses GNOME).

I have also been bitten by display problems where accented characters would be displayed below the character that follows. With the help of members of the GNOME team, we found out that this was a problem specific to the cantarell font and was only triggered with Harfbuzz 1.2. This is tracked in Debian with #822682 on harfbuzz and #822762 on fonts-cantarell. There’s a new upstream release (with the fix) ready to be packaged, but unfortunately it is blocked by the lack of a recent fontforge in Debian. I thus mailed debian-mentors in the hope of finding volunteers to help the pkg-fonts team package a newer version…

Misc Debian/Kali work

Distro Tracker. I started to mentor Vladimir Likic, who contacted me because he wants to contribute to Distro Tracker. I helped him set up his development environment and we fixed a few issues in the process.

Bug reports. I filed many bug reports, most of them due to my work on Kali:

  • #820288: a request to keep the wordpress package installable in older releases (due to renaming of many php packages)
  • #820660: request support of by-hash indices in reprepro
  • #820867: possibility to apply overrides on already installed packages in reprepro
  • #821070: jessie to stretch upgrade problem with samba-vfs-modules
  • #822157: python-future hides and breaks python-configparser
  • #822669: dh_installinit inserts useless autoscript for System V init script when package doesn’t contain any
  • #822670: dh-systemd should be merged into debhelper, we have systemd by default and debhelper should have proper support for it by default

I also investigated #819958 that was affecting testing since it has been reported to Kali as well. And I made an NMU of dh-make-golang to fix #819472 that I reported earlier.

Thanks

See you next month for a new summary of my activities.


by Raphaël Hertzog at May 03, 2016 07:13 AM

Aaron Johnson

Links: 5-2-2016

by ajohnson at May 03, 2016 06:30 AM

Google Blog

Helping for the long term in Flint, Michigan

Access to clean drinking water is a concern all over the world, but in the United States it’s often taken for granted. That has not been the case recently for the residents of Flint, Michigan, many of whom we now know have been exposed to lead in their tap water. It’s a crisis, one to which the American people readily responded by donating water and resources to help alleviate the immediate pain. But the problem won’t go away quickly, and understanding its extent is both challenging and an absolute necessity. Today, Google.org is providing $250,000 to partners in the Flint community to help, with a special focus on a technical solution for understanding and resolving the crisis for the long term.

First, we’re making a $150,000 grant to the University of Michigan-Flint to develop a comprehensive data platform that will assist government and community leaders in making more informed decisions about the crisis and provide critical information to citizens. The funds will support student researchers on the University of Michigan’s Flint and Ann Arbor campuses, working under the leadership of Professors Mark Allison (Flint) and Jake Abernathy (Ann Arbor), to answer key questions about the crisis and response, such as the probability of elevated lead levels in homes before they are tested. The team plans to develop a platform and app that visualizes the data and also lets citizens seek out and request key services, such as reporting concerns about water and requesting testing kits. Google volunteers will provide guidance and mentoring on the technology and product design.

We’re also making a $100,000 donation to the Community Foundation of Greater Flint for the Flint Child Health & Development Fund. The Flint Child Health & Development Fund was founded to ensure the long-term health of Flint families, especially newborns to children 6 years old—the group most vulnerable to developmental issues from lead. The Fund is a supplemental resource to state and federal funding and gives grants for childcare-related initiatives such as early childhood education, student support services, continuous access to a pediatric medical home, access to infant and child behavioral health services, and research.

With Google offices in Ann Arbor and Birmingham, Flint and its residents are also our neighbors. In the immediate aftermath of the crisis, a group of 20 Google volunteers went to Flint and volunteered at the Food Bank of Eastern Michigan, where they helped distribute bottled water and food in the greater Flint area. Around $35,000 has been donated by employees and through Google's gift match program to the United Way of Genesee County and the Flint Water Fund to aid in the crisis, and our employee groups, like the Black Googler Network, continue to explore more ways to help.

As a native Michigander, I'm proud that we can help our neighbors in Flint. We hope we can support a resolution to this crisis and assist the residents of Flint in getting the resources they need and deserve, both for the short and long term.

by Google Blogs (noreply@blogger.com) at May 03, 2016 04:00 AM

May 02, 2016

UnixDaemon

CloudFormation Linting with cfn-nag

Over the last 3 years I’ve done a lot of CloudFormation work, and while it’s an easy enough technology to get to grips with, the mass of JSON can become a bit of a blur when you’re doing code reviews. It’s always nice to get a second pair of eyes, especially an unflagging, automated set that has insight into some of the easily overlooked security issues you can accidentally add to your templates. cfn-nag is a ruby gem that attempts to sift through your code and present guidelines on a number of frequently misused, or omitted, resource properties.

gem install cfn-nag

Once the gem and its dependencies finish installing you can list all the rules it currently validates against.

$ cfn_nag_rules
...
IAM policy should not apply directly to users.  Should be on group
...

I found reading through the rules to be quite a nice context refresher. While there are a few I don’t agree with, there are also some I wouldn’t have thought to single out in code review, so it’s well worth having a read through the possible anti-patterns. Let’s check our code with cfn-nag.

cfn_nag --input-json-path . # all .json files in the directory
cfn_nag --input-json-path templates/buckets.json # single file check

The default output from these runs looks like:

./templates/buckets.json
------------------------------------------------------------
| WARN
|
| Resources: ["AssetsBucketPolicy"]
|
| It appears that the S3 Bucket Policy allows s3:PutObject without server-side encryption

Failures count: 0
Warnings count: 1

./templates/elb.json
-------------
| WARN
|
| Resources: ["ELB"]
|
| Elastic Load Balancer should have access logging configured

Failures count: 0
Warnings count: 1

If you’d like to reprocess the issues in another part of your tooling or pipeline then the json output formatter might be more helpful.

cfn_nag --input-json-path . --output-format json

    {
        "type": "WARN",
        "message": "Elastic Load Balancer should have access logging configured",
        "logical_resource_ids": [
            "ELB"
        ],
        "violating_code": null
    }

While the provided rules are useful, it’s always a good idea to have an understanding of how easy a linting tool makes adding your own checks. In the case of cfn-nag there are two types of rules: some use JSON and jq, and the others are pure ruby code. Let’s add a simple pure ruby rule to ensure all our security groups have descriptions. At the moment this requires you to drop code directly into the gem’s contents, but I imagine this will be fixed in the future.

First we’ll create our own rule:

# first we find where the gem installs its custom rules
$ gem contents cfn-nag | grep custom_rules

./.rbenv/versions/2.3.0/lib/ruby/gems/2.3.0/gems/cfn-nag-0.0.19/lib/custom_rules

Then we’ll add a new rule to that directory:

touch $full_path/lib/custom_rules/security_group_missing_description.rb

Our custom check looks like this:

class SecurityGroupMissingDescription

  def rule_text
    'Security group does not have a description'
  end

  def audit(cfn_model)
    logical_resource_ids = []

    cfn_model.security_groups.each do |security_group|
      unless security_group.group_description
        logical_resource_ids << security_group.logical_resource_id
      end
    end

    if logical_resource_ids.size > 0
      Violation.new(type: Violation::FAILING_VIOLATION,
                    message: rule_text,
                    logical_resource_ids: logical_resource_ids)
    else
      nil
    end
  end
end

The code above was heavily ‘borrowed’ from an existing check and a little bit of object exploration was done using pry. Once we have our new rule we need to plumb it into the current rule-loading code. This is currently a little unwieldy, but it’s worth keeping an eye on the docs for when this is fixed. We need to edit two locations in the $full_path/lib/cfn_nag.rb file: add a require to the top of the file alongside the other custom_rules, and add our new class’s name to the custom_rule_registry at the bottom.

--- ./.rbenv/versions/2.3.0/lib/ruby/gems/2.3.0/gems/cfn-nag-0.0.19/lib/cfn_nag.rb  2016-05-01 18:00:14.123226626 +0100
+++ ./.rbenv/versions/2.3.0/lib/ruby/gems/2.3.0/gems/cfn-nag-0.0.19/lib/cfn_nag.rb  2016-05-02 09:55:16.842675430 +0100
@@ -1,4 +1,5 @@
 require_relative 'rule'
+require_relative 'custom_rules/security_group_missing_description'
 require_relative 'custom_rules/security_group_missing_egress'
 require_relative 'custom_rules/user_missing_group'
 require_relative 'model/cfn_model'
@@ -175,6 +176,7 @@

   def custom_rule_registry
     [
+      SecurityGroupMissingDescription,
       SecurityGroupMissingEgressRule,
       UserMissingGroupRule,
       UnencryptedS3PutObjectAllowedRule

We can then add a simple CloudFormation security group resource and test our code when it does, and does not, include a “GroupDescription” property.

cat single-sg.json

{
  "Resources": {
    "my_sg": {
      "Type": "AWS::EC2::SecurityGroup",
      "Properties": {
        "GroupDescription": "some_group_desc",
        "SecurityGroupIngress": {
          "CidrIp": "10.1.2.3/32",
          "FromPort": 34,
          "ToPort": 34,
          "IpProtocol": "tcp"
        },
        "VpcId": "vpc-12345678"
      }
    }
  }
}

If you run cfn_nag over that template then you shouldn’t see our new rule mentioned. Now go back and remove the GroupDescription line and run it again.
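
Using the single-file form from earlier, that second run looks like this:

cfn_nag --input-json-path single-sg.json

and this time the output includes our new rule: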

| FAIL
|
| Resources: ["my_sg"]
|
| Security group does not have a description

It’s quite early days for the project and there are a few gaps in functionality (controlling which rule sets to apply and easier addition of custom rules are the two I’d most like to see), but considering how easy it is to install and run cfn-nag over your templates I think it’s well worth giving your code an occasional once-over with a second pair of (automated) eyes. I don’t think I’d add it to my build/deploy pipelines until it addresses that missing functionality, but as a small automated code review helper I can see it being quite handy.

by dwilson@unixdaemon.net (Dean Wilson) at May 02, 2016 05:27 PM

Google Webmasters

How we fought webspam in 2015


Search is a powerful tool. It helps people to find, share, and access an amazing wealth of content regardless of how they connect or where they are located. As part of Google’s search quality team, we work hard to ensure that searchers see high quality search results—and not webspam. We fight spam through a combination of algorithms and manual reviews to ensure that sites don’t rise in search results through deceptive or manipulative behavior, especially because those sites could harm or mislead users.

Below are some of the webspam insights we gathered in 2015, including trends we’ve seen, what we’re doing to fight spam and protect against those trends, and how we’re working with you to make the web better.


2015 webspam trends

  • We saw a huge number of websites being hacked – a 180% increase compared to the previous year. Stay safe online and take preventative measures to protect your content.

  • We saw an increase in the number of sites with thin, low quality content. Such content contains little or no added value and is often scraped from other sites.


2015 spam-fighting efforts

  • As always, our algorithms addressed the vast majority of webspam and search quality issues for users. One of our algorithmic updates helped reduce the amount of hacked spam in search results.

  • The rest of the spam was tackled manually. We sent more than 4.3 million messages to webmasters to notify them of manual actions we took on their sites and to help them identify the issues.

  • We saw a 33% increase in the number of sites that went through spam clean-up efforts towards a successful reconsideration process.


Working with users and webmasters for a better web

  • More than 400,000 spam reports were submitted by users around the world. After prioritizing the reports, we acted on 65% of them, and considered 80% of those acted upon to be spam. Thanks to all who submitted reports and contributed towards a cleaner web ecosystem!

  • We conducted more than 200 online office hours and live events around the world in 17 languages. These are great opportunities for us to help webmasters with their sites and for them to share helpful feedback with us as well.

  • The webmaster help forum continued to be an excellent source of webmaster support. Webmasters had tens of thousands of questions answered, including over 35,000 by users designated as Webmaster Top Contributors. Also, 56 Webmaster Top Contributors joined us at our Top Contributor Summit to discuss how to provide users and webmasters with better support and tools. We’re grateful for our awesome Top Contributors and their tremendous contributions!

We’re continuously improving our spam-fighting technology and working closely with webmasters and users to foster and support a high-quality web ecosystem. (In fact, fighting webspam is one of the many ways we maintain search quality at Google.) Thanks for helping to keep spammers away so users can continue accessing great content in Google Search.


by Google Webmaster Central (noreply@blogger.com) at May 02, 2016 12:25 PM

May 01, 2016

UnixDaemon

Terraform Modules - My Sharing Wishlist

I’ve been writing a few Terraform modules recently with the aim of sharing them among a few different teams and there are a couple of things missing that I think would make reusable modules much more powerful.

The first and more generic issue is the inability to use more complex data structures. After you’ve spent a while using Terraform with AWS resources you’ll develop the urge to just create a hash of tags and use it nearly everywhere, ideally with the ability to override a key/value or two when actually using the hash. If your teams are using tags, and you really should be, it’s very hard to write a reusable module if the tag names in use by each team are not identical. Because you can only (currently) pass strings around, and you’re unable to use a variable as a tag name, you’re stuck with requiring everyone to use exactly the same tag names or not providing any at all. There’s no middle ground available.

tags {
    "${var.foo}" = "Baz"
}

# creates a Tag called literally '${var.foo}'
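
Interpolation does work on the value side, so the closest workaround is to hardcode the key names and only let the values vary; a minimal sketch (the variable name and default here are hypothetical):

variable "team" {
    default = "platform" # hypothetical default
}

tags {
    # the key "Team" is fixed; only the value comes from a variable
    Team = "${var.team}"
}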

My second current pain point, and the one I’m more likely to have missed a solution to, is the lack of a way to conditionally add or remove resource attributes. The most recent time this has bitten me is when trying to generalise a module that uses Elastic Load Balancers. Sometimes you’ll want an ELB with a cert and sometimes you won’t. Using the current module system there’s no way to handle this case.

If I was to do the same kind of thing in CloudFormation I’d use the AWS::NoValue pseudo parameter.

    "DBSnapshotIdentifier" : {
        "Fn::If" : [
            "UseDBSnapshot",
                {"Ref" : "DBSnapshotName"},
                {"Ref" : "AWS::NoValue"}
        ]
    }

If DBSnapshotName has a value the DBSnapshotIdentifier property is present and set to that value. If it’s not defined then the property is not set on the resource.
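
For completeness, the UseDBSnapshot condition referenced above would be declared in the template’s Conditions section along these lines (a sketch of the standard pattern):

    "Conditions": {
        "UseDBSnapshot": {
            "Fn::Not": [
                {"Fn::Equals": [{"Ref": "DBSnapshotName"}, ""]}
            ]
        }
    }

Giving the DBSnapshotName parameter an empty-string default then makes the snapshot genuinely optional.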

As an aside, after chatting with @andrewoutloud, it’s probably worth noting that you can make entire resources optional by using a count and setting it to 0 when you don’t want the resource to be included. While this is handy and worth having in your Terraform toolkit, it doesn’t cover my use case.

variable "include_rds" {
    default = 0
    description = "Should we include an aws_db_instance? Set to 1 to include it"
}

resource "aws_db_instance" "default" {
    count = "${var.include_rds}" # this serves as an if

    # ... snip ...
}
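
The toggle can then be flipped per environment without editing the module, e.g. on the command line:

terraform plan -var 'include_rds=1'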

I’m sure these annoyances will be ironed out in time but it’s worth considering them and how they’ll impact the reusability of any modules you’d like to write or third party code you’d want to import. At the moment it’s a hard choice between rewriting everything for my own use and getting all the things I need or vendoring everything in and maintaining a branch with things like my own tagging scheme and required properties.

by dwilson@unixdaemon.net (Dean Wilson) at May 01, 2016 04:27 PM

April 30, 2016

SysAdmin1138

Résumé of failure

There has been a Thing going around Twitter lately, about a Princeton prof who posted a résumé of failures.

About that...

This is not a bad idea, especially for those of us in Ops or bucking for 'Senior' positions. Why? Because in my last job hunt, a very large percentage of interviewers asked a question like this:

Please describe your biggest personal failure, and what you learned from it?

That's a large scope. How to pick which one?

What was your biggest interpersonal failure, and how did you recover from it?

In a 15+ year career, figuring out which is 'biggest' is a challenge. But first, I need to remember what they are. For this one, I've been using events that happened around 2001; far enough back that they don't really apply to the person I am now. This is going to be a problem soon.

What was your biggest production-impacting failure, and how was the post-mortem process handled?

Oh, do I go for 'most embarrassing,' or, 'most educational'? I'd snark about 'so many choices', but my memory tends to wallpaper over the more embarrassing fails in ways that make remembering them during an interview next to impossible. And in this case, the 'post-mortem process' bit at the end actually rules out my biggest production-impacting problem... there wasn't a post-mortem, other than knowing looks of, you're not going to do that again, right?

Please describe your biggest failure of working with management on something.

Working in service-organizations as long as I have, I have a lot of 'failure' here. Again, picking the right one to use in an interview is a problem.

You begin to see what I'm talking about here. If I had realized that my failures would be something I needed to both keep track of and keep adequate notes on, so I could refer back to them 3, 5, 9, or 14 years down the line, I would have been much better prepared for these interviews. The interviewers are probing how I behave when Things Are Not Going Right, since that sheds far more light on a person than Things Are Going Perfectly projects.

A Résumé of Failure would have been an incredibly useful thing to have. Definitely do not post it online, since hiring managers are looking for rule-outs to thin the pile of applications. But keep it next to your copy of your resume, next to your References from Past Managers list.

by SysAdmin1138 at April 30, 2016 01:53 PM

April 28, 2016

Everything Sysadmin

Have you downloaded the March/April issue of acmqueue yet?

The March/April issue of acmqueue - the magazine written for and by software engineers that leaves no corner of the development world unexplored - is now available for download.

This issue contains a preview of a chapter from our next book, the 3rd edition of TPOSANA. The chapter is called "The Small Batches Principle". We are very excited to be able to bring you this preview and hope you find the chapter fun and educational. The book won't be out until Oct 7, 2016, so don't miss this opportunity to read it early!

The bimonthly issues of acmqueue are free to ACM Professional members. (A one-year subscription costs $19.99 for non-ACM members.) You can also buy a single issue.

by Tom Limoncelli at April 28, 2016 02:27 PM

Google Blog

This year’s Founders' Letter

Every year, Larry and Sergey write a Founders' Letter to our stockholders updating them with some of our recent highlights and sharing our vision for the future. This year, they decided to try something new. - Ed.

In August, I announced Alphabet and our new structure and shared my thoughts on how we were thinking about the future of our business. (It is reprinted here in case you missed it, as it seems to apply just as much today.) I’m really pleased with how Alphabet is going. I am also very pleased with Sundar’s performance as our new Google CEO. Since the majority of our big bets are in Google, I wanted to give him most of the bully-pulpit here to reflect on Google’s accomplishments and share his vision. In the future, you should expect that Sundar, Sergey and I will use this space to give you a good personal overview of where we are and where we are going. 

- Larry Page, CEO, Alphabet



When Larry and Sergey founded Google in 1998, there were about 300 million people online. By and large, they were sitting in a chair, logging on to a desktop machine, typing searches on a big keyboard connected to a big, bulky monitor. Today, that number is around 3 billion people, many of them searching for information on tiny devices they carry with them wherever they go.

In many ways, the founding mission of Google back in ’98—“to organize the world’s information and make it universally accessible and useful”—is even truer and more important to tackle today, in a world where people look to their devices to help organize their day, get them from one place to another, and keep in touch. The mobile phone really has become the remote control for our daily lives, and we’re communicating, consuming, educating, and entertaining ourselves, on our phones, in ways unimaginable just a few years ago.

Knowledge for everyone: search and assistance
As we said when we announced Alphabet, “the new structure will allow us to keep tremendous focus on the extraordinary opportunities we have inside of Google.” Those opportunities live within our mission, and today we are about one thing above all else: making information and knowledge available for everyone.

This of course brings us to Search—the very core of this company. It’s easy to take Search for granted after so many years, but it’s amazing to think just how far it has come and still has to go. I still remember the days when 10 bare blue links on a desktop page helped you navigate to different parts of the Internet. Contrast that to today, where the majority of our searches come from mobile, and an increasing number of them via voice. These queries get harder and harder with each passing year—people want more local, more context-specific information, and they want it at their fingertips. So we’ve made it possible for you to search for [Leonardo DiCaprio movies] or [Zika virus] and get a rich panel of facts and visuals. You can also get answers via Google Now—like the weather in your upcoming vacation spot, or when you should leave for the airport—without you even needing to ask the question.

Helping you find information that gets you through your day extends well beyond the classic search query. Think, for example, of the number of photos you and your family have taken throughout your life, all of your memories. Collectively, people will take 1 trillion photos this year with their devices. So we launched Google Photos to make it easier for people to organize their photos and videos, keep them safe, and be able to find them when they want to, on whatever device they are using. Photos launched less than a year ago and already has more than 100 million monthly active users. Or take Google Maps. When you ask us about a location, you don’t just want to know how to get from point A to point B. Depending on the context, you may want to know what time is best to avoid the crowds, whether the store you’re looking for is open right now, or what the best things to do are in a destination you’re visiting for the first time.

But all of this is just a start. There is still much work to be done to make Search and our Google services more helpful to you throughout your day. You should be able to move seamlessly across Google services in a natural way, and get assistance that understands your context, situation, and needs—all while respecting your privacy and protecting your data. The average parent has different needs than the average college student. Similarly, a user wants different help when in the car versus the living room. Smart assistance should understand all of these things and be helpful at the right time, in the right way.

The power of machine learning and artificial intelligence
A key driver behind all of this work has been our long-term investment in machine learning and AI. It’s what allows you to use your voice to search for information, to translate the web from one language to another, to filter the spam from your inbox, to search for “hugs” in your photos and actually pull up pictures of people hugging ... to solve many of the problems we encounter in daily life. It’s what has allowed us to build products that get better over time, making them increasingly useful and helpful.

We’ve been building the best AI team and tools for years, and recent breakthroughs will allow us to do even more. This past March, DeepMind’s AlphaGo took on Lee Sedol, a legendary Go master, becoming the first program to beat a professional at the most complex game mankind ever devised. The implications for this victory are, literally, game changing—and the ultimate winner is humanity. This is another important step toward creating artificial intelligence that can help us in everything from accomplishing our daily tasks and travels, to eventually tackling even bigger challenges like climate change and cancer diagnosis.

More great content, in more places
In the early days of the Internet, people thought of information primarily in terms of web pages. Our focus on our core mission has led us to many efforts over the years to improve discovery, creation, and monetization of content—from indexing images, video, and the news, to building platforms like Google Play and YouTube. And with the migration to mobile, people are watching more videos, playing more games, listening to more music, reading more books, and using more apps than ever before.

That’s why we have worked hard to make YouTube and Google Play useful platforms for discovering and delivering great content from creators and developers to our users, when they want it, on whatever screen is in front of them. Google Play reaches more than 1 billion Android users. And YouTube is the number-one destination for video—over 1 billion users per month visit the site—and ranks among the year’s most downloaded mobile apps. In fact, the amount of time people spend watching videos on YouTube continues to grow rapidly—and more than half of this watchtime now happens on mobile. As we look to the future, we aim to provide more choice to YouTube fans—more ways for them to engage with creators and each other, and more ways for them to get great content. We’ve started down this journey with specialized apps like YouTube Kids, as well as through our YouTube Red subscription service, which allows fans to get all of YouTube without ads, a premium YouTube Music experience and exclusive access to new original series and movies from top YouTube creators like PewDiePie and Lilly Singh.

We also continue to invest in the mobile web—which is a vital source of traffic for the vast majority of websites. Over this past year, Google has worked closely with publishers, developers, and others in the ecosystem to help make the mobile web a smoother, faster experience for users. A good example is the Accelerated Mobile Pages (AMP) project, which we launched as an open-source initiative in partnership with news publishers, to help them create mobile-optimized content that loads instantly everywhere. The other example is Progressive Web Apps (PWA), which combine the best of the web and the best of apps—allowing companies to build mobile sites that load quickly, send push notifications, have home screen icons, and much more. And finally, we continue to invest in improving Chrome on mobile—in the four short years since launch, it has just passed 1 billion monthly active users on mobile.

Of course, great content requires investment. Whether you’re talking about Google’s web search, or a compelling news article you read in The New York Times or The Guardian, or watching a video on YouTube, advertising helps fund content for millions and millions of people. So we work hard to build great ad products that people find useful—and that give revenue back to creators and publishers.

Powerful computing platforms
Just a decade ago, computing was still synonymous with big computers that sat on our desks. Then, over just a few years, the keys to powerful computing—processors and sensors—became so small and cheap that they allowed for the proliferation of supercomputers that fit into our pockets: mobile phones. Android has helped drive this scale: it has more than 1.4 billion 30-day-active devices—and growing.

Today’s proliferation of “screens” goes well beyond phones, desktops, and tablets. Already, there are exciting developments as screens extend to your car, like Android Auto, or your wrist, like Android Wear. Virtual reality is also showing incredible promise—Google Cardboard has introduced more than 5 million people to the incredible, immersive and educational possibilities of VR.

Looking to the future, the next big step will be for the very concept of the “device” to fade away. Over time, the computer itself—whatever its form factor—will be an intelligent assistant helping you through your day. We will move from mobile first to an AI first world.

Enterprise
Most of these computing experiences are very likely to be built in the cloud. The cloud is more secure, more cost effective, and it provides the ability to easily take advantage of the latest technology advances, be it more automated operations, machine learning, or more intelligent office productivity tools.

Google started in the cloud and has been investing in infrastructure, data management, analytics, and AI from the very beginning. We now have a broad and growing set of enterprise offerings: Google Cloud Platform (GCP), Google Apps, Chromebooks, Android, image recognition, speech translation, maps, machine learning for customers’ proprietary data sets, and more. Our customers like Whirlpool, Land O’Lakes and Spotify are transforming their businesses by using our enterprise productivity suite of Google Apps and Google Cloud Platform services.

As we look to our long-term investments in our productivity tools supported by our machine learning and artificial intelligence efforts, we see huge opportunities to dramatically improve how people work. Your phone should proactively bring up the right documents, schedule and map your meetings, let people know if you are late, suggest responses to messages, handle your payments and expenses, etc.

Building for everyone
Whether it’s a developer using Google Cloud Platform to power their new application, or a creator finding new income and viewers via YouTube, we believe in leveling the playing field for everyone. The Internet is one of the world’s most powerful equalizers, and we see it as our job to make it available to as many people as possible.

This belief has been a core Google principle from the very start—remember that Google Search was in the hands of millions long before the idea for Google advertising was born. We work on advertising because it’s what allows us to make our services free; Google Search works the same for anyone with an Internet connection, whether it is in a modern high-rise or a rural schoolhouse.

Making this possible is a lot more complicated than simply translating a product or launching a local country domain. Poor infrastructure keeps billions of people around the world locked out of all of the possibilities the web may offer them. That’s why we make it possible for there to be a $50 Android phone, or a $100 Chromebook. It’s why this year we launched Maps with turn-by-turn navigation that works even without an Internet connection, and made it possible for people to get faster-loading, streamlined Google Search if they are on a slower network. We want to make sure that no matter who you are or where you are or how advanced the device you are using ... Google works for you.

In all we do, Google will continue to strive to make sure that remains true—to build technology for everyone. Farmers in Kenya use Google Search to keep up with crop prices and make sure they can make a good living. A classroom in Wisconsin can take a field trip to the Sistine Chapel ... just by holding a pair of Cardboard goggles. People everywhere can use their voices to share new perspectives, and connect with others, by creating and watching videos on YouTube. Information can be shared—knowledge can flow—from anyone, to anywhere. In 17 years, it’s remarkable to me the degree to which the company has stayed true to our original vision for what Google should do, and what we should become.

For us, technology is not about the devices or the products we build. Those aren’t the end-goals. Technology is a democratizing force, empowering people through information. Google is an information company. It was when it was founded, and it is today. And it’s what people do with that information that amazes and inspires me every day.

by Google Blogs (noreply@blogger.com) at April 28, 2016 10:00 AM

Expeditions career tours can take kids to work, virtually

Soledad O’Brien is a broadcast journalist and founder of Starfish Media Group. She is also CEO of the Starfish Foundation, which provides financial assistance and mentoring to help kids go to college. Recently, the Starfish Foundation launched virtual career tours using Google Expeditions, which O’Brien joins us to talk about today. To become part of the Expeditions Pioneer beta program, sign up via this form. -Ed.

Kids dream about what they want to be when they grow up, but these dreams are often limited—built around the few professional people they know. What if children don’t know a veterinarian, an airplane pilot, a paleontologist, or someone in dozens of other careers? What if they lack access to internships or mentors? Can they ever dream big?

I know from watching my own kids visit me at work, and from the scholars I mentor, that exposure to all kinds of professionals is the key to inspiring young people. When I first found out about Expeditions, I saw its potential for broadening the horizons of the student scholars we help at Starfish Foundation. I envisioned creating virtual reality Expeditions that let kids step into someone’s work day, simply by using phones and Google Cardboard viewers. So that’s what we did.
Soledad O'Brien with scholars from the Starfish Foundation.
Working with the Google Expeditions team, we created virtual reality tours that show kids the ins and outs of careers they might not ever learn about otherwise. From flying an airplane to testing fossil samples, kids can see with their own eyes exactly what people do in many different scenarios. They can watch Carolyn Brown, director of surgery for the American Society for the Prevention of Cruelty to Animals, perform a procedure on a cat. Or join Mark Norell, a paleontology professor with the American Museum of Natural History, as he examines a velociraptor specimen up close. And today, schools participating in the Google Expeditions Pioneer Program and Expeditions beta will be able to go on an Expedition of the Google Mountain View campus to see what it’s like to work at Google.
A career Expedition featuring American Airlines pilot Pam Torell. The view is from the cockpit of one of her scheduled flights.
These Expeditions reveal what professionals like about their jobs, what they studied in school, and how they apply their knowledge to their work. Regular field trips are logistically challenging, and they don’t usually focus on careers. But with Expeditions, teachers can share an experience with students right in the classroom. You can’t fit 30 students in the cockpit of a plane, but you can get a virtual reality tour of one using Expeditions. And today, on “Take Your Kids to Work Day,” there’s no better time to get creative about exposing students to different types of jobs and workplace environments.

Children won’t know what jobs are possible if they don’t know the careers exist. Rather than just telling them, teachers can actually show them. With these career Expeditions, students can travel outside the classroom walls and be exposed to more ideas, places and opportunities than ever before.



(Cross-posted on the Google for Education Blog)


by Google Blogs (noreply@blogger.com) at April 28, 2016 09:09 AM

Ten years of Google Translate

Ten years ago, we launched Google Translate. Our goal was to break language barriers and to make the world more accessible. Since then we’ve grown from supporting two languages to 103, and from hundreds of users to hundreds of millions. And just like anyone’s first 10 years, we’ve learned to see and understand, talk, listen, have a conversation, write, and lean on friends for help.

But what we're most inspired by is how Google Translate connects people in communities around the world, in ways we never could have imagined—like two farmers with a shared passion for tomato farming, a couple discovering they're pregnant in a foreign country, and a young immigrant on his way to soccer stardom.

Here’s a look at Google Translate today, 10 years in:

1. Google Translate helps people make connections.
Translate can help people help each other, often in the most difficult of times. Recently we visited a community in Canada that is using Translate to break down barriers and make a refugee family feel more welcome:
2. There are more than 500 million of you using Google Translate.
The most common translations are between English and Spanish, Arabic, Russian, Portuguese and Indonesian.

3. Together we translate more than 100 billion words a day.
4. Translations reflect trends and events.
In addition to common phrases like “I love you,” we also see people looking for translations related to current events and trends. For instance, last year we saw a big spike in translations for the word "selfie,” and this past week, translations for "purple rain" spiked by more than 25,000 percent.

5. You’re helping to make Google Translate better with Translate Community.
So far, 3.5 million people have made 90 million contributions through Translate Community, helping us improve and add new languages to Google Translate. A few properly translated sentences can make a huge difference when faced with a foreign language or country. By reviewing, validating and recommending translations, we’re able to improve Google Translate on a daily basis.

6. Brazil uses Google Translate more than any other country.
Ninety-two percent of our translations come from outside of the United States, with Brazil topping the list.
7. You can see the world in your language.
Word Lens is your friend when reading menus, street signs and more. This feature in the Google Translate App lets you instantly see translations in 28 languages.
8. You can have a conversation no matter what language you speak.
In 2011, we first introduced the ability to have a bilingual conversation on Google Translate. The app will recognize which language is being spoken when you’re talking with someone, allowing you to have a natural conversation in 32 languages.

9. You don't need an Internet connection to connect.
Many countries don’t have reliable Internet, so it’s important to be able to translate on the go. You can instantly translate signs and menus offline with Word Lens on both Android and iOS, and translate typed text offline with Android.
10. There's always more to translate.
We’re excited and proud of what we’ve accomplished together over the last 10 years—but there’s lots more to do to break language barriers and help people communicate no matter where they’re from or what language they speak. Thank you for using Google Translate—here’s to another 10!

by Google Blogs (noreply@blogger.com) at April 28, 2016 07:00 AM

Administered by Joe. Content copyright by their respective authors.