Making Google Scripts & Google Sidebars Talk


flickr photo shared by Internet Archive Book Images with no copyright restriction (Flickr Commons)

Sometimes you want a Google Sidebar element to be able to pull variables from a Google Script. The key element is google.script.run.withSuccessHandler. I’m still not entirely sure I have this in my head deeply/correctly but this is functional and might help someone else (even if that someone is just me at a later point in time when I’ve forgotten this).

This one sets some default times based on spreadsheet variables stashed elsewhere and then automates some date/time entries using the form sidebar.

Sidebar Code

function onSuccess(startTime) {
  document.getElementById('startTime').value = startTime; //thing you want done after getting the variable
}

google.script.run.withSuccessHandler(onSuccess)
  .getDefaultStartTime(); //name of the Google Script function you want the variable from
          

Google Script Code

function getDefaultStartTime(){
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheetByName('validation');
  var am = sheet.getRange("C1").getValue(); //a date/time value stashed in the validation sheet
  var h = (am.getHours()<10?'0':'') + am.getHours(),
      m = (am.getMinutes()<10?'0':'') + am.getMinutes();
  var startHr = h + ':' + m;
  return startHr;
}

The entire sidebar/script setup is below in case anyone cares for some reason.

//adds menu item
function onOpen() {
  SpreadsheetApp.getUi() 
      .createMenu('Add Time')
      .addItem('Do it!', 'addTime')
      .addToUi();
}


//set validation elements 
function validationRighter(){
    var ss = SpreadsheetApp.getActiveSpreadsheet();
  
    var meetColumn = ss.getSheetByName('Sheet1').getRange('C2:C'); //meeting types
    var meetValidationValues = ss.getSheetByName('validation').getRange('A1:A40').getValues().map(function(row){ return row[0]; }); //flatten to a simple list of meeting types
    var meetRule = SpreadsheetApp.newDataValidation().requireValueInList(meetValidationValues).build();
    meetColumn.setDataValidation(meetRule);
  
    var appColumn = ss.getSheetByName('Sheet1').getRange('D2:D'); //application options
    var appValidationValues = ss.getSheetByName('validation').getRange('B1:B40').getValues().map(function(row){ return row[0]; }); //flatten to a simple list of application options
    var appRule = SpreadsheetApp.newDataValidation().requireValueInList(appValidationValues).build();
    appColumn.setDataValidation(appRule);
  
    var userColumn = ss.getSheetByName('Sheet1').getRange('E2:E'); //teacher signup options
    var userValidationValues = ss.getEditors().map(function(editor){ return editor.getEmail(); }); //editor email addresses as strings
    Logger.log(userValidationValues);
    var userRule = SpreadsheetApp.newDataValidation().requireValueInList(userValidationValues).build();
    userColumn.setDataValidation(userRule);
 
}


//add time sidebar creation
function addTime() {
  var html = HtmlService.createHtmlOutputFromFile('sidebar')
      .setTitle('Add Time')
      .setWidth(300);
  SpreadsheetApp.getUi() // Or DocumentApp or FormApp.
      .showSidebar(html);
}

//enter days in 15 min intervals
function enterDay(theDate,startTime,endTime) {
 var ss = SpreadsheetApp.getActiveSpreadsheet();
 var sheet = ss.getActiveSheet();
 var lastRow = parseInt(sheet.getLastRow())+1;
  
  var startH = startTime.split(':')[0];
  var endH = endTime.split(':')[0];
  var startM = startTime.split(':')[1];
  var d = new Date();
  d.setHours(startH);
  d.setMinutes(startM);  
  
  var totalHours = endH-startH;
    
  for (var i = 0; i < (totalHours*4)+1; i++) { 
    sheet.getRange("A"+(lastRow+i)).setValue(theDate);
    
    var mins = (d.getMinutes()<10?'0':'') + d.getMinutes();//make sure has leading 0
    sheet.getRange("B"+(lastRow+i)).setValue(d.getHours()+':'+mins);
    d = new Date(d.getTime() + (15*60000)); //adds 15 mins
  }
}

function hideAllRows(){
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getActiveSheet();
  var lastRow = sheet.getLastRow();
  sheet.hideRows(2, lastRow - 1); //hide rows 2 through the last data row so new entries land on a visible row
}


function processForm(formObject) {
  var theDate = formObject.theDate;
  var startTime = formObject.startTime;
  var endTime = formObject.endTime;
  var hideRows = formObject.hider;
  Logger.log(hideRows);
  if (hideRows === 'on'){
    hideAllRows();
  }
  
  enterDay(theDate, startTime, endTime);
  validationRighter();
  
}

function getDefaultStartTime(){
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheetByName('validation');
  var am = sheet.getRange("C1").getValue();
   var h = (am.getHours()<10?'0':'') + am.getHours(),
       m = (am.getMinutes()<10?'0':'') + am.getMinutes();
   var startHr = h + ':' + m;
   return startHr;
}

function getDefaultEndTime(){
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheetByName('validation');
  var pm = sheet.getRange("C2").getValue();
   var h = (pm.getHours()<10?'0':'') + pm.getHours(),
       m = (pm.getMinutes()<10?'0':'') + pm.getMinutes();
  var endHr = h + ':' + m;
  Logger.log(endHr);
  return endHr;
}






(The sidebar form itself: a “Pick Your Date” date field, “Start/End Times” inputs, and a “Hide Previous Dates” checkbox.)
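The HTML for that sidebar isn’t reproduced above, but a minimal sketch of it, assuming field names that match what processForm and the default-time functions expect, would look something like this:

<!-- sidebar.html sketch - field names match what processForm expects -->
<form id="timeForm">
  Pick Your Date
  <input type="date" name="theDate">

  Start/End Times
  <input type="time" name="startTime" id="startTime">
  <input type="time" name="endTime" id="endTime">

  Hide Previous Dates
  <input type="checkbox" name="hider">

  <input type="button" value="Do it!"
    onclick="google.script.run.processForm(document.getElementById('timeForm'))">
</form>

<script>
  // ask the server for the stored default times, then drop them into the inputs
  function onSuccess(startTime) {
    document.getElementById('startTime').value = startTime;
  }
  function onEndSuccess(endTime) {
    document.getElementById('endTime').value = endTime;
  }
  google.script.run.withSuccessHandler(onSuccess).getDefaultStartTime();
  google.script.run.withSuccessHandler(onEndSuccess).getDefaultEndTime();
</script>

Passing the whole form element to processForm is what turns the named inputs into formObject.theDate, formObject.startTime, and so on over on the server side.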




Interactive Google Sheets Dashboards


It’s pretty easy to put a bunch of data and charts in a spreadsheet and call it a dashboard. It became a more interesting challenge to make those charts change to reflect variables chosen via dropdown cell menus.

The key it turns out is using =query.

I can do some really powerful things with query, and as long as the data bounds (columns/rows) stay the same, I can change the content and it’ll replicate in the chart assigned to those columns/rows.

For instance, I can have raw data on a sheet called data.

I can use query on that data on another sheet with something like =query(data!A1:N,"select B,C,D,E,F,G,H,I,J,K,L,M,N where A="&"'"&A2&"'")

That formula goes to the sheet named data and queries cells A1 through N (whatever the last row is), selecting columns B through N where the content of column A matches the text of cell A2 on the local page. As a result we can manipulate the contents of cell A2 and change the data being returned. In this case I added a little data validation dropdown to restrict it to the three items for which we have data. Do take note that for numbers I could have just appended the cell reference (&A2 in this case), but because it was a text match I had to staple single quotes on either side of the value ("&"'"&A2&"'"). It took me a while to realize that. Hopefully this saves someone else from the hassle.
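Side by side, the number version and the text version of that match look something like this:

=query(data!A1:N,"select B,C,D,E,F,G,H,I,J,K,L,M,N where A="&A2)          //works when column A holds numbers
=query(data!A1:N,"select B,C,D,E,F,G,H,I,J,K,L,M,N where A="&"'"&A2&"'")  //text matches need the stapled-on single quotes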

Once you’ve got the data you want showing up, you can select it and insert the chart. Now changing your drop down variable changes your chart because it changes the data it’s based off of. Neat.

You’ll note I stuck the query data way off to the right. Turns out hidden columns won’t display chart data.

The actual spreadsheet is embedded below. You should be able to select different values in cells A2 and B2 and see the subsequent change in charts.

Spreadsheet Karma

=QUERY({C:C,C:C},"select Col1, count(Col2) where Col1 <> '' group by Col1",1)

This is just so handy for getting all the unique values from a column and spitting them out with the counts of their occurrences.1

I can’t tell you how many times I’ve done this a much harder way.

I saw the function above as an answer after I’d responded to a Stack Overflow question about getting unique word counts.

Just another little example of how doing things in the open and being involved in communities ends up benefitting you in unexpected ways.


flickr photo shared by New York Public Library with no copyright restriction (Flickr Commons)


1 If your data isn’t in column C, just change it to the right letter.

Email Support Data

I thought it might be interesting to look at my last chunk of emails by what hour they were answered.

It does seem to show some patterns. Some that may not be healthy for any real separation between on/off work hours.

It is good that I’ve got no email in the 1AM to 5AM zone.

It is bad (probably) that I answer a large amount of email around 9PM. I probably also need to think harder about creating some defined email response windows so that email is not interrupting work that requires more focus and concentration.

It’s easy to let the busy work drag you down while feeling “productive.”


flickr photo shared by Tyne & Wear Archives & Museums with no copyright restriction (Flickr Commons)

Re-Scraping Instagram

Back when Instagram’s API rules didn’t completely suck, I wrote a few posts on scraping it so that some of our faculty could use those data in their research.

Then all their rules changed and everything broke. That’s their prerogative but it’s also my option to complain about it.

But because I posted about it, I got a comment from raiym1 who let me know he wrote a PHP scraper that avoided the API limitations.

I’ve now got that up and running and set up a simple GET so that the URL determines the tagged content that gets returned. The PHP for that page is below and lets you replace the API URL in the old Google Scripts with a new URL like http://bionicteaching.com/creations/ig/scrape.php?tag=fish

<?php

require_once 'vendor/autoload.php';

use InstagramScraper\Instagram;

error_reporting(E_ALL);
ini_set("display_errors", 1);

$tag = $_GET['tag'];

$medias = Instagram::getMediasByTag($tag, 20); //sets the number of results returned
echo json_encode($medias, JSON_UNESCAPED_SLASHES | JSON_PRETTY_PRINT); //the second argument doesn't do what I want
?>
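
On the Google Script side, pointing at that new URL is a pretty plain fetch. This is just a sketch; the field names (link, caption) are the ones the display code further down uses:

//sketch: pull the scraped JSON into a Google Script
function getTaggedMedia() {
  var url = 'http://bionicteaching.com/creations/ig/scrape.php?tag=fish';
  var medias = JSON.parse(UrlFetchApp.fetch(url).getContentText());
  for (var i = 0; i < medias.length; i++) {
    Logger.log(medias[i].link + ' | ' + medias[i].caption);
  }
}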

You can then make your own custom displays based on that. I made a quick custom page template for the artfulness WP theme (currently showing filler data from the exciting ‘fish’ tag).


This example has the tag hardcoded in but could easily use a custom field to pass the value.

<?php
/**
 * Template Name: INSTA GRAMMMMMMzzzz
 * The template for displaying something on the front page.
 *
 * This is the template that displays instagram stuff
 * Please note that this is the WordPress construct of pages and that
 * other "pages" on your WordPress site will use a different template.
 *
 * @package H-Code
 */
get_header(); ?>

<?php
// Start the loop.
while ( have_posts() ) : the_post();

    $layout_settings = $enable_container_fluid = $class_main_section = $section_class = '';
    
    $hcode_options = get_option( 'hcode_theme_setting' );
    
    $layout_settings = hcode_option('hcode_layout_settings');
    $enable_container_fluid = hcode_option('hcode_enable_container_fluid');
    
    switch ($layout_settings) {
        case 'hcode_layout_full_screen':
            $section_class .= 'no-padding';
            if(isset($enable_container_fluid) && $enable_container_fluid == '1'){
                $class_main_section .= 'container-fluid';
            }else{
                $class_main_section .= 'container';
            }
        break;

        case 'hcode_layout_both_sidebar':
            $section_class .= '';
            $class_main_section .= 'container col3-layout';
        break;

        case 'hcode_layout_left_sidebar':
        case 'hcode_layout_right_sidebar':
            $section_class .= '';
            $class_main_section .= 'container col2-layout';
        break;

        default:
            $section_class .= '';
            $class_main_section .= 'container';
        break;
    }
    //echo $section_class.'test';
?>

<!-- THIS IS THE INSTAGRAM PART -->
<section class="parent-section <?php echo $section_class; ?>">
    <div class="<?php echo $class_main_section; ?>">
        <div class="row">
            <script type="text/javascript">                
        jQuery(document).ready(function($){ //pass $ in so the $ shorthand below works with WordPress noConflict mode
          console.log("ready"); //just to check in the firebug console.
          
        jQuery.ajax({
          url:'https://YOURLINKTOTHESCRIPT/extras/ig/scrape.php?tag=fish',//also could be put into theme or plugin
          jsonp:"cb",
          dataType:'json',
          success: function(data) {
            console.log(data); //dumps the data to the console to check if the callback is made successfully.
            $.each(data, function(index, item){
                           
                $('#insta').append('<figure><a href="' + item.link+ '"><div class="gram gallery-img col-md-4" style="background-image: url('+item.imageThumbnailUrl+'); height:400px;"><figcaption>' + item.caption + '</figcaption></div></a></figure>');              
              
              }); //each
            } //success
          }); //ajax
          
        });//ready
  </script>
<div class="col-md-12 grid-gallery overflow-hidden content-section" id="insta"></div>
        </div>
    </div>
</section>
<?php 
endwhile;
// End the loop.
?>
<?php get_footer(); ?>


1 On this post. And apparently this theme doesn’t support direct links to comments. About time I wrote my own theme . . .

Google Script to Copy Row Above to Blank Row Below


I had a spreadsheet that entered blank cells when there was more than one admin for a WordPress site.

So if Site_1 had two admins, I’d get two rows of data. The first row for the site would have-

siteURL | siteTitle | siteAdmin

but the second row for that site would have something like-

< blank > | < blank > | siteAdmin

I started to just drag down and fill but there was lots of data and it just felt like something for the machine to do.

The following Google Script did it for me in a few seconds despite a couple thousand rows of data. Not rocket science but it might be handy for someone and it was a really convenient example of variables and loops when I had a conversation with my son last night.

function myFunction() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getActiveSheet();
  var last = sheet.getLastRow();//how many times should I do this?

  for (var i = 2; i <= last; i++) { //start at row 2 (so there's always a row above) and include the last row
    var test = sheet.getRange(i, 2);
    Logger.log(test);
  //looks to see if the cell is empty
    if (test.isBlank()) {
      var rewind = sheet.getRange(i-1, 2, 1, 2).getValues();//gets values from the row above
      sheet.getRange(i, 2, 1, 2).setValues(rewind);//sets the current range to the row above
    } 
  
  }
}

Grabbing JSON


flickr photo shared by Library Company of Philadelphia with no copyright restriction (Flickr Commons)

What I wanted to do was grab data from the WordPress API and use that to provision chunks of my new portfolio site. The portfolio is hosted on GitHub and GitHub is HTTPS. At the moment my bionicteaching site is not HTTPS.1 That causes problems as secure and insecure are not friends. I wanted a quick and easy solution so I could continue until I do the HTTPS switch.

The following is how I wandered towards a solution. A number of these things worked but didn’t quite do what I wanted, so they’re worth remembering/documenting for later, and it’s kind of fun to see a mix of javascript, php, url manipulation, the Google API, and the WordPress V2 API all in one little bit of wandering.

My first thought was to grab the JSON via a Google Script and store it in Google Drive. I can do that but can’t seem to make it available for use the way I want. I tried messing with various URL parameters but it wasn’t flowing and I only started there because I thought it would be easy.

var urlOfTheJson = 'http://bionicteaching.com/wp-json/wp/v2/posts/?filter[category_name]=photography&filter[posts_per_page]=30';// URL for the JSON
var folderId = 'gobbledygookthatyouwantoreplace';//folder ID

function saveInDriveFolder(){
  var folder = DriveApp.getFolderById(folderId);// get the folder
  var file = UrlFetchApp.fetch(urlOfTheJson).getBlob().setName('weeklypics.json'); // get the file content as a named blob
  Logger.log(file);
  folder.createFile(file);//create the file directly in the folder
}

I did eventually get the file accessible in DropBox (the only other place I could think of immediately for https file storage) but setting up OAuth etc. to automate putting it there seemed more hassle than it was worth for a short-time solution. For the record, you can make a JSON file accessible for this kind of use. To do so, get your DropBox share URL (the one from the link icon) and change the first portion as shown below.

https://www.dropbox.com/s/0snpxj1b6wpe267/weeklypics.json?dl=0  //doesn't work
https://dl.dropboxusercontent.com/s/0snpxj1b6wpe267/weeklypics.json?dl=0  //works

With a tiny bit more thought, I opted to use my local development environment (Vagrant) to run a simple PHP file (below) which’ll save the JSON files to the directory that’s already synched to GitHub.

<?php 
$urls = [
"http://bionicteaching.com/wp-json/wp/v2/posts/?filter[category_name]=photography&filter[posts_per_page]=30", 
"http://bionicteaching.com/wp-json/wp/v2/posts/?filter[tag]=javascript&filter[posts_per_page]=10",
"http://bionicteaching.com/wp-json/wp/v2/posts/?filter[tag]=angular&filter[posts_per_page]=10",
"http://bionicteaching.com/wp-json/wp/v2/posts/?filter[tag]=plugin&filter[posts_per_page]=10",
"http://bionicteaching.com/wp-json/wp/v2/posts/?filter[category_name]=wordpress&filter[posts_per_page]=10"];

$fileNames = [
"weeklyPhotos",
"javascriptPosts",
"angularPosts",
"wpPluginsPosts",
"wordpressPosts"
];

for ($x = 0; $x < count($urls); $x++){
	$url = $urls[$x];
	$file = file_get_contents($url);
	file_put_contents('woodwardtw.github.io/json/'.$fileNames[$x].'.json', $file);
}

?>
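
On the portfolio side, those saved files can then be pulled in with plain client-side JavaScript. A rough sketch, assuming the json folder ends up at the root of the GitHub Pages site:

//sketch: load one of the saved JSON files and log the post titles/links
fetch('/json/weeklyPhotos.json')
  .then(function (response) { return response.json(); })
  .then(function (posts) {
    posts.forEach(function (post) {
      console.log(post.title.rendered + ' - ' + post.link); //standard WP API v2 post fields
    });
  });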

1 It is not even https

YouTube View Count to Tweet


I was just messing around a bit more with mashing up the YouTube API post from earlier with the Twitter post from earlier today.

This Google Script will get the view count of a particular video (Gangnam Style) and send out the total views and the time elapsed watching the video (assuming I did the math right). It has no real purpose (other than the scale of the number amuses me) but it’s one of those things with concepts that I’ll be able to apply more usefully in the future. That’s a gut feeling but I’m pretty confident.

Don’t forget to add the Twitter library as directed here.1
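
A sketch along those lines is below. The tweet-sending step depends on how the Twitter library from the footnote is configured, so it’s only noted in a comment, and the video length is a rough assumption for the elapsed-time math:

//sketch: grab the view count and build the status text
function tweetViewCount() {
  var apiKey = 'YOUR_API_KEY'; //YouTube Data API key
  var videoId = '9bZkp7q19f0'; //Gangnam Style
  var secondsPerView = 253;    //rough video length, for the time-elapsed math
  var url = 'https://www.googleapis.com/youtube/v3/videos?part=statistics&id=' + videoId + '&key=' + apiKey;
  var json = JSON.parse(UrlFetchApp.fetch(url).getContentText());
  var views = parseInt(json.items[0].statistics.viewCount, 10);
  var years = Math.round((views * secondsPerView) / (60 * 60 * 24 * 365));
  var status = 'Gangnam Style: ' + views + ' views . . . roughly ' + years + ' years of combined watching.';
  Logger.log(status);
  //hand status off to the Twitter library referenced in the footnote to actually send the tweet
}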


1 in the Apps Script project, include the library MKvHYYdYA4G5JJHj7hxIcoh8V4oX7X1M_ with the identifier set to Twitter

YouTube API to Google Spreadsheets

Because I love Alan. Here’s the API version in Google Script to grab YouTube stats. It does a bit more than the previous XPath version and you can set it to be triggered repeatedly. I’m going to add a loop to add multiple videos etc. in the near future but it’s a good start for anyone who’s doing research on stuff like this.

It is funny what you might notice when you can see the data like this. I triggered it manually twice just to get a few lines in there. Notice that between the first two entries there are no additional views but a chunk more likes/dislikes. Makes me wonder if people are just weighing in without watching or if the data are collected differently resulting in some delay.

Here’s the script1 and it’s pretty well commented up. You’ll need an API key. 🙂
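
In case the embed ever disappears, here’s a stripped-down sketch of the same kind of thing: fetch the statistics for one video with an API key and append a row each run, so a time-driven trigger builds up the history. The field names come from the YouTube Data API’s statistics object:

//sketch: log a video's stats to the active sheet each time it runs
function logVideoStats() {
  var apiKey = 'YOUR_API_KEY';
  var videoId = 'YOUR_VIDEO_ID';
  var url = 'https://www.googleapis.com/youtube/v3/videos?part=statistics&id=' + videoId + '&key=' + apiKey;
  var json = JSON.parse(UrlFetchApp.fetch(url).getContentText());
  var stats = json.items[0].statistics; //the [0] mentioned in the footnote
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  sheet.appendRow([new Date(), stats.viewCount, stats.likeCount, stats.dislikeCount, stats.favoriteCount]);
}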

You do see some weird stuff in the raw JSON. Like there’s a Favorites field. Does that exist in YouTube? I didn’t really think about it until it came up 0 for every video . . . even Gangnam Style.

Here’s the result running every hour on a video that I’m hoping changes a bit. I got it off the trending page so it has to be cool right?


1 It took me a good while to realize I need to do the [0] piece to navigate the first element. My best friend is Logger.log(yourVariable) in Google Script. When I finally got methodical enough and used it to spit out the variable results at each level, I was able to figure it out.

YouTube Scraping, XPath, and Google Sheets

APIs can give you much more power, but they’re often overkill for the lightweight social media stuff people are trying to do around here.

Here’s a lightweight example of how you can use Google Sheets and the IMPORTXML function to grab quite a bit of data from various video pages with no API or technical skills.

Straight off, we’re going to want the URL of the video. We’ll put that in column A and we’ll use it as a variable in all our other formulas.

Getting the Paths to the Data

=IMPORTXML(A2,"(//*[contains(@class, 'watch-title')])[1]")

So how’d that come to be? A2 is just asking what URL we want to go to. The XPATH stuff gets a little more interesting. It’s going to look for any class that is named watch-title. I found out the title was in that div by right clicking on the title and choosing inspect in Chrome. The appended [1] will only give us the first item that meets those qualifications. Otherwise the title shows up twice.

The rest of the formulas are pretty much variations on that theme.

  • =IMPORTXML(A2,"//*[contains(@class, 'watch-view-count')]") – View count
  • =IMPORTXML(A2,"(//*[contains(@class, 'like-button-renderer-like-button')])[1]") – Likes count
  • =IMPORTXML(A2,"(//*[contains(@class, 'like-button-renderer-dislike-button')])[1]") – Dislikes count
  • =IMPORTXML(A2,"(//*[contains(@class, 'yt-user-info')])[1]") – User name

Throw in a video of your own if you’d like.