CSV to MYSQL to JS Filtered Table Tutorial

I’m not a programmer. If you’re a programmer, you don’t need me. This tutorial is written for people who are equally clueless. Notice, I am clueless. This is cobbled together from a series of different tutorials, libraries, etc. I may very well have done some stuff wrong, but it works.

So with that warning out of the way, why might you want to do this? Speed with large(r) amounts of data.

I needed a page that loaded a fairly large amount of information (~1700 rows) and then let people sort through the data quickly. It’s a lot like what you can do with Exhibit. I built one version with a Google Spreadsheet backend, but it was too slow. I tried pulling all that data into straight JSON, and that did make it faster . . . but still too slow.

So there went my tried and true methods. I considered pulling the data into WordPress, which is the other way I normally cheat my way out of programming1, but given the information was going to change, it was likely to be more of a pain than it was worth. For the record, there are CSV-to-post plugins for WP that work decently.

CSV to MySQL

To start I had a CSV file of courses; step one was to get those data into a database. This is pretty easy with phpMyAdmin. I had to make a database, make a user/password combo, and then make a table with the right number of columns. Those columns needed names and data types.

There are lots of tutorials that go into the details of creating a database and table using phpMyAdmin. There are even conversations about naming conventions in mysql databases. I should have read this last one, but I just used the names already created in the CSV export which came from a SQL database and figured it’d be ok. Then there’s a handy import CSV option in phpMyAdmin. Do that. You can also replace all the data each time if you screw up.

A couple of things tripped me up initially. First, I didn’t check the box to allow “NULL” entries. I needed to do that; if you don’t, the import dies whenever data are missing. Second, I left it on the default data type instead of changing almost all of my fields to “TEXT”. There are lots of fairly detailed conversations around this, but I think most people who’d get this far will be using text or will be fairly unconcerned with the nuances that matter quite a bit when you’re building for huge scale.
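As a sketch of what that ends up looking like, here’s roughly the kind of table definition phpMyAdmin builds for you behind the scenes. The table name `courses` is my assumption; the column names match the fields used later in this tutorial, and everything is TEXT and allowed to be NULL for the reasons above:

```sql
-- A minimal sketch, assuming a table named "courses".
-- TEXT columns that allow NULL keep the phpMyAdmin CSV import
-- from dying when a field is empty.
CREATE TABLE courses (
  ACADEMIC_PERIOD TEXT NULL,
  SUBJECT         TEXT NULL,
  COURSE_NUMBER   TEXT NULL,
  END_DATE        TEXT NULL
);
```

You don’t have to write this by hand — clicking through the phpMyAdmin table-creation screen does the same thing — but it’s handy to see what the checkboxes translate to.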

cc licensed ( BY SA ) flickr photo shared by Tom Woodward

PHP to MySQL

Now that your information is in its nice database house, we’ll need to prove we are not a stranger and have the keys to the house. PHP is going to be the way we do that.

W3Schools is where I started with how to do that.

I went with this version, which is bulkier but felt clearer to me.

$user_name = "THIS IS YOUR USER NAME FOR THE DB";
$password = "THIS IS YOUR PASSWORD FOR THAT USER";
$database = "THIS IS THE NAME OF YOUR DB";
$server = "127.0.0.1";

$db_handle = mysql_connect($server, $user_name, $password);
$db_found = mysql_select_db($database, $db_handle);


cc licensed ( BY SA ) flickr photo shared by Tom Woodward
I picked up a rough idea of what to do to build the HTML table over here.

jQuery and Drop Down Table Filter work together to give me on-the-fly filtering without having to do all the work. To make it work I need a table header row.


cc licensed ( BY SA ) flickr photo shared by Tom Woodward
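Concretely, that means the table the PHP spits out needs a real header row, split out into a thead with the data rows in a tbody. This is a sketch of the target markup — the `id` is just something I made up for targeting the table, and the exact initialization call for the filter plugin is whatever its own documentation says, not something I’m showing here:

```html
<!-- The filter plugin keys off the header row, so it lives in a
     thead; the PHP loop echoes the data rows into the tbody. -->
<table id="courses">
  <thead>
    <tr><th>Header 1</th><th>Header 2</th><th>Header 3</th><th>Header 4</th></tr>
  </thead>
  <tbody>
    <!-- one tr per database row goes here -->
  </tbody>
</table>
```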

Here’s how the whole thing comes together below. It works.

<?php
$user_name = "THIS IS YOUR USER NAME FOR THE DB";
$password = "THIS IS YOUR PASSWORD FOR THAT USER";
$database = "THIS IS THE NAME OF YOUR DB";
$server = "127.0.0.1";

$db_handle = mysql_connect($server, $user_name, $password);
$db_found = mysql_select_db($database, $db_handle);

if ($db_found) {

$SQL = "SELECT * FROM courses"; // swap in your own table name
$result = mysql_query($SQL);

// the filter needs a header row, so the headers go in a thead
echo "<table id='courses'>";
echo "<thead><tr><th>Header 1</th><th>Header 2</th><th>Header 3</th><th>Header 4</th></tr></thead>";
echo "<tbody>";

while ($db_field = mysql_fetch_assoc($result)) {
echo "<tr>";
echo "<td>" , $db_field['ACADEMIC_PERIOD'] , "</td>";
echo "<td>" , $db_field['SUBJECT'] , "</td>";
echo "<td>" , $db_field['COURSE_NUMBER'] , "</td>";
echo "<td>" , $db_field['END_DATE'] , "</td>";
echo "</tr>";
}

echo "</tbody></table>";

mysql_close($db_handle);

}

else {

print "Database NOT Found";
mysql_close($db_handle);

}

?>

1 #Nonprogramistan4life even if Jim keeps blogging about APIs. 🙂

Comments on this post

  1. Jim Groom said on February 2, 2014 at 7:44 am

    Tom,

You really got to talk to Martha and Tim about the work they are doing right now. There are some serious crossovers, and they have effectively programmed the syndicated hub for Domain of One’s Own that is light years beyond anything we approximated with UMW Blogs.

What’s more, they are starting to enable us to filter the broad, distributed community of UMW Domains by all kinds of views: instructor, class, student, topic, etc. These are heady days, and I can’t help but think something like this would be even more awesome statewide. As far as APIs, it’s easy to blog about them when you still don’t really know how they work 🙂

    • Tom Woodward said on February 2, 2014 at 8:07 am

      Ha. You know I just did that to get a comment from you.

      I will talk to them now that we have our spaces set up and I can actually play along. I may or may not know enough to keep up or add to the conversation. My last experience with a stupid jquery sort resulted in having to call in Alan. My one solace there is that it was a mess for him as well. I can at least create difficult problems. There’s value in that right?

My problem is that programming really does tempt me. The hard part for me is being patient enough. I want to get to the action. I need to rewatch Tai Chi Man. Although instead I’m considering taking up Systema (you ¡must! watch this: out of shape Russians destroying people).

      How is that for the most random comment response ever?

  2. Tim Owens said on February 2, 2014 at 12:52 pm

I’m not sure what you think programming is if not this? 🙂 This is awesome! Martha and I are learning right along with you. In many ways it’s been refreshing to get off of WordPress for some stuff and get back to basics, but then we’re all in on WordPress to aggregate and filter feeds, so I guess it works both ways. I like the MySQL database approach but thought it worth mentioning: if you do want the ease of Google Spreadsheets, Martha has some code using this (https://github.com/erunyon/FeedCache) that caches it locally as an XML file and refreshes occasionally. Helps if the data entering is something you want to be pretty simple (like a Google Form or just pasting into a Google Spreadsheet).

    • Tom Woodward said on February 2, 2014 at 9:33 pm

      I appreciate that. I’ll have to try that on some of the larger Exhibit stuff and see if it speeds it up.

      The complicating factor here is that the course stuff changes often and is normally tied into an onsite VCU database but I can’t hook into it from our offsite server. It looks like they’re going to expose a json feed so I’m going to have to reconsider things but I figure it ought to be pretty clean. We’ll see.
