Building a Video Sharing Site using PHP in AWS

By: Clay Loveless, Chief Architect, Mashery

This article gives a brief rundown on building a video sharing site: a quick overview of what's necessary, with sample code for several of the steps. You could build something simple for your home movies, or something as complex as YouTube. Yes, you really could build the next YouTube on top of Amazon Web Services. All you'd need to do is:

  1. Automate the conversion of uploaded videos to a web-friendly format.
  2. Coordinate a heavy workload of automated conversions.
  3. Create a directory of videos hosted on the site.
  4. Serve the video files on demand.

Use Amazon Web Services as a basis for this, and you're all set. Briefly, here's how we'll do it:

  • Accept video file uploads
  • Create a job in Amazon Simple Queue Service for conversion of the video
  • Pick the job out of SQS and convert to a web-friendly format
  • Create a record in a video database on Amazon SimpleDB
  • Serve the video from a robust location: Amazon S3

Ready?

Step 1: Accept Video File Upload

There are a lot of fancy ways to upload files these days, such as SWFUpload. We won't be using techniques like that in this article. Instead, we'll use the super-simple index.php file from the awsfiles.zip bundle mentioned previously.

On your EC2 instance, grab the tutorial files ZIP and unzip it in your home directory.

From your instance:

PROMPT> cd /var/www
PROMPT> cp -f awsfiles/index.php html/index.php
PROMPT> mkdir /var/www/html/uploads
PROMPT> chown apache:apache /var/www/html/uploads

Without digging into all the details, the commands above will create an uploads directory, place a really simple file upload form on your instance homepage, and put files uploaded through it in the uploads folder.
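The index.php from awsfiles.zip isn't reproduced in this article, but a minimal upload handler in the same spirit might look like the sketch below. The form field name ('video') and the extension whitelist are my assumptions, not taken from the bundle:

```php
<?php
// Hypothetical sketch of a minimal upload handler; the real
// index.php in awsfiles.zip may differ.

// Pure helper: accept only a few video extensions.
function is_allowed_video($filename)
{
    $ext = strtolower(pathinfo($filename, PATHINFO_EXTENSION));
    return in_array($ext, array('mov', 'mp4', 'avi'));
}

$upload_dir = '/var/www/html/uploads';

if (isset($_FILES['video'])) {
    $name = basename($_FILES['video']['name']);
    if (is_allowed_video($name)
        && move_uploaded_file($_FILES['video']['tmp_name'],
                              $upload_dir . '/' . $name)) {
        echo "Uploaded {$name}.";
    } else {
        echo "Upload rejected.";
    }
} else {
    // Bare-bones multipart form posting back to this script.
    echo '<form method="post" enctype="multipart/form-data">'
       . '<input type="file" name="video"/> '
       . '<input type="submit" value="Upload"/>'
       . '</form>';
}
```

Note the `basename()` call: never trust a client-supplied filename to be free of path components.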

To test the upload, use the sample.mov file included in the awsfiles.zip bundle if you don't have a QuickTime file of your own lying around. This sample is from Apple's XCode QuickTime examples.

Step 2: Queue Conversion Job

In order to use the Amazon SQS service, first make sure you've signed up for it with your AWS account. Then, use the following code sample to create an 'awstutorial' queue for your video site's conversion jobs.

File: sqs-makequeue.php

<?php
/**
 * Script to check for an 'awstutorial' queue, and create
 * one if necessary.
 */
require_once 'example_setup.php';

// load the service
$sqs = new Amazon_SQS_Client(
    $creds['access_key'], 
    $creds['secret_key']
);

try {
    // get the list of queues
    $newqueue = 'awstutorial';
    $req = new Amazon_SQS_Model_ListQueuesRequest();
    $req->withQueueNamePrefix($newqueue);
    
    $resp = $sqs->listQueues($req);
    $list = $resp->getListQueuesResult()
                 ->getQueueUrl();

    $exists = false;
    foreach ($list as $url) {
        if (strpos($url, $newqueue) !== false) {
            $exists = true;
            break;
        }
    }
    
    if ($exists) {
        echo "Queue $newqueue exists!\n";
    } else {
        // create it
        echo "Creating $newqueue...\n";
        $req = new Amazon_SQS_Model_CreateQueueRequest();
        $req->withQueueName($newqueue)
            ->withDefaultVisibilityTimeout(30);

        $resp = $sqs->createQueue($req);
        if ($resp->isSetCreateQueueResult()) {
            $result = $resp->getCreateQueueResult();
            if ($result->isSetQueueUrl()) {
                echo "Queue {$newqueue} created "
                     . "at "
                     . $result->getQueueUrl()
                     . "\n";
            }
        } else {
            // cope with non-exceptional abject failure
        }
    }

} catch (Amazon_SQS_Exception $e) {
    echo $e->getErrorMessage() . "\n";
    echo $e->getXML() . "\n";
    exit;
}

As you can see, use of the SQS PHP library is very similar to use of the EC2 library.

From your instance:

PROMPT> cd /var/www/awsfiles
PROMPT> php sqs-makequeue.php

Now that your queue is set up, we can submit the conversion job related to the upload of the sample QuickTime movie.

File: sqs-queue-conversion.php

<?php
/**
 * Script to shuttle an uploaded movie to the 'awstutorial'
 * queue so that another process can pick it up and convert
 * it.
 */
require_once 'example_setup.php';

// load the service
$sqs = new Amazon_SQS_Client(
    $creds['access_key'], 
    $creds['secret_key']
);

// Put the Uploaded File on S3
Killersoft_Wrapper_S3::selfRegister();

// establish a unique bucket name, based on 
// hash of access_key, secret_key, and public
// instance hostname.

$hostfile = $creds['tutorial_file_path']
    . DIRECTORY_SEPARATOR
    . 'awstutorial-hostname.txt';

if (file_exists($hostfile)) {
    $host = file_get_contents($hostfile);
} else {
    $host = file_get_contents(
        'http://169.254.169.254/latest/meta-data/public-hostname'
    );
}

$bucket = 's3://awstutorial-'
        . md5(
            $creds['access_key'] .
            $creds['secret_key'] .
            $host
        );

if (! is_dir($bucket)) {
    mkdir($bucket);
}

// put the uploaded file on S3.
// For your YouTube-killer, you'll want to make sure this 
// filename doesn't conflict with other files.

// NOTE: This won't work until it's injected into the 
// upload script from Step 1.
file_put_contents(
    "{$bucket}/{$upload_name}", 
    file_get_contents($upload_file)
);

// now that the file to be converted is in place, put 
// a message in the queue for another server
// to pick it up

// In a production script, you'd skip this step
// and store the URL to your queue in a config file
$req = new Amazon_SQS_Model_ListQueuesRequest();
$req->withQueueNamePrefix('awstutorial');
$resp = $sqs->listQueues($req);
$list = $resp->getListQueuesResult()
             ->getQueueUrl();
$queue_url = array_shift($list);

// Send the actual message
$msg = new Amazon_SQS_Model_SendMessageRequest();
$msg->withMessageBody("{$bucket}/{$upload_name}")
    ->withQueueUrl($queue_url);

$resp = $sqs->sendMessage($msg);

Easy enough; now let's inject this work into the flow of our upload script from Step 1. That's already been done in the "index2.php" file of the awsfiles.zip.

From your instance:

PROMPT> cd /var/www
PROMPT> cp -f awsfiles/index2.php html/index.php

Now a successfully uploaded file will be sent off to S3 and queued into SQS for processing.
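The code comment in sqs-queue-conversion.php warns that for a real site you'd need to keep uploaded filenames from colliding in the bucket. One hypothetical scheme (my own illustration, not from the tutorial files) prefixes each key with the date and a short hash:

```php
<?php
// Hypothetical collision-avoidance for S3 keys; the tutorial's
// index2.php just uses the raw upload name. The date-plus-hash
// prefix below is my own illustration.
function unique_s3_key($filename)
{
    $base = basename($filename);
    $hash = substr(md5($base . microtime() . mt_rand()), 0, 8);
    return date('Ymd') . '-' . $hash . '-' . $base;
}

// e.g. file_put_contents("{$bucket}/" . unique_s3_key($upload_name), ...);
echo unique_s3_key('sample.mov') . "\n";
```

Two uploads of the same file on the same day still get distinct keys, and the original name survives at the end of the key for readability.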

Try it out, and then let's peek in S3 and SQS to see if everything happened as it should have. Note: SQS used to have a method actually called "PeekMessage", but that functionality was removed in the 2008-01-01 update. We can still take a peek, though, simply by receiving messages in the 'awstutorial' queue without removing them. Since the Simple Queue Service is a "deliver at least once" system, the message will stay in the queue until it is deleted, or until it is more than four days old.

File: job-check.php

<?php
/**
 * Check to see if sqs-queue-conversion.php tasks
 * actually put a file on S3 and a message in SQS.
 */
require_once 'example_setup.php';

// Register S3 wrapper
Killersoft_Wrapper_S3::selfRegister();

// set our bucket name
$hostfile = $creds['tutorial_file_path']
    . DIRECTORY_SEPARATOR
    . 'awstutorial-hostname.txt';

if (file_exists($hostfile)) {
    $host = file_get_contents($hostfile);
} else {
    $host = file_get_contents(
        'http://169.254.169.254/latest/meta-data/public-hostname'
    );
}

$bucket = 's3://awstutorial-'
        . md5(
            $creds['access_key'] .
            $creds['secret_key'] .
            $host
        );

// change this if you want to check a different file
if (file_exists("{$bucket}/sample.mov")) {
    echo "{$bucket}/sample.mov is on S3!\n";
}

// now peek in the queue
$sqs = new Amazon_SQS_Client(
    $creds['access_key'], 
    $creds['secret_key']
);

// In a production script, you'd skip this step
// and store the URL to your queue in a config file
$req = new Amazon_SQS_Model_ListQueuesRequest();
$req->withQueueNamePrefix('awstutorial');
$resp = $sqs->listQueues($req);
$list = $resp->getListQueuesResult()
             ->getQueueUrl();
$queue_url = array_shift($list);


// UNCOMMENT IF YOU WANT TO MANUALLY INJECT A MESSAGE
// $msg = new Amazon_SQS_Model_SendMessageRequest();
// $msg->withMessageBody("{$bucket}/sample.mov")
//     ->withQueueUrl($queue_url);
// $resp = $sqs->sendMessage($msg);
// sleep(5);


// first, get the approximate number of messages
// in the queue. Why approximate? See the SQS 
// architecture documentation.
$q = new Amazon_SQS_Model_GetQueueAttributesRequest();
$q->withQueueUrl($queue_url)
  ->withAttributeName('ApproximateNumberOfMessages');
  
$resp = $sqs->getQueueAttributes($q);
$num = $resp->getGetQueueAttributesResult()
            ->getAttribute('ApproximateNumberOfMessages');

echo "Queue awstutorial has ~{$num[0]->getValue()} messages.\n";

// now take a look with a minimal VisibilityTimeout,
// so as to not interfere with other processes
// which may access the queue.
$msg = new Amazon_SQS_Model_ReceiveMessageRequest();
$msg->withQueueUrl($queue_url)
    ->withMaxNumberOfMessages(1)
    ->withVisibilityTimeout(1);

$resp = $sqs->receiveMessage($msg);
if ($resp->isSetReceiveMessageResult()) {
    $msg = $resp->getReceiveMessageResult()
                ->getMessage();

    if (isset($msg[0])) {
        echo "Found a message with a body of:\n";
        echo $msg[0]->getBody() . "\n";
    } else {
        echo "Message not fully retrieved:\n";
        var_dump($msg);
    }
} else {
    echo "Couldn't retrieve a message.\n";
}

To check on jobs in your queue, just run the script.

From your instance:

PROMPT> cd /var/www/awsfiles
PROMPT> php job-check.php

Step 3: Perform Conversion Job

Flash video (FLV) has emerged as the de facto standard for embedded online video. So, we want to convert our video from QuickTime (.mov) to Flash video (.flv) so that we can embed it in a Flash-based video player.

For our sample video, this isn't a big task. But if you start working through conversions of longer, multi-megabyte videos, you'll definitely want to divide that job up over multiple machines and automate it.

The details of the conversion process itself are outside the scope of this article -- but we'll convert a video nonetheless. Just do a web search for the tools you see used in the conversion process if you want to learn more about them. They're popular tools that have been written about a great deal.

First: we need a tool called ffmpeg, which will do the conversion. For that, we'll set up an additional yum repository to pull the necessary packages from.

From your instance:

PROMPT> curl -L -O http://dag.wieers.com/rpm/packages/rpmforge-release/rpmforge-release-0.3.6-1.el5.rf.i386.rpm
PROMPT> rpm -ivh rpmforge-release-0.3.6-1.el5.rf.i386.rpm
PROMPT> yum -y install ffmpeg

You'll see a lot of output during the course of installing ffmpeg and its dependencies. When installation is complete, you should be able to test a conversion manually. The awsfiles.zip bundle contains a sample.mov to test manual conversion.

From your instance:

PROMPT> cd /var/www/awsfiles
PROMPT> ffmpeg -i sample.mov -ar 22050 \
    -acodec mp3 -ab 32k -r 25 -s 320x240 -vcodec flv \
    -qscale 9.5 output.flv

If it works, you're almost in business! (You'll know it worked if there's a file called output.flv in the /var/www/awsfiles directory.) Now we just need to write a script to fetch the name of the file to convert from SQS, retrieve the file from S3, and run that conversion on it automatically. If the command did not work for you and you'd like to run this tutorial through every step, you may need to hit some support forums for ffmpeg usage -- the author of this article is no expert on that topic.

In a high-volume video sharing site, you'd want to run something like the following on multiple EC2 instances on a cronjob, quite possibly working on more than one job at a time. You'd also want to tune the Visibility Timeout settings to be closer to what you actually need to do an average conversion. The following script is just to illustrate the main segments of the workflow with SQS and S3.

File: job-fetch.php

<?php
/**
 * Fetch a conversion job from SQS, retrieve the source file
 * from S3, convert it with ffmpeg, and upload the result.
 */
require_once 'example_setup.php';

// Register S3 wrapper
Killersoft_Wrapper_S3::selfRegister();

// Fetch a job from SQS
$sqs = new Amazon_SQS_Client(
    $creds['access_key'], 
    $creds['secret_key']
);

// In a production script, you'd skip this step
// and store the URL to your queue in a config file
$req = new Amazon_SQS_Model_ListQueuesRequest();
$req->withQueueNamePrefix('awstutorial');
$resp = $sqs->listQueues($req);
$list = $resp->getListQueuesResult()
             ->getQueueUrl();
$queue_url = array_shift($list);

// - we know all 'awstutorial' jobs are videos to convert.
// - hide job from others while we're working on it.
$msg = new Amazon_SQS_Model_ReceiveMessageRequest();
$msg->withQueueUrl($queue_url)
    ->withMaxNumberOfMessages(1)
    ->withVisibilityTimeout(20);

$resp = $sqs->receiveMessage($msg);
if (!$resp->isSetReceiveMessageResult()) {
    echo "Couldn't retrieve a message.\n";
    exit;
}

$msg = $resp->getReceiveMessageResult()
            ->getMessage();

if (!isset($msg[0])) {
    echo "Message not fully retrieved:\n";
    var_dump($msg);
    exit;
}

// Now we've got a job. Let's check to see if the 
// file to work on is still present.
$source = $msg[0]->getBody();

if (! file_exists($source)) {
    // source is gone, so kill the job and exit
    echo "Deleting orphaned job from queue.\n";
    $del = new Amazon_SQS_Model_DeleteMessageRequest();
    $del->withReceiptHandle(
            $msg[0]->getReceiptHandle()
          )
        ->withQueueUrl($queue_url);
    $sqs->deleteMessage($del);
    exit;
}

// file is present, let's convert it!
$localin = '/tmp/'.basename($source);
file_put_contents(
    $localin,
    file_get_contents($source)
);
$localout = substr($localin, 0, -3) . 'flv';

// ffmpeg command for transformation
$cmd = '/usr/bin/ffmpeg -i '
     . escapeshellarg($localin)
     . ' -ar 22050 '
     . '-acodec mp3 -ab 32k -r 25 -s 320x240 '
     . '-vcodec flv -qscale 9.5 '
     . escapeshellarg($localout);

// run the conversion
// you'd probably want exec() or proc_open()
// in production, but we use passthru() 
// for this tutorial so you can watch!
passthru($cmd);

// check for expected output
if (!file_exists($localout)) {
    die("Conversion failed, exiting.\n");
}

echo "Conversion complete!\n"
   . "Deleting completed job from queue.\n";
   
$del = new Amazon_SQS_Model_DeleteMessageRequest();
$del->withReceiptHandle(
        $msg[0]->getReceiptHandle()
      )
    ->withQueueUrl($queue_url);
$sqs->deleteMessage($del);

// Clean up
$bucket = parse_url($source, PHP_URL_HOST);
$cbucket = 's3://' . $bucket . '/converted';
if (! is_dir($cbucket)) {
    mkdir($cbucket);
}
echo "Uploading converted file back to S3\n";
file_put_contents(
    $cbucket . '/' . basename($localout), 
    file_get_contents($localout)
);
echo "Deleting original since we no longer need it\n";
unlink($source);

From your instance:

PROMPT> cd /var/www/awsfiles
PROMPT> php job-fetch.php

This command determines the URL of your 'awstutorial' queue and attempts to receive a message from that queue. When successful, the file referenced in the message is downloaded and processed with ffmpeg. If the conversion fails, the message is left in the queue for a future attempt. If conversion succeeds, the job is deleted from the queue and the resulting file is uploaded back to S3. The original file is also deleted from S3, since this site is only concerned with hosting the converted files.
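To run this from cron on multiple instances, as suggested above, you might wrap job-fetch.php in a small worker script. This is a hedged sketch; the lock-file path, script path, and pass count are my assumptions:

```php
<?php
// Hypothetical cron-driven worker wrapper around job-fetch.php.
// A non-blocking lock keeps overlapping cron runs from stacking up.
$lock = fopen('/tmp/awstutorial-worker.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // another worker is already running
}

// Drain up to five jobs per invocation.
for ($i = 0; $i < 5; $i++) {
    passthru('php /var/www/awsfiles/job-fetch.php', $status);
    if ($status !== 0) {
        break; // script missing or failed; try again on the next run
    }
}

flock($lock, LOCK_UN);
fclose($lock);
```

A crontab entry such as `*/5 * * * * php /var/www/awsfiles/worker.php` would then keep the queue drained; the visibility timeout set in job-fetch.php hides each job from the other workers while it is being converted.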

Step 4: Create a Record in the Video Database

With a converted video, now you're ready to put that into a database somewhere so that you can dynamically update the content on your video sharing site.

Rather than building a MySQL database, managing a MySQL server and dealing with the assorted issues that pop up with scaling any RDBMS, we'll leverage Amazon's SimpleDB and leave the scalability, redundancy and performance concerns in their hands.

For a deep dive into Amazon's SimpleDB PHP library, be sure to go through the excellent SimpleDB Getting Started Guide. If you narrow the scope of the guide's examples down to PHP, you'll see that it uses the same library and a similar structure to this article.

We'll create a very simple data structure for this example, which looks like this:

[Figure: the video data structure -- each item has an id (the item name) plus userid, category, file, description, size, and tags attributes]

Before getting into the steps for setting up the data store, it's important to note: It really is this easy.

Some developers have resisted even experimenting with SimpleDB, out of disbelief at how few steps are required to get up and running with it compared to MySQL or another RDBMS. Don't worry! SimpleDB has "Simple" in the name for a reason.

To begin with, we'll create a Domain. A SimpleDB Domain is roughly the equivalent of a table in a traditional database system, in that each Domain is intended to house items of similar data. Keep in mind that part of "Simple" is that you cannot perform joins between Domains.

File: sdb-create-domain.php

<?php
/**
 * Script to create the data domain for this tutorial
 */
require_once 'example_setup.php';

// load the service
$sdb = new Amazon_SimpleDB_Client(
    $creds['access_key'], 
    $creds['secret_key']
);

// create the domain
$req = new Amazon_SimpleDB_Model_CreateDomainRequest();
$req->setDomainName('awstutorial');

$result = $sdb->createDomain($req);

if ($result->isSetResponseMetadata()) {
    $meta = $result->getResponseMetadata();
    if ($meta->isSetRequestId()) {
        echo 'RequestId: ' . $meta->getRequestId() . "\n";
    }
    if ($meta->isSetBoxUsage()) {
        echo 'BoxUsage: ' . $meta->getBoxUsage() . "\n";
    }
}

From your instance:

PROMPT> cd /var/www/awsfiles
PROMPT> php sdb-create-domain.php

As you can see, each operation on the SimpleDB service has a BoxUsage value available in the response metadata. Since billing is based in large part on BoxUsage, keep an eye on these values while building your SimpleDB applications.
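Because BoxUsage accumulates across calls, it can be handy to total it up while developing. A minimal sketch (the class name is mine; BoxUsage is reported as a decimal string of machine-hours, as shown in the sample value below):

```php
<?php
// Hypothetical accumulator for BoxUsage values, so you can log
// roughly how much machine time a batch of SimpleDB calls consumed.
class BoxUsageMeter
{
    private $total = 0.0;

    public function add($boxUsage)
    {
        // BoxUsage arrives as a string, e.g. "0.0000219907"
        $this->total += (float) $boxUsage;
    }

    public function totalHours()
    {
        return $this->total;
    }
}

// e.g. after each call: $meter->add($meta->getBoxUsage());
$meter = new BoxUsageMeter();
$meter->add('0.0000219907');
$meter->add('0.0000219907');
printf("~%.10f box-hours used\n", $meter->totalHours());
```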

Now that you've got a Domain created, let's make sure that it's really there. To do that, issue a List Domains request.

File: sdb-list-domains.php

<?php
/**
 * Script to check that awstutorial exists
 */
require_once 'example_setup.php';

// load the service
$sdb = new Amazon_SimpleDB_Client(
    $creds['access_key'], 
    $creds['secret_key']
);

// make sure the domain is active
$req = new Amazon_SimpleDB_Model_ListDomainsRequest();

$result = $sdb->listDomains($req);

if ($result->isSetListDomainsResult()) {
    $list = $result->getListDomainsResult()
                   ->getDomainName();
                   
    foreach ($list as $domain) {
        echo "DomainName {$domain}\n";
    }
}

From your instance:

PROMPT> cd /var/www/awsfiles
PROMPT> php sdb-list-domains.php

With a Domain created, we can immediately begin inserting data. That's right, no need to define a schema -- just start inserting items with the attributes that are relevant to them. Attributes are a lot like columns in a spreadsheet.

File: sdb-create-records.php

<?php
/**
 * Insert a test record into the awstutorial domain.
 */
require_once 'example_setup.php';

// load the service
$sdb = new Amazon_SimpleDB_Client(
    $creds['access_key'], 
    $creds['secret_key']
);

// Let's call this item1 -- acts as the "id" column
// in our table graphic.
$req = new Amazon_SimpleDB_Model_PutAttributesRequest();
$req->withDomainName('awstutorial')
    ->withItemName('item1');

// create a sample record for the table structure we're 
// working with
$att = new Amazon_SimpleDB_Model_ReplaceableAttribute();
$att->withName('userid')
    ->withValue('clay');
$req->withAttribute($att);

$att = new Amazon_SimpleDB_Model_ReplaceableAttribute();
$att->withName('category')
    ->withValue('tutorials');
$req->withAttribute($att);

$att = new Amazon_SimpleDB_Model_ReplaceableAttribute();
$att->withName('file')
    ->withValue('sample.mov');
$req->withAttribute($att);

$att = new Amazon_SimpleDB_Model_ReplaceableAttribute();
$att->withName('description')
    ->withValue('Just a sample video. Watch!');
$req->withAttribute($att);

$att = new Amazon_SimpleDB_Model_ReplaceableAttribute();
$att->withName('size')
    ->withValue('71k');
$req->withAttribute($att);

$att = new Amazon_SimpleDB_Model_ReplaceableAttribute();
$att->withName('tags')
    ->withValue('apple');
$req->withAttribute($att);


$result = $sdb->putAttributes($req);

if ($result->isSetResponseMetadata()) {
    $meta = $result->getResponseMetadata();
    if ($meta->isSetRequestId()) {
        echo 'RequestId: ' . $meta->getRequestId() . "\n";
    }
    if ($meta->isSetBoxUsage()) {
        echo 'BoxUsage: ' . $meta->getBoxUsage() . "\n";
    }
}

From your instance:

PROMPT> cd /var/www/awsfiles
PROMPT> php sdb-create-records.php

Just like that, you've got a resilient database containing one record. To see it, you may use the following script:

File: sdb-list-items.php

<?php
/**
 * List test records in the awstutorial domain.
 */
require_once 'example_setup.php';

// load the service
$sdb = new Amazon_SimpleDB_Client(
    $creds['access_key'], 
    $creds['secret_key']
);

/**
 * 
 * A query just returns the name of the items in a domain,
 * which means you need to send another query for
 * attributes relating to each item.
 * 
 */
$req = new Amazon_SimpleDB_Model_QueryRequest();
$req->setDomainName('awstutorial');

$response = $sdb->query($req);

if ($response->isSetQueryResult()) {
    
    echo "Amazon_SimpleDB_Model_QueryRequest result\n";
    echo "-----------------------------------------\n";
    
    $result = $response->getQueryResult();
    $list   = $result->getItemName();
    
    foreach ($list as $item_name) {
        echo "ItemName: {$item_name}\n\n";
    }
}

/**
 * 
 * Query with Attributes returns item names and their attributes
 * in a single query.
 * 
 */
$req = new Amazon_SimpleDB_Model_QueryWithAttributesRequest();
$req->setDomainName('awstutorial');

$response = $sdb->queryWithAttributes($req);

if ($response->isSetQueryWithAttributesResult()) {

    echo "Amazon_SimpleDB_Model_QueryWithAttributesRequest result\n";
    echo "-------------------------------------------------------\n";

    $result = $response->getQueryWithAttributesResult();
    $list   = $result->getItem();
    
    foreach ($list as $item) {
        echo "Item Name: " . $item->getName() . "\n";
        
        $attlist = $item->getAttribute();
        foreach ($attlist as $attribute) {
            echo $attribute->getName() . ": ";
            echo $attribute->getValue() . "\n";
        }
    }
}

From your instance:

PROMPT> cd /var/www/awsfiles
PROMPT> php sdb-list-items.php

At the time of this writing, SimpleDB is in beta, which means there are a few limitations. You may create up to 100 Domains, and each Domain is limited to 10GB. Additional limitations are applied to each call to the SimpleDB service, but they are permissive enough to allow a very wide variety of applications to be built on top of SimpleDB.

And, since December 1, 2008, a free tier has been added to the SimpleDB service that allows light-to-moderate usage of SimpleDB at no charge. Please refer to the SimpleDB Developer Guide for details on per-call limitations.

Step 5: Put It All Together

If you've made it this far, congratulations! We've covered how to take an uploaded file, load it into S3, queue up a conversion job with SQS, and run that conversion on EC2. Tie that together with a SimpleDB database of files, and all that's left is selecting data from SimpleDB to display on your website.

SimpleDB makes that easy with its Select method. For example, here's how to pull up all the records with the tag "apple" from the Domain we created in the previous section.

File: sdb-sqlselect.php

<?php
/**
 * Select specific records in the awstutorial domain using
 * the SQL SELECT style interface.
 */
require_once 'example_setup.php';

// load the service
$sdb = new Amazon_SimpleDB_Client(
    $creds['access_key'], 
    $creds['secret_key']
);

// Craft a SQL-like SELECT expression
$q = "SELECT * FROM awstutorial WHERE tags LIKE 'app%'";


$req = new Amazon_SimpleDB_Model_SelectRequest();
$req->setSelectExpression($q);

$response = $sdb->select($req);

if ($response->isSetSelectResult()) {

    $result = $response->getSelectResult();
    $list   = $result->getItem();
    
    foreach ($list as $item) {
        echo "Item Name: " . $item->getName() . "\n";
        
        $attlist = $item->getAttribute();
        foreach ($attlist as $attribute) {
            echo $attribute->getName() . ": ";
            echo $attribute->getValue() . "\n";
        }
    }
}

From your instance:

PROMPT> cd /var/www/awsfiles
PROMPT> php sdb-sqlselect.php

Be sure to dig into the details of the SimpleDB query syntax in the SimpleDB Developer Guide. You'll find that it's a bit different from standard SQL, but it's expressive enough that you can easily assemble a community-driven video sharing site that allows for selecting videos by tags, users, file types, etc.

Extra Credit: Tie Step 3 and Step 4 Together

By now, you've got visions of video-sharing domination dancing through your head (as well you should!). To make this all smooth, go back to Step 3 and alter the 'job-fetch.php' script: combine it with Step 4's 'sdb-create-records.php' script to create a record for each converted movie, with the 'file' value set to the S3 location of the converted .flv.
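That glue might look something like the sketch below, using the same library calls as sdb-create-records.php. The item-naming scheme and the attribute set are my assumptions; `$sdb`, `$cbucket`, and `$localout` are assumed to exist as they do in the earlier scripts:

```php
<?php
// Hypothetical glue for the extra-credit step: after job-fetch.php
// uploads the converted .flv, record it in SimpleDB.

// Pure helper: derive a stable item name from the S3 location
// (an assumed scheme, not from the original tutorial).
function converted_item_name($flv_location)
{
    return 'item-' . md5($flv_location);
}

// Call this from job-fetch.php after the upload succeeds, passing
// an already-constructed Amazon_SimpleDB_Client.
function record_converted_video($sdb, $flv_location, $userid)
{
    $req = new Amazon_SimpleDB_Model_PutAttributesRequest();
    $req->withDomainName('awstutorial')
        ->withItemName(converted_item_name($flv_location));

    $values = array(
        'userid'    => $userid,
        'file'      => $flv_location,
        'converted' => date('c'),
    );
    foreach ($values as $name => $value) {
        $att = new Amazon_SimpleDB_Model_ReplaceableAttribute();
        $att->withName($name)->withValue($value);
        $req->withAttribute($att);
    }

    return $sdb->putAttributes($req);
}
```

In job-fetch.php you would call it right after the "Uploading converted file back to S3" step, e.g. `record_converted_video($sdb, $cbucket . '/' . basename($localout), 'clay');`.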

Combine that with the SELECT query from Step 5 and a handy .flv player like the JW FLV Media Player, and you're almost ready to launch!

Don't Forget to Cache!

One thing to keep in mind when dealing with database-driven websites, even SimpleDB-driven ones: it's always a good idea to make good use of caching. SimpleDB is fast, but each query is still a web-service call. You'll get the best performance by using a local cache on your webserver to minimize the number of trips back to the database.

A well thought-out caching plan is essential for getting the best performance out of a dynamic website. Plus, with SimpleDB, a carefully considered approach to caching will also save you money!

The details of how to cache, when to cache, and what tools to use for caching are outside the scope of this article. Be sure to check your favorite framework for caching classes, as well as PEAR's Cache_Lite and the PHP Memcache extension.
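As a minimal illustration of the idea (not a substitute for Cache_Lite or Memcache), here's a file-based cache sketch; the function names and paths are mine:

```php
<?php
// Hypothetical file-based cache: store a serialized SimpleDB result
// set for a few minutes so repeated page views skip the service call.
function cache_get($key, $ttl, $dir = '/tmp')
{
    $path = $dir . '/sdbcache-' . md5($key);
    if (file_exists($path) && (time() - filemtime($path)) < $ttl) {
        return unserialize(file_get_contents($path));
    }
    return false; // missing or expired
}

function cache_set($key, $value, $dir = '/tmp')
{
    file_put_contents($dir . '/sdbcache-' . md5($key), serialize($value));
}

// Usage: check the cache before hitting SimpleDB.
$key = 'select-tags-apple';
$videos = cache_get($key, 300); // five-minute TTL
if ($videos === false) {
    // ... run the Select from sdb-sqlselect.php here ...
    $videos = array(); // result of the Select call
    cache_set($key, $videos);
}
```

Five minutes of staleness is usually invisible on a listing page, and every cache hit is one less billable SimpleDB call.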

Conclusion

It's difficult to imagine building web applications "the old way" once you've gotten the hang of the various Amazon Web Services offerings. New and exciting services are continually added to the AWS arsenal, so be sure to get up to speed on those discussed here so that you can be ready to integrate the next set of tools from the Amazon labs.

The next generation of YouTube-like services will most likely be built on the basic services outlined here. Master them, and you'll be ready for whatever heavy lifting you find you need.

