Facebook API v2.0 Ubuntu PHP 5.4 #fail

So I finally caught on to the fact that Facebook released version 2.0 of their API at the end of April (to coincide with the F8 developer conference) and will discontinue pre-v2.0 API calls in April 2015.

I thought I had better take a look as I support a number of PHP apps that run on Facebook or integrate with Facebook for login/authentication etc.

v2.0 of the Facebook API has been completely rewritten from the fairly basic previous API code that I had been using; there are a lot of changes, including new login features and permissions.

‘This is great’, I thought…

Facebook PHP API v2.0 requires PHP v5.4

However, there is a problem that will cause a whole lot of people a whole lot of trouble: the new PHP SDK for v2.0 of the API requires PHP v5.4.

if (version_compare(PHP_VERSION, '5.4.0', '<')) {
 throw new Exception('The Facebook SDK v4 requires PHP version 5.4 or higher.');
}
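
Out of interest, here is roughly what a Graph API call looks like with the new SDK – a minimal sketch, assuming the v4 autoloader is in place; APP_ID, APP_SECRET and ACCESS_TOKEN are placeholders:

<?php
// Minimal Graph API call with the v4 PHP SDK (requires PHP 5.4+).
require_once __DIR__ . '/facebook-php-sdk-v4/autoload.php';

use Facebook\FacebookSession;
use Facebook\FacebookRequest;

FacebookSession::setDefaultApplication('APP_ID', 'APP_SECRET');

$session  = new FacebookSession('ACCESS_TOKEN');
$response = (new FacebookRequest($session, 'GET', '/me'))->execute();

echo $response->getGraphObject()->getProperty('name');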

My main production server runs the latest (as of the time of writing) LTS of Ubuntu, v12.04. It's up to date, but a quick php -v from the command line reveals:

PHP 5.3.10-1ubuntu3.13

The 14.04 LTS upgrade is not offered via apt until the first point release, which I think is due any day now. 14.04 LTS includes PHP 5.5, but I'm not really planning on upgrading my live server straight away; 12.04 LTS is supported up to 2018 – that's what Long Term Support means.

I currently work with live production Ubuntu servers ranging from v8 to 12 that I would not even dream of trying to upgrade.

Bearing in mind that a whole lot of people won't even be running an Ubuntu version above 12 on their production servers, getting to PHP 5.4 is going to present them with a considerable upgrade challenge.

You are going to have to be pretty brave to manually upgrade PHP, and if you do it without proper testing then you are also going to have to be particularly stupid.

Upgrading PHP without testing is going to break applications that are running deprecated PHP code (older versions of Magento spring to mind), not to mention the problem for people on virtual or shared hosting who have absolutely no control over the version of PHP running on their server.

Facebook #fail

So it looks like it’s a real #fail from Facebook to implement a PHP SDK totally dependent on a PHP version that a lot of people are not running and possibly cannot run.

What does this mean for your existing Facebook apps?

  • For apps that existed before April 30th 2014, making an API call without specifying a version number (‘unversioned’) is equivalent to making a call to v1.0 of the API.
  • For apps created on or after April 30th 2014, making an API call without specifying a version number is equivalent to making a call to v2.0 of the API.
  • Apps that were inactive or have a creation date on or after April 30th, 2014 will not be able to make calls to v1.0 of the API. They must use v2.0.

This means your existing pre-April 30th 2014 apps will still be allowed to make pre-v2.0 API calls until April 30th 2015; after that they will stop working.
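
Graph API versioning is path-based, so if you are stuck on PHP 5.3 for now you should (I believe – untested) be able to keep using the old v3.x PHP SDK and pin the version explicitly in the call path:

<?php
// Old v3.x PHP SDK - runs fine on PHP 5.3.
require_once 'facebook-php-sdk/src/facebook.php';

$facebook = new Facebook(array(
    'appId'  => 'APP_ID',
    'secret' => 'APP_SECRET',
));

// Unversioned call - resolved per the rules above (v1.0 or v2.0 by app age).
$me = $facebook->api('/me');

// Explicitly versioned call - pins the request to v1.0 (dies April 30th 2015).
$me = $facebook->api('/v1.0/me');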

  • If you want to create a new Facebook App you have to use the v2.0 API.
  • Inactive apps will no longer work. (Define ‘inactive’ please, Facebook.)
  • If you want to create a new Facebook App you need to upgrade to PHP 5.4 or, for Ubuntu admins, be running the latest LTS (14.04).
  • If you have code on Envato using the old API it will be removed in August 2014 – goodbye to my Magento Facebook storefront!

Man this sucks

I think this really sucks, especially if you are authenticating users via Facebook login: if you don't do anything, your users will not be able to log in to your website after April 30th 2015. I am kinda glad I didn't implement that Facebook login feature on our eCommerce site now…

If your eCommerce site is running on shared hosting, or on software that does not support PHP 5.4 (e.g. Magento 1.3.x), then you are pretty well stuffed, I think.

The new API has a learning curve for devs too, and I am already coming to this pretty late, so I guess I had better look at what options I have and which apps are going to stop working next year if I do nothing, or am unable to get my code and server to PHP 5.4 by then.

Having said all this, I would bet my bottom dollar that Facebook changes these deadlines and extends the v1.0 API support beyond 2015 – watch this space…

Because I am now depressed I am going to include a funny picture to cheer me up.

[Image: silly animals doing funny things]
How to move the Dropbox cache folder


Dropbox is extremely useful; it can be used in many different ways to allow distributed sharing of files between various systems: Windows, Mac, Unix, iOS devices. The list of ways to use Dropbox is endless.

I use one Dropbox account to collect SQL backups from various systems on a Windows 2003 server. However, I noticed today that the disk Dropbox was installed on had run out of space, and the problem was the Dropbox cache folder.

Dropbox Cache Folder too large?

The Dropbox cache folder is a hidden folder within the Dropbox folder which caches older versions of Dropbox files. The cache contents are cleaned out every few days by Dropbox, but the cache can still grow very large, very quickly – my cache folder was 20GB, which on a relatively old 250GB disk was a large percentage of the free disk space.

Whilst this may not be a problem on some desktop PCs or laptops, if you have Dropbox installed on a server with limited disk capacity you might find yourself running low on disk space. Even worse, if you installed Dropbox on a relatively small system partition, your system will grind to a halt.

You can manually delete the cache folders, but you will see that they very quickly reappear and start to grow in size.

Move Dropbox cache folder to an external drive

The solution is to move the Dropbox cache folder (NOT the data folder) to an external drive.

Unix users will be familiar with symbolic links, which allow you to link a virtual file or folder to a target file or folder. This is possible on newer versions of Windows with the Mklink command, and on older versions of Windows Server with the Sysinternals Junction utility.

To move my Dropbox cache folder on Windows 2003 to an external USB drive, I exited Dropbox, deleted the existing .dropbox.cache folder (the junction cannot be created if the folder already exists), and recreated it as a junction with the following command:

junction "d:\DATA\Dropbox\.dropbox.cache" "g:\EXTERNAL\dropbox-cache"

With Mklink on Windows 7/8 the equivalent command (a directory junction) would be:

mklink /j "d:\DATA\Dropbox\.dropbox.cache" "g:\EXTERNAL\dropbox-cache"

The external drive is large so the Dropbox cache folder can grow there without any problems.

Upgrade Magento 1.8 to 1.9

Magento Upgrade

Magento CE 1.9 has been released. It appears to be a relatively small update introducing a new responsive GUI. See the release notes here.

Here are my notes after upgrading my dev store from 1.8.x to 1.9 using the command line.

Backup existing install folder and database

1. Magento source
tar -cvvf /home/backup/magento.tar /home/www/magentoinstallfolder/

2. Magento DB
/usr/bin/mysqldump -h localhost -u USER -pPASS magento | gzip > ~/backup/folder/db-magento.sql.gz

1. Remove cached files and temp files

rm -rf downloader/pearlib/cache/*
rm -rf downloader/pearlib/download/*
rm -rf downloader/var/cache/*
rm -rf downloader/var/report/*
rm -rf downloader/var/tmp/*
rm -rf media/catalog/product/cache/*
rm -rf media/tmp/*
rm -rf var/cache/*
rm -rf var/session/*

2. Restart memcached (if being used)

3. Sync mage

./mage mage-setup

./mage sync

4. Run upgrade

./mage upgrade-all

5. Reset ownership and permissions on all files and folders.

Check Magento is operational, login and refresh caches.

If all is well, take a well-deserved break and maybe eat a cream cake. If all is not well, restore from backup…

Google Universal Analytics Update for Magento v1.3.x (Old Versions)

Google Universal Analytics went out of BETA recently (May 2014) and is now available for all users. Universal Analytics has “more features, better insights”.

All versions of Magento include the client-side analytics code, which can be enabled in Magento Admin. When enabled, Magento creates the client-side HTML and JavaScript and appends it to the footer section of each page.

On the checkout completion page additional code is created to track sales.

Google Universal Analytics in Magento

Google Universal Analytics uses new client-side JavaScript code, which means Magento does not yet support it by default. However, it is relatively easy (if you are using a newer version of Magento) to override the base/default Magento code that creates the client-side analytics HTML and JavaScript.

If you simply turn off the built-in Magento analytics feature and add the new Google Universal Analytics code to your page templates (as Google suggests you do), then you will lose all eCommerce tracking.

You can update your Magento core code to support Google Universal Analytics using the information below.

New Versions > 1.4.x

If you are using a newer version of Magento i.e. > 1.4.1.x you can update the core analytics functionality following the guidelines here.

Old Versions < 1.4.x

If you are using an older version of Magento (before 1.4.1), i.e. Magento 1.3.x, I have adapted the Magento v1.3.x Google Analytics core module code to support Google Universal Analytics. The code can be installed as a local module and will override the core module.
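
For background, Magento looks in app/code/local before app/code/core when loading classes, so a local copy of the core block class shadows the original. Very roughly, the override looks like this (an illustrative sketch only – the helper method is hypothetical; the module linked below contains the real implementation):

<?php
// app/code/local/Mage/GoogleAnalytics/Block/Ga.php
// Loaded instead of the core class because 'local' comes first on
// Magento's include path.
class Mage_GoogleAnalytics_Block_Ga extends Mage_Core_Block_Template
{
    protected function _toHtml()
    {
        if (!Mage::getStoreConfigFlag('google/analytics/active')) {
            return '';
        }
        $accountId = Mage::getStoreConfig('google/analytics/account');
        // Emit the new analytics.js ga() snippet, plus the
        // ga('ecommerce:...') calls on the checkout success page,
        // instead of the legacy _gaq/urchin code.
        return $this->_getUniversalAnalyticsHtml($accountId); // hypothetical helper
    }
}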

Download the code from my GitHub

To install the module copy the files to your Magento app folder, refresh your Magento configuration and configure the settings under Configuration -> Google API -> Google Analytics.

To view the analytics code look at the source code of any page, scroll to the bottom and look for the code between the Google Universal Analytics Code comment tags. For order tracking information, check the code on the Checkout/Success page.

Use GEOIP2 for PHP Geo IP data in WordPress, Magento, HTML

The above iframe makes an AJAX request to a PHP class querying GeoLite2 data created by MaxMind, available from http://www.maxmind.com.
This example displays the location of a single visitor, i.e. YOU. Click here to see the logging data for this WordPress blog, which demonstrates a PHP memcache data logging solution with geo IP visitor data displayed dynamically on a Google map.
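
If you want to do a similar server-side lookup yourself, the MaxMind GeoIP2 PHP API makes it straightforward. A minimal sketch, assuming the geoip2/geoip2 composer package is installed and you have downloaded the free GeoLite2-City.mmdb database (the file path is an example):

<?php
require_once 'vendor/autoload.php';

use GeoIp2\Database\Reader;

// Open the downloaded GeoLite2 database (example location).
$reader = new Reader('/usr/local/share/GeoIP/GeoLite2-City.mmdb');

// Look up the visiting IP address.
$record = $reader->city($_SERVER['REMOTE_ADDR']);

echo $record->country->name . ', ' . $record->city->name;
echo ' (' . $record->location->latitude . ', ' . $record->location->longitude . ')';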

Magento SEPA Migration Check List

The Single Euro Payments Area (SEPA) is a payment-integration initiative of the European Union for simplification of bank transfers denominated in euro.

If you make or accept payments via direct bank transfers then you will know all about SEPA. As a customer online payment method, direct bank transfers are popular in some countries (e.g. Germany) and not so popular in others (e.g. the UK).

The deadline for SEPA payments was 1st February 2014 but (as of the time of writing) has been extended 6 months to 1st August 2014.

SEPA standardises bank transfers in Europe, replacing bank specific account and sort code numbers with an IBAN (International Bank Account Number) and BIC (Business Identifier Code).

For most businesses the migration to SEPA involves communicating the new changes to customers and migrating bank account numbers and sort codes to IBAN and BIC.

For Magento stores if you have a payment method that allows customers to pay via a direct bank transfer then you need to ensure that you can capture IBAN and BIC numbers at checkout. Until 1st August 2014 you can decide to give the customer the option of entering either bank account number and sort code or IBAN and BIC. From August 1st you *should* only be accepting IBAN and BIC.

Here is what you need to do to get your Magento store ready for SEPA payments:

  • Make sure you are capturing bank account information in the SEPA format – IBAN and BIC at checkout.
  • Check that your validation rules work for the IBAN and BIC formats.
  • Make a note to remove input fields and validation rules for old account formats, bank account number and sort code.
  • Upgrade your existing debit payment methods to make sure they support SEPA.
  • Implement a validation system for IBAN and BIC.

This debit module is perfect for German Magento shops implementing Direct Debit (Lastschrift) payment methods and supports SEPA: https://github.com/therouv/Magento-DebitPayment

To validate IBAN account numbers with PHP take a look at https://code.google.com/p/php-iban/
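
For a quick sanity check at checkout, the IBAN checksum itself is also easy to verify. Here is a minimal sketch of the standard ISO 13616 mod-97 test (a library like php-iban above is the better choice for production):

<?php
function isValidIban($iban)
{
    // Strip spaces and normalise case.
    $iban = strtoupper(preg_replace('/\s+/', '', $iban));

    // 2-letter country code, 2 check digits, up to 30 alphanumerics.
    if (!preg_match('/^[A-Z]{2}[0-9]{2}[A-Z0-9]{1,30}$/', $iban)) {
        return false;
    }

    // Move the country code and check digits to the end, then
    // replace each letter with two digits (A=10 ... Z=35).
    $rearranged = substr($iban, 4) . substr($iban, 0, 4);
    $numeric = '';
    foreach (str_split($rearranged) as $char) {
        $numeric .= ctype_digit($char) ? $char : (string)(ord($char) - 55);
    }

    // Piecewise mod 97 to avoid integer overflow on long numbers.
    $remainder = 0;
    foreach (str_split($numeric, 7) as $chunk) {
        $remainder = (int)($remainder . $chunk) % 97;
    }
    return $remainder === 1;
}

var_dump(isValidIban('DE89 3704 0044 0532 0130 00')); // bool(true)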

Javascript only EU Law Cookie Notification Message for WordPress and Magento

You know, sometimes you just want a quick solution to a problem, and wherever you look there are just complex, overpriced scripts that make your simple problem sound like something that needs a lot of code and money to get working.

I needed to display a cookie notification message on a WordPress and a Magento site; to avoid compatibility problems I wanted to avoid using jQuery and just do it as simply as possible in plain JavaScript.

After looking around at a lot of $10 jQuery solutions, I found a free JavaScript-only solution, modified it for Magento and WordPress, and implemented it.

Download the code from Github here

Whatever your opinion of the EU Cookie Law, which came into effect in 2012, sooner or later if you are in Europe you may need to show that you are doing something about it. Err, actually it's 2014, so I guess we are already a teeny weeny bit late.

From what I understand the law may differ in various countries (certainly outside Europe), so depending on where your website is, and who visits your site the law may or may not apply to you.

This script will let you quickly implement a cookie notification banner with a link to your privacy policy. The notification itself uses cookies and will not be shown again once closed.

You can implement this solution on any website by copying the install files to your web server and adding the following code to the footer of your site.

<script type="text/javascript">
var cookieNotificationURL='/cookies/';
var cookieNotificationLifetime=180;
var cookieNotificationMessage='This website uses cookies. For more information please ';
var cookieNotificationClickMessage='click here';
var cookieNotificationMessageFade=true;
var cookieNotificationPath='/path-to-install-folder/';
</script>

<script type="text/javascript" src="/js/cookienotification/cookie.notification.min.js"></script>

This sets up some variables used by the script: the URL of the privacy info page, the lifetime (in days) of the cookie used by the script, the messages displayed, the option to fade the message automatically, and the path to the install folder – used to display the close image.

For WordPress, simply paste this into your theme's footer.php file.
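
If you would rather not edit the theme file directly, you can hook the same snippet in from your theme's functions.php instead – a sketch using the standard wp_footer action (the function name is arbitrary):

// In your theme's functions.php
function my_cookie_notification_script() {
    ?>
    <script type="text/javascript">
    var cookieNotificationURL='/cookies/';
    var cookieNotificationLifetime=180;
    var cookieNotificationMessage='This website uses cookies. For more information please ';
    var cookieNotificationClickMessage='click here';
    var cookieNotificationMessageFade=true;
    var cookieNotificationPath='/path-to-install-folder/';
    </script>
    <script type="text/javascript" src="/js/cookienotification/cookie.notification.min.js"></script>
    <?php
}
add_action('wp_footer', 'my_cookie_notification_script');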

For Magento, paste this into the footer.phtml template file of your Magento theme. You can translate the messages and, perhaps a good idea, choose to display the message only on non-SSL pages – we don't want this JavaScript to somehow screw up a sale!

<?php if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] == 'off'): ?>

<!-- Cookie Notification -->
<script type="text/javascript">
var cookieNotificationURL='/datenschutzerklaerung/';
var cookieNotificationLifetime=180;

<?php if (Mage::app()->getStore()->getId()==2): ?>
var cookieNotificationMessage='This website uses cookies. For more information please ';
var cookieNotificationClickMessage='click here';
<?php else: ?>
var cookieNotificationMessage='Diese Website verwendet Cookies. ';
var cookieNotificationClickMessage='Klicken Sie hier für weitere Informationen.';
<?php endif ?>
</script>

<script type="text/javascript" src="/js/cookienotification/cookie.notification.js"></script>
<?php endif ?>

Here I am manually checking the store id and translating the text to German. Remember to refresh your Magento cache after editing and saving the footer template file.

You can see the code working on this WordPress blog below, and also with Magento on my Magento development site.

Tested with Magento 1.8.x, WordPress 3.blah, Chrome, Firefox, IE11. Detects mobile devices and modifies banner size accordingly.

IOS 7.x Jailbreak SSH Access / SSH Tunnel

There are 3 reasons why I always “Jailbreak” my iPad.

  1. The Apple iPad is a (bloody expensive) computer. When I buy a (bloody expensive) computer I expect to have 100% usability from it. The restrictions imposed by Apple on their i device operating system (IOS) inhibit its practical usability considerably, and I really don't like that.
  2. When I am travelling I use the excellent MyWi app to turn my iPad into a wireless hotspot.
  3. For privacy and security I like to be able to tunnel my iPad traffic through a secure SSH connection to my server.

“Jailbreaking” exploits vulnerabilities in IOS to achieve root access to the operating system thus “liberating” IOS and allowing ad hoc software to be installed and executed. If you come from a Unix background this is great, because your (bloody expensive) computer now becomes really useful in accessing other Unix based systems and doing geeky Unix type stuff.

I have been Jailbreaking my iPad since IOS v.5 – I find little point in Jailbreaking my iPhone as the screen size limits use.

The Jailbreak process itself is always seamless:

  • Backup with iTunes
  • Install update from iTunes (not via OTA update)
  • Restore from iTunes
  • Jailbreak
  • Restore Cydia software and purchases, i.e. MyWi.

The IOS 7.x Jailbreak was released recently and I finally decided to upgrade my iPad from IOS 6.x to IOS 7.x and apply the Jailbreak. As usual this worked pretty well, but some changes in IOS 7 caused problems:

  • IOS 7 stops applications from connecting to localhost SSH on port 22
  • IOS 7 multitasking affects SSH background connections

IOS 7 Stops Applications from Connecting to localhost SSH on Port 22

This was a devious software change by Apple. Normally, after Jailbreaking, the first task is to install OpenSSH via Cydia – this gives you normal SSH terminal access to the device – and then to use a terminal application such as iSSH to log in to the device (localhost) as root.

With IOS 7 Apple have hardcoded a restriction into the operating system that stops (App Store) Apps from making an SSH connection to localhost on the default SSH port 22. When you try and connect with iSSH you will get a connection cancelled error. You can still SSH from an external device, but not locally.

The workaround for this is to change the listening TCP/UDP ports used by the SSH daemon to something other than 22.

To do this you need to edit a couple of system files. An easy way to edit the files is with the Cydia app iFile.

Take a look at /etc/services – this file defines network services, including SSH. Find the entries for SSH:

ssh    22/udp    # SSH Remote Login Protocol
ssh    22/tcp      # SSH Remote Login Protocol

and duplicate them, creating a new service called ssh2:

ssh2    52222/udp    # SSH Server
ssh2    52222/tcp    # SSH Server

Save the file.

Here I am using 52222 for the UDP/TCP ports; you can use other port numbers, but steer clear of the well-known ports 0–1023 (dynamic/private ports 49152 to 65535 are preferable).

Now edit /Library/LaunchDaemons/com.openssh.sshd.plist  and change the SockServiceName string to ssh2.

<key>SockServiceName</key>
<string>ssh2</string>

Save the file and reboot.

We are basically leaving the standard ssh service definition (port 22) in place, but telling launchd to accept SSH connections for sshd on the new ssh2 port instead.

You can now connect using SSH on the port you specified, e.g.

ssh -p 52222 root@my.ipad.address

Remember to change the root and mobile default passwords of your i device when you login.

IOS 7 Multitasking Affects SSH Background Connections

So now that I have root access to my IOS 7.0 device, I can run SSH to create a secure tunnel to my Ubuntu server:

ssh -N -g -D XXXX user@myserver.com

This creates a SOCKS proxy tunnel on port XXXX over SSH to my server; the i device can be configured to send all traffic via this proxy with a proxy auto config (PAC) file.

On IOS 7.0 this worked as expected, hurrah! I ran the ssh tunnel, changed my WiFi proxy settings to auto using my PAC file URL, switched to Chrome, checked my IP address to confirm I was proxying via the SOCKS tunnel, and was happy that my iPad data was going through the “secure” tunnel – until a few minutes later when it stopped working, doh!

The tunnel stops working because, shortly after switching apps, the SSH session on the iPad is terminated, taking the SSH tunnel with it. This is because Apple has changed the way app multitasking works in IOS 7.x.

When you switch apps in IOS 7, some apps continue to run for a short while and are then set to a suspended state to reduce system resource use. They will “instantly” launch when you return to them. Of course, when an app like iSSH switches to the background and is suspended by the operating system, any active SSH connections will quickly time out and terminate. This means our session running the SSH tunnel will be terminated, closing the tunnel.

Some apps are allowed to update in the background, and this is controlled via the background refresh options in Settings, but as of the time of writing iSSH (and Prompt) do not appear in this list. (In IOS 6, apps were allowed to run for 10 minutes in the background, and iSSH used to prompt you to return to the app to keep connections alive.)

Fortunately there is a simple workaround to this problem: install the Mobile Terminal app via Cydia and reboot. The Mobile Terminal app has been around for a while and gives you direct command line access as the mobile user. Although Cydia says it only supports IOS v4 to v6, it installs and runs perfectly on IOS v7.x too.

The great thing about Mobile Terminal is that it creates a direct local login session. When you switch the app to the background, this session keeps running even when the Mobile Terminal app is suspended and reset. In fact, if you install adv-cmds via Cydia, you can log in via SSH and see this login session running as a process with the ps command.

So we execute our SOCKS proxy ssh command in Mobile Terminal to set up the tunnel; when the Mobile Terminal app switches to the background, the tunnel stays open in the login session indefinitely, or until you kill the session manually from another command line with the kill command.

If you don’t want to use Mobile Terminal, have a look at the Cydia implementations of screen and autossh.

Now I have full SSH functionality from my (bloody expensive) IOS 7.x computer again!

Here is the pac file I use for my proxy auto config:

function FindProxyForURL(url, host) {
    return "SOCKS localhost:XXXX";
}


HTTP Live Streaming using Quicktime Broadcaster and FFMPEG

It’s 2014 and I am still using the Apple Darwin Media server, but after years of being neglected by Apple it is starting to show its age. One of the main problems is that for Apple i devices there is no native RTSP support, so streaming content from the Darwin server to portable devices is not so easy.

Apple have largely replaced RTSP with HTTP Live Streaming (HLS). HLS basically just cuts up video files into small MPEG video segments that can be downloaded/streamed via a normal website using the HTTP protocol.

This works pretty well for video content that you want to convert from, say, MP4 to HLS: you run a conversion process that transcodes the video into multiple 10-second segments, which you serve up to clients via an .m3u8 reference file on your web server.

If you want to do this with live content then the process is similar, but you will also need a client to transmit the live video to your server so it can be transcoded into segments on the fly.

When I am streaming live RTSP content to my Darwin media server I use the Apple Quicktime Broadcaster (on my Mac) as it’s free and kinda works well. So let’s use Quicktime Broadcaster (QTB) to stream live video to a server running FFMPEG, which will transcode the video into HLS segments that can be served up by Apache (or your web server of choice) to HLS clients.

You can download the Quicktime Broadcaster for Mac here. It hasn’t been updated since 2009, so obviously isn’t high on Apple’s list of supported software anymore, but hey ho, it is free.

Fire up QTB and configure your video and audio settings.

We will be creating a manual unicast connection and the screen shots below show the settings I used.

Set transmission to Manual Unicast. The address is the TCP/IP address of the server that will run FFMPEG. Note the port numbers used for audio and video; this info is saved in the .SDP file.

[Screenshot: QTB transmission settings]

For this demo I am using my webcam as the source video. If you have an external device select it as the video source. Choose your compression options depending on the bandwidth available to you. Enable and configure audio compression in the same way.

[Screenshot: QTB video and audio compression settings]

Click broadcast and the client will attempt to unicast the video data to the server (even though nothing is listening yet).

Export these settings to a .sdp file by clicking File -> Export -> SDP, save the SDP file, and then upload it to your server.

Now we want to run FFMPEG so that it listens on the ports we specified, captures the unicast video and audio data, transcodes it into HLS segments, and stores the HLS data in a publicly accessible area of your web server.

We do this by running FFMPEG with all the options we need. Here is an example command line:

ffmpeg -i  /Dropbox/stream/hls.sdp -acodec libfaac -ac 2 -b:a 128k -vcodec libx264 -vpre libx264-ipod640 -b:v 500k -threads 0 -g 75 -level 3.1 -map 0 -vbsf h264_mp4toannexb -flags -global_header -f segment -segment_time 10 -segment_list hls.m3u8 -segment_list_flags +live -segment_list_entry_prefix http://tv.server.co.uk/media/ -segment_format mpegts stream%05d.ts

Here we are telling FFMPEG to use the SDP file we exported from QTB as the input (I saved it to my Dropbox). Then we are specifying the compression parameters for the transcoded MPEG transport stream (TS) files. You can easily find other examples of various FFMPEG compression options and use them depending on your requirements.

The segmenter options define how big the segments will be and where the .m3u8 file will be created – in this case I specified 10-second segments, and the m3u8 file is created in the folder I ran FFMPEG from. The .m3u8 file and TS segments should be created in, or later moved to, a folder accessible from the internet.

The segment_list_entry_prefix defines the prefix for the TS files in the m3u8 file; this is the configured URL of your web server, including the URI path to your HLS files.

Finally, the name used for the segment files is defined. Note the +live segment_list_flags option – this tells FFMPEG that we will be transcoding live content. The official documentation for all these options can be found here.

Configure your ffmpeg options, start the QTB broadcast and run FFMPEG. If all is working you will see FFMPEG starting to receive and transcode the unicast video (and audio) data.

Now point your iPhone/iPad/iPod or HLS client (e.g. VLC, Safari) at the m3u8 file on your web server; from the options I used above the URL would be:

http://tv.server.co.uk/media/hls.m3u8

You should see your live video stream.

My video stream had about a 60-second lag. Disconnect and reconnect and you will see you are receiving live content – with the 60-second lag. Note that FFMPEG doesn’t automatically remove the old TS segments; you need to handle that yourself.

If it doesn’t work, check FFMPEG for errors; note that some of the segment options need a relatively new version of FFMPEG to function. I downloaded and compiled the latest version of FFMPEG and its dependencies on my Ubuntu server using the guide found here.

Make sure the m3u8 file and MPEG TS segments are accessible from the web, open the m3u8 file in a text editor to check the url being used for each segment.
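
For reference, a live playlist generated with the options above should look roughly like this (segment numbers and exact durations will vary):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:12
#EXTINF:10.000000,
http://tv.server.co.uk/media/stream00012.ts
#EXTINF:10.000000,
http://tv.server.co.uk/media/stream00013.ts
#EXTINF:10.000000,
http://tv.server.co.uk/media/stream00014.ts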

If FFMPEG starts dropping files it means your live data is coming in too fast for FFMPEG to process; you need to look at your compression settings. Try reducing the length of the TS segments from 10 seconds to 2 seconds. Remember, your server must be able to transcode and save the live data within these times, otherwise the stream will start to stutter.

Amazon iFrame X-Frame-Options SAMEORIGIN error

I didn’t realize that Amazon restricted access to product content in iframes with an X-Frame-Options header until yesterday (16.02.2014), when they applied the same iframe restrictions to their admin backend, Amazon Seller Central.

This made me very grumpy.

I created an eBay and Amazon admin website to allow us to consolidate order info from both merchants. It was useful to go directly from this site to the Amazon order in Seller Central by clicking a button; even more useful was showing this in a fancybox iframe so the user didn’t have to leave the admin page. This worked up until yesterday, when iframe requests started being cancelled due to a:

Refused to display …. in a frame because it set ‘X-Frame-Options’ to ‘SAMEORIGIN’.

X-Frame-Options is an HTTP response header used to prevent framing of pages. If the header is present, the browser will refuse to render (cancel) the page in a frame, depending on the value:

DENY – stops all framing
SAMEORIGIN – stops framing except for requests from the website itself.
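
You can check what framing policy a server sends with a few lines of PHP – a throwaway sketch (the URL is an example, and header-name casing can vary between servers):

<?php
$headers = get_headers('http://www.amazon.com/', 1);

if (isset($headers['X-Frame-Options'])) {
    // May be an array of values if the request was redirected.
    print_r($headers['X-Frame-Options']);
} else {
    echo 'No X-Frame-Options header found';
}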

So it looks like Amazon changed the security policy on their customer admin pages (Seller Central) yesterday to match the front-end product pages and block frame requests to content using the SAMEORIGIN header. There is no way around this, so it is no longer possible to frame any Amazon content. Boo!

The workaround is to load the content into a new _blank page and not use iframes – Hurrah!