Tuesday, October 4, 2011

Ajax - Reminder for me - Multiple calls

This is more a reminder for me than a useful post.

The idea is to call different backend programs and have them live-update separate divs on a page. Here is the code that does it; it works on Windows in both Firefox and Google Chrome.


<html>
<head>
<style>
  #time { position: absolute; top: 100px; left: 10px; border-left: 2px solid grey; border-bottom: 2px solid black; width: 200px; height: 200px }
  #blob { position: absolute; top: 100px; left: 300px; border-left: 2px solid gray; border-bottom: 2px solid black; width: 200px; height: 200px }
</style>

<script type="text/javascript">
// Fetch myURL in the background and drop the response into the element with id myDIV
function ajaxFunction(myURL, myDIV)
{
  var xmlHttp;
  try
  {
    // Firefox, Opera 8.0+, Safari, Chrome
    xmlHttp = new XMLHttpRequest();
  }
  catch (e)
  {
    // Internet Explorer: two possible ActiveX versions, try both
    try
    {
      xmlHttp = new ActiveXObject("Msxml2.XMLHTTP");
    }
    catch (e)
    {
      try
      {
        xmlHttp = new ActiveXObject("Microsoft.XMLHTTP");
      }
      catch (e)
      {
        alert("Your browser does not support AJAX!");
        return false;
      }
    }
  }
  xmlHttp.onreadystatechange = function()
  {
    if (xmlHttp.readyState == 4)
    {
      document.getElementById(myDIV).innerHTML = xmlHttp.responseText;
    }
  };
  xmlHttp.open("GET", myURL, true);
  xmlHttp.send(null);
}
</script>
</head>

<body>
<form name="myForm">
<p>Magic updating data, go edit the file</p>
<div id="time"></div>
<div id="blob"></div>
</form>
<script>setInterval("ajaxFunction('cgi-bin/time.pl','time')", 5000);</script>
<script>setInterval("ajaxFunction('cgi-bin/stuff.pl','blob')", 5000);</script>
</body>
</html>
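
The page just polls two CGI scripts every five seconds. The original post doesn't show them, so here is a rough sketch of what a hypothetical cgi-bin/time.pl might look like (plain Perl, printing an HTML fragment that ends up in the div via innerHTML):

#!/usr/bin/perl
# Hypothetical backend for the 'time' div - not part of the original post.
use strict;

print "Content-type: text/html\n\n";             # Minimal CGI header
print "The time is: ", scalar localtime(), "\n"; # Whatever is printed appears in the div

cgi-bin/stuff.pl would be the same idea, e.g. printing the contents of whatever file you are editing, which is what the "go edit the file" line above is hinting at.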

Sunday, October 2, 2011

Perl DBI and Sybase IMAGE fields

Sybase IMAGE fields and TEXT fields can be a right pain when working with Perl and the ct_ libraries. The manual pages are a little rough and don't give full examples, often missing out essential parts. So here it is in one section.

NOTE: Your LANG variable must be unset!
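
If you would rather not rely on remembering to unset it in the shell each time, one option (my own habit, not something from the DBD::Sybase docs) is to clear it at the top of the script, before the connection is made:

# Assumption: clearing LANG here, before DBI->connect initialises the Sybase
# client libraries, is enough; otherwise run "unset LANG" in the shell first.
delete $ENV{LANG};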

I noticed whilst investigating the issue that Sybase will allow dumps up to a certain size; e.g. the /etc/passwd file will quite happily fit into an IMAGE field if you use the following:


use strict;
use DBI;

# Open the connection to the database
my $dbh = DBI->connect('dbi:Sybase:server=STEVE;database=test', 'sa', '') || die "Can't connect";

sub Insert
{
    # Insert data
    open(FH, "/etc/passwd") || die "Can't open /etc/passwd";
    local $/;                            # Slurp mode to suck the whole file into one scalar
    my $contents = <FH>;                 # Read in the data
    close FH;                            # local $/ puts the newline mode back when the sub exits
    $contents = unpack("H*", $contents); # Hex encode the entire file
    $dbh->do("INSERT INTO data VALUES(4,'$contents')"); # Note the single quotes '
    print "Insert done\n";
}

sub GetData
{
    # Get data
    my $sth = $dbh->prepare("SELECT * from data where id=4");
    $sth->execute();
    open(OFH, ">mytcap") || die "Can't open output file";
    my $line;
    while ( $line = $sth->fetchrow_hashref() )
    {
        my $data = $line->{'file'};
        $data = pack('H*', $data);       # DBD::Sybase hands IMAGE data back as a hex string, so pack it back into the stored value
        while ( $data =~ /(.{2})/g )     # Take each 2 characters from the hex code
        {
            print OFH chr(hex($1));      # Turn the hex back into text and write it out
        }
    }
    close(OFH);
    $sth->finish();
}

Insert();
GetData();
$dbh->disconnect();
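
For reference, both of these examples assume a table roughly like the one below; the post never shows the schema, so the exact column types are my guess:

use strict;
use DBI;

# Hypothetical one-off setup: an integer id plus an IMAGE column called 'file'
my $dbh = DBI->connect('dbi:Sybase:server=STEVE;database=test', 'sa', '') || die "Can't connect";
$dbh->do("CREATE TABLE data (id int, file image NULL)");
$dbh->disconnect();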

However, if the data is larger, e.g. the /etc/httpd/conf/httpd.conf file, then you start to see complaints from Sybase saying that the maximum size is only 30. That is, a plain INSERT can't move the pointer on to the next available page for the IMAGE data, which led to the need for the ct_ functions, as below:

use strict;
use DBI;

my $dbh = DBI->connect('dbi:Sybase:server=STEVE;database=test', 'sa', '') || die "Can't connect";

sub Insert
{
    # Insert data
    $dbh->do("INSERT INTO data VALUES(4,'0xab')"); # Put some rubbish in the IMAGE field so the row exists
}

sub Update
{
    open(FH, "/etc/httpd/conf/httpd.conf") || die "Can't open httpd.conf"; # A file too large for a plain INSERT
    local $/;                            # Slurp mode
    my $contents = <FH>;
    close FH;                            # local $/ puts the newline mode back when the sub exits
    $contents = unpack("H*", $contents); # Convert it to hex

    my $sth = $dbh->prepare("select file from data where id=4"); # Grab the inserted new row
    $sth->execute();
    while ( $sth->fetch )                # Fetch each row
    {
        $sth->syb_ct_data_info('CS_GET', 1); # Get the text pointer for column 1
    }
    $sth->syb_ct_prepare_send();
    # Tell Sybase how much data we plan to send and turn logging on
    $sth->syb_ct_data_info('CS_SET', 1, {total_txtlen => length($contents), log_on_update => 1});
    $sth->syb_ct_send_data($contents, length($contents)); # Send the data
    $sth->syb_ct_finish_send();
}

sub GetData
{
    # Get data
    my $data;
    my $sth = $dbh->prepare("SELECT datalength(file) AS len from data where id=4");
    $sth->execute();
    my $length = ${$sth->fetchrow_hashref()}{'len'}; # Get the length of the IMAGE data
    $sth->finish();
    $dbh->{LongReadLen} = $length + 1;   # Tell DBI how much data we plan to fetch

    $sth = $dbh->prepare("SELECT id,file from data where id=4"); # The record we want
    $sth->{syb_no_bind_blob} = 1;        # Don't bind the IMAGE column; fetch it with syb_ct_get_data instead
    $sth->execute();
    my ($len, $d);
    while ( $d = $sth->fetch )           # Get an array ref of the row
    {
        $len = $sth->syb_ct_get_data(2, \$data, 0); # Fetch all the data; $len contains the number of bytes
        # NOTE: \$data is the reference that will capture the data
        while ( $data =~ /(.{2})/g )     # Do our favourite conversion on the returned data
        {
            print chr(hex($1));          # Print out the converted characters
        }
    }
    $sth->finish();
}

Insert();
Update();
GetData();
$dbh->disconnect();

Note that in these examples I use Perl's unpack function to put the data into hex form before inserting it into the database, so that no quote or special-character conversions are required. On retrieving the data I take every 2 hex digits, convert them to a number with hex, and then use chr to get the real character back.
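
As a standalone illustration of that round trip, with nothing Sybase specific in the way, this is the encode/decode step on its own:

use strict;

my $raw = "Hello, world!\n";       # Pretend this is the file contents
my $hex = unpack("H*", $raw);      # Encode: every byte becomes 2 hex digits, so no quoting worries
print "$hex\n";                    # Prints 48656c6c6f2c20776f726c64210a

my $decoded = '';
while ( $hex =~ /(.{2})/g )        # Walk the hex string 2 digits at a time
{
    $decoded .= chr(hex($1));      # hex() gives the number, chr() gives the character back
}
print $decoded;                    # Prints "Hello, world!" again
# pack("H*", $hex) would do the same decode in one go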