On Mon, 26 May 2008 11:00:52 -0400 "Christopher M. Jones" cjones@partialflow.com wrote:
I need some consultation. I've used Drupal for lightweight deployments (mostly personal blogs), but I've got a potential client who is interested in Drupal for e-commerce. We're thinking through a couple of issues, and we'd like some advice:
Our client uses MSSQL for their inventory, etc. Since Drupal doesn't support anything but MySQL and pgsql, and since their tables would differ from Drupal's regardless, we're trying to work out the best way to integrate Drupal into their existing infrastructure. We've considered writing a script to pull data from one database and use it to update the other. We've also considered hacking the relevant modules (Ubercart) to do this on demand.
Does anyone have experience with this?
If you're planning to use Ubercart it may not be easy to convince it to use two DBs. If you don't need extreme customisation of the purchase process it may still be a "bargain". I don't know Ubercart deeply enough to say how hard it would be to make it use two DBs, but I doubt it's going to be easy.
Depending on the version you're developing for, there have been a few attempts to make Drupal work with MSSQL. I remember there was a "big" patch and Siren (whose project page is no longer available). I'd say that running all of Drupal on MSSQL with the currently supported releases is going to cause you more headaches than it avoids. It could be different for D7.
I've written an e-commerce engine inside Drupal. In its infancy I was using pgsql for the CMS and all the e-commerce processing, and MSSQL for the catalogue. I wrote a couple of classes to work around the lack of LIMIT in MSSQL and keep using Drupal's paging facilities. It was not a lot of work.
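My classes were PHP, but the core of the workaround is just query rewriting. As a hedged sketch (in Python to keep it short, with made-up names), here is the classic nested-TOP trick that emulates LIMIT/OFFSET on MSSQL versions without ROW_NUMBER():

```python
def mssql_page_query(table, key, page, per_page):
    """Build an MSSQL query emulating LIMIT/OFFSET via nested TOP.

    Takes the first (page+1)*per_page rows in ascending key order,
    reverses the order and keeps the last per_page of them, then
    reverses again to restore the original ordering.
    Caveat: on the final, partially-filled page this can overlap
    the previous page, since the inner TOP may return fewer rows.
    """
    inner = (page + 1) * per_page
    return (
        "SELECT * FROM ("
        "SELECT TOP {outer} * FROM ("
        "SELECT TOP {inner} * FROM {table} ORDER BY {key} ASC"
        ") AS a ORDER BY {key} DESC"
        ") AS b ORDER BY {key} ASC"
    ).format(outer=per_page, inner=inner, table=table, key=key)
```

The same string-building logic drops straight into a PHP pager class; only the SQL matters.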
Anyway, it worked to a certain extent: you still get themed tables for displaying products, you can use the template system, FAPI... and you have a CMS. If you're planning to develop your own e-commerce software, putting most of the business logic in PHP and relying just on what Drupal can offer, you'll have to implement the consistency and locking logic on your own anyway; so having your catalogue in one DB and baskets, orders etc. in another may not make much difference, since you won't be exploiting what an RDBMS can really offer.
As things grew, I decided to a) import the catalogue into pgsql and b) put most of the business logic in the DB. On two similar boxes (dual Xeons roughly 5 years old, RAID1, 4GB RAM), with no tuning at all, one running MSSQL and the other pgsql, exporting and importing a catalogue of 600K articles and a tree of tens of thousands of categories takes roughly 5 minutes, including the FTP transfer. If you can live with a 5-minute "down" window and missing referential integrity on item ids, it doesn't seem a big cost. Otherwise pgsql has a nice contrib module (dblink) that lets you perform "cross DB" queries... and that's what I'm going to implement next, as soon as I've finished with other features that take precedence over a more elegant import procedure.
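The import side of such a sync is mostly about batching so you never hold the whole catalogue in memory. A minimal sketch (Python rather than PHP, purely illustrative; `insert_batch` stands in for whatever bulk-loads rows into pgsql, and the batch size is arbitrary):

```python
from itertools import islice

def chunks(rows, size):
    """Yield successive lists of up to `size` rows from any iterable."""
    it = iter(rows)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

def sync_catalogue(source_rows, insert_batch, batch_size=1000):
    """Copy rows exported from the source DB into the target in batches.

    `source_rows` would come from the MSSQL export; `insert_batch`
    would wrap a multi-row INSERT (or COPY) on the pgsql side.
    Returns the number of rows transferred.
    """
    total = 0
    for batch in chunks(source_rows, batch_size):
        insert_batch(batch)
        total += len(batch)
    return total
```

Wrapping the whole load in one transaction on the pgsql side keeps the "down" window atomic: either the new catalogue lands completely or the old one stays visible.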
I had to implement a quite unusual purchase process, so I couldn't go the Ubercart route. If you have a more traditional purchase process, importing the catalogue into Ubercart may be the easiest, most reliable path, and it comes with a lot of plug-ins, discount methods etc. that would take you years to develop on your own.
Even if you were working with the same DB on the back-end and front-end, you'd still have to move data around. If you don't have a very dynamic catalogue you could easily sync the two DBs once a day... If, e.g., availability of items is critical to your business, you may pull just the stock quantities on demand (you may be lucky enough that there's already a plug-in for Ubercart that does exactly that).
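Pulling quantities on demand usually still wants a short-lived cache, so a burst of page views doesn't hammer the back-end DB. A hypothetical sketch (again Python for brevity; `fetch_qty` stands in for the actual MSSQL query, and the TTL is a guess):

```python
import time

def make_stock_getter(fetch_qty, ttl=60, clock=time.monotonic):
    """Wrap a per-SKU stock query with a time-to-live cache.

    `fetch_qty(sku)` is assumed to hit the inventory DB; results are
    reused for `ttl` seconds so repeated page views don't re-query.
    """
    cache = {}  # sku -> (quantity, fetched_at)
    def get_stock(sku):
        now = clock()
        hit = cache.get(sku)
        if hit is not None and now - hit[1] < ttl:
            return hit[0]
        qty = fetch_qty(sku)
        cache[sku] = (qty, now)
        return qty
    return get_stock
```

A TTL of a minute is plenty for "is this still in stock?" displays, while the authoritative check at checkout time can bypass the cache entirely.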
Summing it up:
Since I think most of Ubercart's coherency checks and constraints are made in PHP on the client side (the DB client), it doesn't make much difference if you pull part of the data directly from MSSQL "on demand" (look at FreeTDS and the sybase_* functions [1]). But I think pulling all of it that way is going to make things too tricky and hard to keep under control; anyway, I'd pull data on demand only if I had a very dynamic catalogue that really needs to be up to date.
Building your own e-commerce engine is not going to pay off unless you really need a custom purchase process. But if you have to: if you put your business logic and coherency checks in PHP, you may find it interesting, and not too hard, to keep a foot in two DBs inside Drupal. If you feel more comfortable delegating locking and coherency checks to the DB, then dealing with two DBs is going to be a PITA; on the other hand, even syncing a reasonably large catalogue into a single DB is not going to cost you that much.
[1] Most Linux distros come with Sybase drivers for PHP but lack precompiled MSSQL drivers. You could compile them, but I didn't find the extra maintenance involved worth it; the Sybase drivers are generally up to the task. Here you can find some things that may come in useful if you plan to connect to MSSQL from a Linux box with PHP:
http://www.webthatworks.it/d1/page/odbc,_freetds_and_microsoft_sql_(and_php)
http://www.webthatworks.it/d1/page/nested_query,_connection_sharing_and_php_...
http://www.webthatworks.it/d1/page/freetds_and_sqsh_debian