codecowboy

How can I optimally consume and re-syndicate a REST web service

I need to write an application that consumes a forum's content via a REST API and stores threads and posts. The application will act as a bridge layer between the forum and a third application, which then needs to query/import this data periodically, as close to 'real-time' as possible. The platform is PHP 5.3 / MySQL, probably Symfony with the Zend_Rest client.
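To make the storage side concrete, here is a minimal sketch of the kind of upsert the bridge could perform for each thread it pulls from the API. The endpoint URL, JSON shape and table schema are all my own assumptions, not anything the forum actually exposes; the example uses SQLite so it runs standalone, but the same PDO code works against MySQL:

```php
<?php
// Hypothetical sketch: store forum threads fetched from the REST API in a
// local bridge table. The endpoint, JSON fields and columns are assumptions.

function upsertThread(PDO $db, array $thread)
{
    // Update if the remote id is already known, otherwise insert.
    $stmt = $db->prepare('SELECT COUNT(*) FROM threads WHERE remote_id = ?');
    $stmt->execute(array($thread['id']));
    if ($stmt->fetchColumn() > 0) {
        $stmt = $db->prepare(
            'UPDATE threads SET title = ?, updated_at = ? WHERE remote_id = ?');
        $stmt->execute(array($thread['title'], $thread['updated_at'], $thread['id']));
    } else {
        $stmt = $db->prepare(
            'INSERT INTO threads (remote_id, title, updated_at) VALUES (?, ?, ?)');
        $stmt->execute(array($thread['id'], $thread['title'], $thread['updated_at']));
    }
}

// In the real bridge this page of JSON would come from the REST client, e.g.
// file_get_contents('https://forum.example.com/api/threads?page=1');
$json = '[{"id": 7, "title": "Hello", "updated_at": "2011-01-01 10:00:00"}]';

$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE threads (remote_id INTEGER PRIMARY KEY, title TEXT, updated_at TEXT)');
foreach (json_decode($json, true) as $thread) {
    upsertThread($db, $thread);
}
echo $db->query('SELECT COUNT(*) FROM threads')->fetchColumn(); // 1
```

With MySQL the SELECT/INSERT/UPDATE pair can be collapsed into a single `INSERT ... ON DUPLICATE KEY UPDATE` statement, which also makes the operation atomic.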

My question is: what would be an appropriate, performant architecture for the bridge layer? I imagine I will need to do an initial import of the forum data, which will be slow (it may take hours). The bridge application will also have a front-end for selectively adding the forum messages to the third application and for adding further metadata, e.g. sentiment (was the message positive or negative in tone?). I realise the data import/export could be done with procedural scripts and cron jobs, but I am wondering if there is a better way.
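For the periodic near-real-time side, the usual alternative to re-importing everything is a cron-driven incremental pass that remembers the newest post timestamp it has seen (a "high-water mark") and only processes posts newer than that. A sketch of that pass, where the table names, JSON fields and the idea of a `sync_state` table are all my own assumptions:

```php
<?php
// Hypothetical sketch of the incremental pass a cron job could run every
// minute: keep a high-water mark and import only posts newer than it.

function lastSyncedAt(PDO $db)
{
    $val = $db->query("SELECT value FROM sync_state WHERE name = 'last_post_at'")->fetchColumn();
    return $val !== false ? $val : '1970-01-01 00:00:00';
}

function importNewPosts(PDO $db, array $posts)
{
    $since = lastSyncedAt($db);
    $insert = $db->prepare('INSERT INTO posts (remote_id, body, created_at) VALUES (?, ?, ?)');
    $imported = 0;
    foreach ($posts as $post) {
        if ($post['created_at'] <= $since) {
            continue; // already imported on an earlier run
        }
        $insert->execute(array($post['id'], $post['body'], $post['created_at']));
        $since = max($since, $post['created_at']);
        $imported++;
    }
    // Advance the high-water mark so the next cron run skips these posts.
    $stmt = $db->prepare("UPDATE sync_state SET value = ? WHERE name = 'last_post_at'");
    $stmt->execute(array($since));
    return $imported;
}

$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE posts (remote_id INTEGER PRIMARY KEY, body TEXT, created_at TEXT)');
$db->exec("CREATE TABLE sync_state (name TEXT PRIMARY KEY, value TEXT)");
$db->exec("INSERT INTO sync_state VALUES ('last_post_at', '1970-01-01 00:00:00')");

// In the real job this page would come from the REST client (e.g.
// Zend_Rest_Client) with the high-water mark passed as a query parameter.
$page = array(
    array('id' => 1, 'body' => 'first',  'created_at' => '2011-03-01 10:00:00'),
    array('id' => 2, 'body' => 'second', 'created_at' => '2011-03-01 10:05:00'),
);
echo importNewPosts($db, $page); // 2
echo importNewPosts($db, $page); // 0 on the second run
```

One caveat with timestamp high-water marks: two posts created in the same second can be missed if the job runs between them; tracking the highest post id instead (if the API exposes incremental ids) avoids that.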

Many thanks,