SQL Server Sync Framework with SQL DB's: Getting Started

ecbunoof · posted on 2023-11-16

I have an app that uses SQL Server Enterprise to store all data in 4 different databases. I needed to build in the ability for my users to work "offline". I accomplished this through merge replication to local SQL Server Express installs for everyone. This "works", but it feels like the sledgehammer approach.

For instance, I am replicating all 14,000 people to every database when any individual user may only ever interact with 100 or so. That's not even counting the fact that they would never interact with more than 5 or so between connections to the central databases.

What I am looking for is tips, pointers, and maybe a nice tutorial on Sync Framework 2 (with databases). First-hand accounts of what worked for you and why would also be most welcome. I have yet to come across a clear, concise (not to mention current) tutorial for working with Sync Framework.

My specifics are MS SQL Server 2005 or 2008, any edition, and any .NET version (3.5 or 4). The current data layer is all LINQ to SQL. There are no stored procedures currently in use.

My thoughts so far are to sync only each worker's assigned caseloads and associated data. Ideally, we would go straight to a "check in/check out" format where they select the members they plan to visit and it then syncs the necessary data.

As a bonus, could someone tell me what this is referred to as? I come across "occasionally connected" all the time, but that seems inaccurate. It would be more accurate to call these clients "occasionally DIS-connected", thoughts?


muk1a3rh1#

We've been using Sync Framework in a couple of our projects (one with SQL Server, one with PostgreSQL), so I can say that it works fairly well.

Check out this walkthrough application to get an idea of what you can do.

http://code.msdn.microsoft.com/sync/Release/ProjectReleases.aspx?ReleaseId=4835

This will show you how to synchronize data between multiple SQL Server databases. You can also customize the incremental-change stored procedures to take custom parameters and filter data based on them (for instance, the people the client user is planning to visit).
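The filtering idea above is independent of the stored-procedure plumbing, so here is a minimal language-agnostic sketch of it in Python. The table shape, column names, and the `enumerate_changes` function are all invented for illustration; in Sync Framework the equivalent filter would live in the generated incremental-select procedures as an extra parameter.

```python
from datetime import datetime

# Hypothetical stand-in for the server-side "enumerate changes" procedure:
# return only rows assigned to this caseworker and modified since the last sync.
people = [
    {"id": 1, "caseworker": "alice", "last_update": datetime(2010, 5, 2)},
    {"id": 2, "caseworker": "bob",   "last_update": datetime(2010, 5, 3)},
    {"id": 3, "caseworker": "alice", "last_update": datetime(2010, 4, 1)},
]

def enumerate_changes(rows, caseworker, since):
    """Filter the change set down to one worker's caseload."""
    return [r for r in rows
            if r["caseworker"] == caseworker and r["last_update"] > since]

changes = enumerate_changes(people, "alice", datetime(2010, 4, 15))
```

Only row 1 comes back: row 2 belongs to another worker and row 3 predates the last sync, which is exactly the reduction from "all 14,000 people" to "the handful this user touches".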

I will also suggest that you use a reflector to decompile the Sync Framework code in case you see weird errors - sometimes it is not possible to figure out where the error is until you see the exception getting caught in the framework code. Red Gate's .NET Reflector works perfectly for me!

Let me know if you need more help!


bqujaahr2#

I have a couple of suggestions and hints that may or may not be obvious / helpful.

To me this sounds like a problem that can be split into 3 separate aspects:

Synchronisation process

  • You should definitely ensure that all rows have some sort of "last_update" column so that your synchronisation process can efficiently and reliably determine which data is already up to date - that way you can be a lot more aggressive about the number of records you synchronise.
  • I would avoid a complex synchronisation process and simply use the brute-force approach wherever possible. 14,000 records doesn't sound like that many to me - with optimisations you may find that it's possible to synchronise all changes made between connections in a reasonable time. If not, then I'd still probably be quite liberal about what you synchronise, to avoid users working with out-of-date data without realising.
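The "last_update" idea above boils down to anchor-based incremental sync. Here is a minimal sketch, assuming invented table and function names (nothing here is a Sync Framework API): the client remembers when it last synchronised and pulls only rows modified after that anchor.

```python
from datetime import datetime

server = [
    {"id": 1, "name": "Ann", "last_update": datetime(2010, 1, 10)},
    {"id": 2, "name": "Ben", "last_update": datetime(2010, 2, 20)},
]
local = {}                               # id -> row, the client's copy
last_anchor = datetime(2010, 1, 15)      # when this client last synchronised

def pull_changes(server_rows, anchor):
    """Return rows modified after the anchor, plus the new anchor."""
    now = datetime(2010, 3, 1)           # in practice: the server clock at sync time
    changed = [r for r in server_rows if r["last_update"] > anchor]
    return changed, now

changed, new_anchor = pull_changes(server, last_anchor)
for row in changed:
    local[row["id"]] = row               # upsert into the local store
last_anchor = new_anchor
```

Row 1 is skipped because it predates the anchor; only row 2 travels over the wire, which is what makes the brute-force approach affordable even at 14,000 rows.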

Pushing changes made in offline mode

If at all possible I would simply disallow changes in offline mode - if that's not possible then you should treat pushing changes as a separate (and probably fairly complex) process in its own right.

Without knowing more about the application it's difficult to make good suggestions, as the synchronisation process is very business-dependent; however, some things to consider are:

  • Should users be able to lock (or check out) an item to prevent other people from changing it while they are working offline?
  • Should the lock be overridable?
  • If a lock is overridden, what should happen when someone attempts to save changes? (A merge process seems like a sensible choice.)
  • Should someone be able to edit an item that they don't have checked out?
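One possible answer to the questions above can be sketched in a few lines. Everything here is invented for illustration (class and function names included): an item carries a lock holder, the lock can be taken by force, and a save is only clean if the saver still holds the lock.

```python
class CaseItem:
    """A checkable-out item; locked_by records who holds it, if anyone."""
    def __init__(self, item_id):
        self.item_id = item_id
        self.locked_by = None

def check_out(item, user):
    """Take the lock if it is free; refuse otherwise."""
    if item.locked_by is None:
        item.locked_by = user
        return True
    return False

def override_lock(item, user):
    """Forcibly take the lock; the previous holder will have to merge."""
    item.locked_by = user

def save(item, user):
    """Saving is clean only if the user still holds the lock."""
    return "saved" if item.locked_by == user else "merge required"

case = CaseItem(42)
check_out(case, "alice")                  # alice checks the case out to work offline
bob_got_it = check_out(case, "bob")       # refused: alice holds the lock
override_lock(case, "bob")                # the lock is overridable
result_for_alice = save(case, "alice")    # alice must now merge her changes
```

The interesting design choice is the last line: once a lock has been overridden, the original holder's save degrades into a merge rather than failing outright.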

If you do decide to implement a complex offline-changes process then you might want to take a look at the workflows used in common distributed VCSs for inspiration.

Changing the offline data store

You might find using SQL Server Compact or SQLite as your local data store a more elegant solution (it would certainly make the installation process easier); if you are using LINQ then I'd probably swing in favour of SQL Server Compact, as it definitely has LINQ to SQL support.

I would focus on the first two aspects, as this last change delivers the least in terms of end-user improvements and is probably the most work - the first two are completely achievable while still using SQL Server Express as the local data store.


llycmphe3#

http://code.msdn.microsoft.com/sync/Release/ProjectReleases.aspx?ReleaseId=3422 This is a nice walk-through using SQL Server CE. It should work great for what you are trying to do and has a much smaller footprint. It seems to me that the functionality you lose will not hamper what you are trying to do in the least.


lhcgjxsq4#

It sounds to me like Microsoft Sync may be a good option for you. There is an overview of synchronizing databases here: http://msdn.microsoft.com/en-us/sync/bb887608.aspx. Take a look at it and see if it meets your needs. It sounds like the perfect solution to your problem.


0yycz8jy5#

This is essentially called Distributed Computing.

In any such system that I've worked on, I've typically used an N-tier architecture to allow a client program to connect occasionally to the database server on the back end for CRUD operations. Any work that the client does is considered connectionless, as they do not interact with the server until they save their changes.

Using this kind of approach you should be able to make an application that allows them to connect for a moment to get some data, perform some work on it, and then propagate any changes (CRUD) back to the server (database).
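The connect-work-propagate pattern above can be sketched as a local change journal that is replayed on save. This is a minimal illustration with invented names, not any particular framework's API:

```python
class OfflineSession:
    """Holds a local working copy plus a journal of pending CRUD operations."""
    def __init__(self, snapshot):
        self.data = dict(snapshot)   # local copy, keyed by id
        self.journal = []            # operations made while disconnected

    def update(self, key, value):
        self.data[key] = value
        self.journal.append(("update", key, value))

    def delete(self, key):
        self.data.pop(key, None)
        self.journal.append(("delete", key, None))

    def push(self, server):
        """Replay the journal against the server, then clear it."""
        for op, key, value in self.journal:
            if op == "update":
                server[key] = value
            elif op == "delete":
                server.pop(key, None)
        self.journal.clear()

server_db = {1: "Ann", 2: "Ben"}
session = OfflineSession(server_db)   # "connect for a moment to get some data"
session.update(1, "Anna")             # disconnected work...
session.delete(2)
session.push(server_db)               # ...propagated to the server on save
```

Until `push` runs, the server is untouched - which is exactly the "connectionless until they save" behaviour described above.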

I would have to say that using merge replication with local SQL Express installs is for sure a sledgehammer approach. Don't feel too bad about it; we all do it from time to time :P.

EDIT

I have not used Sync Framework myself, but it seems nice. Check out the Sync Framework Developer Center for some more information on it.


gcuhipw96#

The question is: does your app need to run on the device with locally stored, disconnected data, or can you create a mobile version of a web front end because users are assumed to always be connected to a server when working with the data?

The projects I worked on (a long time ago) used the Compact Framework and ActiveSync with either an Access or SQL Server back end, because it was too expensive to have each device connected all the time like a phone. Users took the device into the field, accessed the data from the disconnected database, made their modifications, and synced when they got back to the office.

If that's what you're doing, you can use each device's ID to identify which cases/rows it needs. But if you're thinking of developing a custom sync solution, I think you're fighting the wrong battle - that's been done already. Use a proven sync solution, limitations and all, and focus on the criteria you can use to limit the synced data.


q3aa05257#

It seems that you need more control over the data-merging options. With this requirement I can see there are 2 types of data: data that belongs to a member (a case belongs to a member), and shared data (reference/master data).

I would use a mixture of these techniques in this case to keep things reasonably fast.

For instance, master and reference data changes are not that frequent. Therefore we need not sync very often, but we do need control over changes, and they should/must be made in a central location. In this case, I would make the changes on my central DB and use the sledgehammer approach.

For transactional data, assuming it is one-directional - in this case a member can only change cases/rows that belong to him/her - this can be implemented in many ways.

As mentioned above, you can use the Sync Framework or ActiveSync.

Since the constraints are not as strict as on PDAs or palmtops - we have laptops - we can try a manual approach as well. Implement a functionality/process routine to connect and push data that has been created, updated, or deleted (in this case we have to maintain a flag at the DB level) using bulk-copy operations - the most common approach in this scenario. You would need to provide a button for members to click to invoke it, or a service to poll for connectivity and start the sync routine automatically.
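The flag-based routine above can be sketched concretely. This is an illustrative sketch only - the flag letters (I/U/D), column names, and `push_changes` function are all invented, and a dict stands in for the bulk-copy batch:

```python
# Each local row carries a change flag set while offline:
# "I" = inserted, "U" = updated, "D" = deleted, None = unchanged.
local_rows = [
    {"id": 1, "name": "Ann", "flag": None},
    {"id": 2, "name": "Ben", "flag": "U"},
    {"id": 3, "name": "Cal", "flag": "I"},
    {"id": 4, "name": "Dee", "flag": "D"},
]

def push_changes(rows, server):
    """Apply flagged rows to the server, then clear the flags."""
    for row in rows:
        if row["flag"] in ("I", "U"):
            server[row["id"]] = row["name"]   # upsert (the bulk-copy batch)
        elif row["flag"] == "D":
            server.pop(row["id"], None)       # delete pushed separately
        row["flag"] = None                    # change has been propagated

server_db = {1: "Ann", 2: "Ben", 4: "Dee"}
push_changes(local_rows, server_db)           # the "button click" or poll trigger
```

Clearing the flag only after the push succeeds is what makes the routine safe to re-run if connectivity drops mid-sync.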

/KP


bq3bfh9z8#

After many days of struggling with the Sync Framework I'm considering ditching it and just writing some simple WCF services and code. My requirements are fairly simple: one-way synchronisation from SQL 2008 to SQL CE on mobile devices. In my experience it's very difficult to customise (only sync some fields, etc.) and extremely slow and inefficient. I think Microsoft needs to do some more work to make it a bit easier to use.
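For what it's worth, the "simple service" alternative can be very small. Here is a sketch of a one-way pull that returns only the fields the device needs, keyed on a version high-water mark - everything here (row shape, `get_changes`, the `version` column) is invented for illustration, not a WCF or Sync Framework API:

```python
server_rows = [
    {"id": 1, "name": "Ann", "notes": "long text", "version": 3},
    {"id": 2, "name": "Ben", "notes": "long text", "version": 7},
]

def get_changes(since_version, fields=("id", "name", "version")):
    """Service method: only selected fields of rows newer than the mark."""
    return [{f: r[f] for f in fields}
            for r in server_rows if r["version"] > since_version]

# Device side: apply the changes and advance the high-water mark.
device, mark = {}, 3
for row in get_changes(mark):
    device[row["id"]] = row["name"]
    mark = max(mark, row["version"])
```

Because the service projects only the requested fields, this naturally gives the "only sync some fields" customisation that was hard to get out of the framework.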

Cheers

Mark


wwwo4jvm9#

Have you attempted to filter your merge replication articles at the row level yet? This would reduce your 14k rows to something manageable.
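The effect of a row filter can be illustrated with a quick sketch. In merge replication this would be the article's row filter clause (typically comparing a column against the subscriber's identity); the names and numbers below are invented to mirror the question's scenario:

```python
# 14,000 people, of whom roughly 1 in 100 belongs to this worker's caseload.
people = [
    {"id": i, "caseworker": "alice" if i % 100 == 0 else "other"}
    for i in range(14000)
]

def filtered_article(rows, subscriber):
    """Only rows matching the subscriber's predicate replicate."""
    return [r for r in rows if r["caseworker"] == subscriber]

subset = filtered_article(people, "alice")
```

The subscriber receives 140 rows instead of 14,000 - the same merge replication machinery, just scoped to each worker's caseload.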
