Showing posts with label store. Show all posts

Tuesday, March 27, 2012

Embed and distribute MSDE on CD

Hi,

I am trying to find something like a lightweight version of SQL Server 2000 to store data for an application that will be distributed on a CD.

Since we also have a web front end for this application for when the user is online, we would prefer to reuse the same connection code for both the CD and the web versions.

Any idea what will work?

I have come across MSDE 2000 but was not able to find any documentation on the pros and cons of this product regarding its capacity, the types of data it can handle, security, ...

Please give me some pointers.

Thanks very much.

Baoha.

See the MSDE homepage at http://www.microsoft.com/sql/msde/default.asp and the corresponding links for more information.

Sunday, February 26, 2012

Eliminating Dynamic SQL

I am refactoring stored procedures that use dynamic SQL.
The reason the stored procedures use dynamic SQL is because
the data that is needed comes from another MS SQL database
that resides on the same server instance.
The following is a code example from a stored procedure:
declare @theDatabase sysname;
set @theDatabase = 'SomeDatabase';
declare @thePrimaryKey int;
set @thePrimaryKey = 2;
declare @theSqlString varchar(8000);
set @theSqlString = 'select r.Feild from ' +
    @theDatabase +
    '..TableName r ' +
    'where r.ID = ' +
    rtrim( str( @thePrimaryKey ) );
exec sp_executesql @theSqlString;
The problem is that dynamic sql is EXTREMELY slow! I have
been doing some research and have found a little bit on
sp_linkedservers and OPENQUERY but I have not figured out
how to accomplish what is done above.
Can anyone give me an example of how I can use
sp_linkedservers and OPENQUERY to accomplish this query?
Would it be better, performance wise, to just continue
using dynamic sql?
Any help would be greatly appreciated.
Sincerely,
John Dickey
jpd0861@msn.com

John Dickey wrote:
> I am refactoring stored procedures that use dynamic SQL.
> The reason the stored procedures use dynamic SQL is because
> the data that is needed comes from another MS SQL database
> that resides on the same server instance.
> The following is a code example from a stored procedure:
> declare @theDatabase sysname;
> set @theDatabase = 'SomeDatabase';
> declare @thePrimaryKey int;
> set @thePrimaryKey = 2;
> declare @theSqlString varchar(8000);
> set @theSqlString = 'select r.Feild from ' +
>     @theDatabase +
>     '..TableName r ' +
>     'where r.ID = ' +
>     rtrim( str( @thePrimaryKey ) );
> exec sp_executesql @theSqlString;
> The problem is that dynamic SQL is EXTREMELY slow! I have
> been doing some research and have found a little bit on
> sp_linkedservers and OPENQUERY but I have not figured out
> how to accomplish what is done above.
> Can anyone give me an example of how I can use
> sp_linkedservers and OPENQUERY to accomplish this query?
> Would it be better, performance wise, to just continue
> using dynamic SQL?
If you code the dynamic SQL properly, you can eliminate much of the
performance overhead. In your case, you're using sp_executesql, but are
not taking advantage of its main benefit over EXEC. That is, parameter
binding.
Instead of how it's currently coded, you could use:
set @theSqlString = 'select r.Feild from ' +
    '[' + @theDatabase + ']' +
    '..TableName r ' +
    'where r.ID = @thePrimaryKey'
exec sp_executesql @theSqlString, N'@thePrimaryKey INT', @thePrimaryKey
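For reference, here is a minimal, self-contained sketch of that pattern (not from the original post): sp_executesql expects an nvarchar string, so the variable is declared as nvarchar here, and QUOTENAME guards the database name; the database, table, and column names are simply the ones used above.
declare @theDatabase sysname;
set @theDatabase = 'SomeDatabase';
declare @thePrimaryKey int;
set @thePrimaryKey = 2;
declare @theSqlString nvarchar(4000);
-- only the object name is concatenated; the key value stays a bound
-- parameter, so the cached plan can be reused for different key values
set @theSqlString = N'select r.Feild from ' +
    quotename(@theDatabase) + N'..TableName r ' +
    N'where r.ID = @thePrimaryKey';
exec sp_executesql @theSqlString,
    N'@thePrimaryKey int',
    @thePrimaryKey = @thePrimaryKey;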
I'm not sure why it's necessary, though, given your example. You have a
hard-coded database name on the same server. If you know the database,
you can just fully-qualify the object in the SQL:
Select * from [AnotherDatabase].dbo.[MyTable]
David Gugick
Imceda Software
www.imceda.com

Why do you want to parameterize the name of the target database?
If the name may change and/or you prefer not to hard-code it in SPs
then just create a view or views to reference the other database and
reference only the views in your SPs. That way the database name is
coded only in a few places instead of every SP.
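A minimal sketch of that approach (the OtherDb database, vw_TableName view, and GetFeild procedure names are hypothetical; the table and column names follow the earlier posts):
create view dbo.vw_TableName
as
select r.ID, r.Feild
from OtherDb.dbo.TableName r;
go

create procedure dbo.GetFeild
    @thePrimaryKey int
as
    -- the database name now lives only in the view definition
    select v.Feild
    from dbo.vw_TableName v
    where v.ID = @thePrimaryKey;
go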
David Portas
SQL Server MVP
--

Eliminating Dynamic SQL

I am refactoring stored procedures that use dynamic SQL.
The reason the stored procedures use dynamic SQL is because
the data that is needed comes from another MS SQL database
that resides on the same server instance.
The following is a code example from a stored procedure:
declare @theDatabase sysname;
set @theDatabase = 'A database that I get based on criteria at runtime.';
declare @thePrimaryKey int;
set @thePrimaryKey = 2;
declare @theSqlString varchar(8000);
set @theSqlString = 'select r.Feild from ' +
    @theDatabase +
    '..TableName r ' +
    'where r.ID = ' +
    rtrim( str( @thePrimaryKey ) );
exec sp_executesql @theSqlString;
The problem is that dynamic sql is EXTREMELY slow! I have
been doing some research and have found a little bit on
sp_linkedservers and OPENQUERY but I have not figured out
how to accomplish what is done above.
Can anyone give me an example of how I can use
sp_linkedservers and OPENQUERY to accomplish this query?
Would it be better, performance wise, to just continue
using dynamic sql?
Any help would be greatly appreciated.
Sincerely,
John Dickey
jpd0861@msn.com

If you have the same table structures in each database then you can
create a partitioned view across them and reference the partitioned
view instead. Whether this is the right solution though may depend on
just why you have the data split across multiple DBs in the first
place. It doesn't seem like a very practical architecture if it forces
you to write dynamic SQL in all your production code.
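A minimal sketch of such a view, assuming two hypothetical databases (DatabaseA and DatabaseB) on the same instance with an identical TableName structure; for SQL Server to treat it as a true partitioned view, each member table would also need a CHECK constraint on the partitioning column:
create view dbo.vw_AllTableName
as
select 'DatabaseA' as SourceDb, r.ID, r.Feild
from DatabaseA.dbo.TableName r
union all
select 'DatabaseB', r.ID, r.Feild
from DatabaseB.dbo.TableName r;
go

-- procedures can now query one view instead of building dynamic SQL
select v.Feild
from dbo.vw_AllTableName v
where v.SourceDb = 'DatabaseA' and v.ID = 2;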
David Portas
SQL Server MVP
--

Thank you for your reply David.
The reason that we have different databases is because the
databases are for different applications that my
application interfaces with. They happen to be Great Plains
accounting databases that I am trying to retrieve data from.
Can you give me more information about the partitioned view
and how I can set that up?
John Dickey

>--Original Message--
>If you have the same table structures in each database then you can
>create a partitioned view across them and reference the partitioned
>view instead. Whether this is the right solution though may depend on
>just why you have the data split across multiple DBs in the first
>place. It doesn't seem like a very practical architecture if it forces
>you to write dynamic SQL in all your production code.
>--
>David Portas
>SQL Server MVP
>--
>.
>

It's all in Books Online:
http://msdn.microsoft.com/library/e...des_06_17zr.asp
David Portas
SQL Server MVP
--

Sunday, February 19, 2012

Efficient way to store game data: help?

Hello everyone!

So I am trying to wrap my head around the most efficient way to store some game data in SQL Server 2005.

I won't list all of my tables or columns as there would just be too many, but I will try to be as exact as I can.

Basically, let's say I have 3000 monsters in my game and over 8000 items, some of which are monster loot.

Each monster could have 1 drop item or 200 different item drops, depending on the monster.

So I am just trying to find the best method of creating a loot table.

So I have the following in mind.

[MonsterTable]
monsterID;
monsterName;
etc...

[ItemTable]
itemID;
itemName;
etc..

[LootTable]
lootTableID;
refMonsterID;
refItemID;
So I'm not worried about how I would get all of the data in there; I'm just wondering whether this simple method of using reference IDs between tables would be the best approach for extracting the data when needed.

Any ideas or thoughts are more than welcome; if you have any questions, just ask. Thanks for any guidance or information you can provide on a more efficient approach.

From the situation / information provided, this is an appropriate and common design.
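A minimal sketch of that design, assuming integer surrogate keys; the table and column names follow the post above, and everything else is illustrative:
create table MonsterTable (
    monsterID int identity(1,1) primary key,
    monsterName varchar(100) not null
);

create table ItemTable (
    itemID int identity(1,1) primary key,
    itemName varchar(100) not null
);

-- one row per (monster, item) drop pairing
create table LootTable (
    lootTableID int identity(1,1) primary key,
    refMonsterID int not null references MonsterTable(monsterID),
    refItemID int not null references ItemTable(itemID)
);

-- all possible drops for a given monster
select i.itemName
from LootTable l
join ItemTable i on i.itemID = l.refItemID
where l.refMonsterID = 42;

An index on LootTable(refMonsterID) would keep the per-monster loot lookup cheap as the table grows.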

Jens K. Suessmeyer.

http://www.sqlserver2005.de