Author  Post
sokrates
Hält's aus hier
Posts: 4

Posted: Thu 29.11.07 07:37
Hello, sorry for not writing in German, but my written German is really poor. I was hoping someone here could help me out: I am trying to upload some scripts to my SQL database, but the problem is that reading the file is too slow.
The file is structured like this:

|
script
|
script
|
script

and so on...
so my code is very simple:

public void Uploade_file()
{
    TextReader tr = new StreamReader(txtPath.Text);
    string input = null;
    commands_script = "";
    while ((input = tr.ReadLine()) != null)
    {
        if (input == "|")
        {
            if (commands_script != ";")
            {
                MySqlConnection conn = new MySqlConnection(connectionString);
                MySqlCommand cmd = new MySqlCommand();
                cmd.CommandText = commands_script;
                cmd.Connection = conn;
                conn.Open();
                cmd.ExecuteNonQuery();
                conn.Close();

                commands_script = "";
            }
        }
        else
        {
            commands_script = commands_script + input;
        }
    }
    tr.Close();
    MessageBox.Show("Done");
}



But it is taking forever to upload. Does anyone have an idea how to make it faster?
JüTho
Posts: 2021
Thanks received: 6

Win XP Prof
C# 2.0 (#D for .NET 2.0, plus Firebird); previously Delphi 5 and Delphi 2005 Pro
Posted: Thu 29.11.07 11:00
Hello,

you are using the standard way to work with a database: create DbConnection, create DbCommand, conn.Open, Execute, conn.Close, next command.

But in your case there are many commands, one after another. In this situation you can do it like this:
// use the same connection and command object for all your statements
using (MySqlConnection conn = new MySqlConnection(connectionString)) {
    MySqlCommand cmd = new MySqlCommand();
    cmd.Connection = conn;
    conn.Open();
    // now read the file as before
    ...
      while (...)
        if (... == "|") {
          // change the CommandText only
          cmd.CommandText = commands_script;
          cmd.ExecuteNonQuery();
          commands_script = String.Empty;
        } else ...
      ...
    // conn.Close() can be omitted because of the using statement
}

By the way, you should also add try-catch blocks.

Juergen

PS. I agree with maro158's hint: use a StringBuilder instead of string concatenation.


Last edited by JüTho on Thu 29.11.07 13:04, edited 1 time in total
maro158
Posts: 35

Posted: Thu 29.11.07 12:21
If your script file is really large, consider using a StringBuilder together with a StreamReader to read in your scripts. You could then try to parallelize the execution of the scripts, but only if the statements would not place locks on database objects that block a parallel task for too long. Using a pool of connections would also be a good idea.

Minor performance gains could come from replacing == and != with string.CompareOrdinal(). Also, you are doing string concatenation in a while loop; since strings are immutable, you create a new string every time you do that. The same goes for commands_script = "", which produces a new empty string.

I'm afraid there is no standard way of optimizing code, but if you use a profiler you can better understand the hot spots in your code.
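A minimal sketch of the StringBuilder idea applied to the loop from the first post (untested; it assumes the same txtPath and connectionString members and the MySql.Data.MySqlClient connector as in the original code):

```csharp
// needs: using System.IO; using System.Text; using MySql.Data.MySqlClient;
// Sketch only: same logic as Uploade_file, but the script text is
// accumulated in a StringBuilder instead of repeated string concatenation.
public void UploadFile()
{
    StringBuilder script = new StringBuilder();
    using (StreamReader tr = new StreamReader(txtPath.Text))
    using (MySqlConnection conn = new MySqlConnection(connectionString))
    {
        conn.Open();
        MySqlCommand cmd = new MySqlCommand();
        cmd.Connection = conn;

        string input;
        while ((input = tr.ReadLine()) != null)
        {
            if (input == "|")
            {
                if (script.Length > 0)
                {
                    cmd.CommandText = script.ToString();
                    cmd.ExecuteNonQuery();
                    script.Length = 0; // reset without allocating a new buffer
                }
            }
            else
            {
                script.Append(input);
            }
        }
    }
}
```

This also folds in Juergen's suggestion of opening the connection once, outside the loop.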
sokrates (thread starter)
Hält's aus hier
Posts: 4

Posted: Thu 29.11.07 18:35
The files I am uploading are around 20 GB to 100 GB, and the scripts in the file have 250,000 statements each. For example, an

INSERT INTO bla. bla. bla VALUES
(bla, bla, bla),
(bla, bla, bla),
(bla, bla, bla),
(bla, bla, bla),
(bla, bla, bla),
(bla, bla, bla);


has 250,000 values to insert. ;)
I will try moving the database connection outside the loop and using a different kind of string. But isn't there any way to speed up the reading part? :)
JüTho
Posts: 2021
Thanks received: 6

Win XP Prof
C# 2.0 (#D for .NET 2.0, plus Firebird); previously Delphi 5 and Delphi 2005 Pro
Posted: Thu 29.11.07 20:11
Oops... In this case I don't have any really usable idea. The only way I can see is to create a DataTable, insert the VALUES into DataRows, and transfer a package of rows (roughly 1000) with a single ExecuteNonQuery. But the main problem remains: the many ReadLine calls on the file.

Perhaps you can do better with the StreamReader.Read(Char[], Int32, Int32) method or with Regex, but I have my doubts. Juergen

PS. It would have been better if you had given all the important information in the first post.
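Juergen's "package of rows" idea could also be sketched without a DataTable, by batching plain statements into one CommandText. This is only a hypothetical outline: it assumes Connector/NET accepts several ;-separated statements per command, that cmd is a MySqlCommand with an open connection, and that statements yields one SQL statement per script.

```csharp
// Hypothetical sketch: execute the statements in packages of ~1000
// instead of one ExecuteNonQuery per statement.
StringBuilder batch = new StringBuilder();
int count = 0;

foreach (string statement in statements) // 'statements' is assumed, not from the post
{
    batch.Append(statement).Append(';');
    if (++count == 1000)
    {
        cmd.CommandText = batch.ToString();
        cmd.ExecuteNonQuery();
        batch.Length = 0;
        count = 0;
    }
}
if (batch.Length > 0) // flush the last, smaller package
{
    cmd.CommandText = batch.ToString();
    cmd.ExecuteNonQuery();
}
```

Each round trip to the server then carries ~1000 statements instead of one, which is usually where the time goes.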
sokrates (thread starter)
Hält's aus hier
Posts: 4

Posted: Thu 29.11.07 21:46
Sorry, I didn't think that was so important. Sorry :(
JüTho
Posts: 2021
Thanks received: 6

Win XP Prof
C# 2.0 (#D for .NET 2.0, plus Firebird); previously Delphi 5 and Delphi 2005 Pro
Posted: Fri 30.11.07 10:45
sokrates wrote:
Sorry, I didn't think that was so important. Sorry :(

[OT] If you want to make a job faster, it always matters how much data has to be moved. Juergen
maro158
Posts: 35

Posted: Fri 30.11.07 16:47
sokrates wrote:
The files I am uploading are around 20 GB to 100 GB, and the scripts in the file have 250,000 statements each.


You're serious? Don't panic! Just don't use strings at all. Use a char[] buffer of, say, 1024 characters to sequentially scan through your stream, adding character after character until you reach "|". When advancing to the next character, check whether the char[] buffer is still large enough to accommodate it, and if not, resize it by a predefined chunk size (use Array.Copy for this). Finally, when "|" is reached, use the char[] buffer to execute your query, keeping memory usage low. Then reset the char[] buffer and start again until the end of the stream. Hope this helps.
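A rough sketch of this char[] scanning approach, under the assumptions that cmd is a MySqlCommand with an open connection (as in the earlier posts) and that '|' appears only as the separator:

```csharp
// Sketch: read the stream in 1024-char chunks, grow the script buffer
// in fixed-size steps, and execute whenever the '|' separator is reached.
char[] readBuf = new char[1024];
char[] script = new char[1024];
int scriptLen = 0;
int read;

using (StreamReader sr = new StreamReader(txtPath.Text))
{
    while ((read = sr.Read(readBuf, 0, readBuf.Length)) > 0)
    {
        for (int i = 0; i < read; i++)
        {
            if (readBuf[i] == '|')
            {
                if (scriptLen > 0)
                {
                    cmd.CommandText = new string(script, 0, scriptLen);
                    cmd.ExecuteNonQuery();
                    scriptLen = 0; // reset the length, keep the buffer
                }
            }
            else
            {
                if (scriptLen == script.Length)
                {
                    // grow by a predefined chunk size, as suggested above
                    char[] bigger = new char[script.Length + 4096];
                    Array.Copy(script, bigger, scriptLen);
                    script = bigger;
                }
                script[scriptLen++] = readBuf[i];
            }
        }
    }
}
```

Unlike the ReadLine version, this touches each character exactly once and builds only one string per script, when the query is actually executed.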
sokrates (thread starter)
Hält's aus hier
Posts: 4

Posted: Sun 02.12.07 22:35
Great idea, I will check it out at once :)