| Author | Message |
   
Conquas
Side Hero Username: Conquas
Post Number: 5353 Registered: 11-2011 Posted From: 99.82.251.78
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 06:15 pm: |
    |
Acf:
Acf:
bro, you can write a package in SSIS... you can check/update data using web service calls in SSIS... all opinions expressed here are mine.. |
   
Twitter
Megastar Username: Twitter
Post Number: 23013 Registered: 10-2009 Posted From: 151.191.175.208
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 05:57 pm: |
    |
Acf: No and No
if no and no, then better do it on the DB side if you can; if you have some rules that cannot be done on the DB side, then there is no other way but to do it using generic lists / #tables. I differ with the idea of executing small chunks of data on different threads and joining them into a big chunk, because it results in the same big chunk at the end, besides the complication of the multithreading heck. |
   
Parugu
Junior Artist Username: Parugu
Post Number: 238 Registered: 07-2012 Posted From: 192.234.2.25
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 05:53 pm: |
    |
How do you write Java multithreading? Please explain with an example. |
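A minimal Java sketch of the kind of multithreading being asked about: fan the work out to a fixed thread pool and collect the results through Futures. The square-the-ID workload and all names are illustrative stand-ins, not anyone's real code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Minimal multithreading sketch: submit one task per member ID to a fixed
// thread pool, then gather the results through Futures.
public class MultithreadExample {

    // Stand-in for real per-record work (e.g. a web service call): square the ID.
    static int process(int memberId) {
        return memberId * memberId;
    }

    // Process every ID on nThreads worker threads; return the summed results.
    static long processAll(List<Integer> ids, int nThreads) {
        ExecutorService pool = Executors.newFixedThreadPool(nThreads);
        List<Future<Integer>> futures = new ArrayList<>();
        for (int id : ids) {
            futures.add(pool.submit(() -> process(id)));
        }
        long total = 0;
        try {
            for (Future<Integer> f : futures) {
                total += f.get();            // blocks until that task is done
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
        return total;
    }

    public static void main(String[] args) {
        // 1 + 4 + 9 + 16 + 25 = 55
        System.out.println(processAll(List.of(1, 2, 3, 4, 5), 2));
    }
}
```

The pool size bounds concurrency, which matters if each task is a web service call: too many threads just hammer the remote server.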
   
Parugu
Junior Artist Username: Parugu
Post Number: 237 Registered: 07-2012 Posted From: 192.234.2.25
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 05:43 pm: |
    |
Unless you assume I'm saying this with only half a mind: whoever uses big data is all set. |
   
Gotcha
Side Hero Username: Gotcha
Post Number: 9358 Registered: 02-2008 Posted From: 99.189.219.190
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 05:41 pm: |
    |
Store it in a hash table or dictionary, performance-wise. This real estate is for sale. |
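The hash-table/dictionary idea above, sketched in Java: key the member details by MemberID so each lookup is O(1) on average instead of scanning a list every time. All names and data here are illustrative:

```java
import java.util.HashMap;
import java.util.Map;

// The dictionary idea: key member details by MemberID so each lookup
// is O(1) on average instead of scanning a list every time.
public class MemberCache {
    private final Map<String, String> byId = new HashMap<>();

    public void put(String memberId, String details) { byId.put(memberId, details); }

    public String lookup(String memberId) { return byId.get(memberId); }

    public static void main(String[] args) {
        MemberCache cache = new MemberCache();
        cache.put("M001", "John Doe");
        cache.put("M002", "Jane Roe");
        System.out.println(cache.lookup("M002"));   // Jane Roe
    }
}
```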
   
Parugu
Junior Artist Username: Parugu
Post Number: 236 Registered: 07-2012 Posted From: 192.234.2.25
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 05:41 pm: |
    |
Writing the code even for a single transaction is a lot of work; now think of millions... |
   
Driverramudu
Side Hero Username: Driverramudu
Post Number: 8593 Registered: 02-2009 Posted From: 128.103.187.124
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 05:14 pm: |
    |
Netsaint:
Database problem is you cannot make use of multiple systems to process and store data back into the database, whereas through the application you can make use of multiple systems, each getting a chunk of data and dumping it easily into the same database: less time, more records. The application is more flexible than the database. Life is Race. I am in. Driving is my PASSION. Chakkani car undi, pakkana pilla undi (a fine car, and a girl beside it) |
   
Netsaint
Side Hero Username: Netsaint
Post Number: 6498 Registered: 05-2008 Posted From: 166.147.123.24
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 05:10 pm: |
    |
Acf, we invoke an SSIS job from the frontend. The user just clicks in the web app; the processing appears to happen on the web page but is actually done by invoking the job. All the forecasting/reporting calculations are performed on huge data. We run this for 3 hours; finally, when done, we display a message. My Telugu Bhakthi Blog: http://gurugeetha.blogspot.com/ |
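The trigger-from-UI pattern described above can be sketched in Java: the click handler just submits the long-running job to a background executor and returns, and the page polls a status flag until completion. All names are assumptions; the real job body would be the SSIS invocation:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.atomic.AtomicReference;

// Sketch of "UI only triggers the job": the click handler submits the
// long-running work to a background executor and returns immediately;
// the page then polls getStatus() until it sees DONE.
public class JobRunner {
    public enum Status { IDLE, RUNNING, DONE }

    private final ExecutorService pool = Executors.newSingleThreadExecutor();
    private final AtomicReference<Status> status = new AtomicReference<>(Status.IDLE);
    private Future<?> current;

    // Called from the UI: start the job and return right away.
    public synchronized void trigger(Runnable job) {
        status.set(Status.RUNNING);
        current = pool.submit(() -> {
            job.run();                    // e.g. invoke the SSIS package here
            status.set(Status.DONE);
        });
    }

    public Status getStatus() { return status.get(); }

    // Block until the job finishes (handy for tests; a web page would poll).
    public synchronized void awaitCompletion() {
        try {
            if (current != null) current.get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public void shutdown() { pool.shutdown(); }

    public static void main(String[] args) {
        JobRunner r = new JobRunner();
        r.trigger(() -> System.out.println("crunching the big file..."));
        r.awaitCompletion();
        System.out.println("status: " + r.getStatus());
        r.shutdown();
    }
}
```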
   
Driverramudu
Side Hero Username: Driverramudu
Post Number: 8592 Registered: 02-2009 Posted From: 128.103.187.124
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 05:08 pm: |
    |
Cocanada:
If you use a temp table, then tempdb grows huge, all the more so for millions of records. It's always better to bring the SQL Server object model directly into C# and make use of SQL Server's bulk copy concept. |
   
Driverramudu
Side Hero Username: Driverramudu
Post Number: 8591 Registered: 02-2009 Posted From: 128.103.187.124
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 05:05 pm: |
    |
Acf:
If I were you, what I would suggest is to use SqlBulkCopy. You can instantiate that data model from C#, much as we do with Office applications, then establish a connection with your SQL Server 2008 as the target database, take the web service data (or anything else) as the source, and map columns from source to target. That's it, you are all set. We converted millions of records like that recently. Hope and wish this helps you, bro. |
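SqlBulkCopy is a .NET API, but the core idea, buffering rows and flushing them to the target in batches instead of one round trip per row, can be sketched generically in Java. The callback sink here stands in for the actual bulk INSERT; all names are assumptions:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// The idea behind bulk copy: buffer rows in memory and hand them to the sink
// in batches of batchSize instead of one round trip per row. The sink is a
// callback so the sketch stays database-free; real code would execute a
// JDBC batch or SqlBulkCopy.WriteToServer there.
public class BulkWriter<T> {
    private final int batchSize;
    private final Consumer<List<T>> sink;
    private final List<T> buffer = new ArrayList<>();

    public BulkWriter(int batchSize, Consumer<List<T>> sink) {
        this.batchSize = batchSize;
        this.sink = sink;
    }

    public void add(T row) {
        buffer.add(row);
        if (buffer.size() >= batchSize) flush();
    }

    // Push whatever is buffered; call once more at the end of the load.
    public void flush() {
        if (!buffer.isEmpty()) {
            sink.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }

    public static void main(String[] args) {
        BulkWriter<String> w = new BulkWriter<>(2,
                batch -> System.out.println("flushing " + batch.size() + " rows"));
        w.add("a"); w.add("b"); w.add("c");
        w.flush();
    }
}
```

Picking a sensible batch size (thousands of rows, not one and not millions) is the whole trade-off this thread is circling around.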
   
Cocanada
Legend Username: Cocanada
Post Number: 38533 Registered: 01-2008 Posted From: 168.244.164.254
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 04:28 pm: |
    |
Goonda: I don't think a temp table is needed... I was thinking it would be fine to add a column to the main table itself and run the process you described
if the job has to read and update the same table, it will be really slow. That's why I said a new table is needed. |
   
Acf
Junior Artist Username: Acf
Post Number: 532 Registered: 01-2008 Posted From: 69.74.106.197
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 02:26 pm: |
    |
Twitter: oh my mad, why do you need additional member details?
we need them because that data is not available in the .txt file...
Twitter: is there any human intervention needed here? are you serving the data to the client machine?
No and No |
   
Acf
Junior Artist Username: Acf
Post Number: 531 Registered: 01-2008 Posted From: 69.74.106.197
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 02:21 pm: |
    |
One: better explain this to the client... making millions of calls in seconds from the same server is a huge risk... for me, a security issue
they are aware of this... but no plans to modify it as of now...
Khandada: as it is, we do it this way across all our web service clients... select top 1000 * from member where data_updated_yn = 0, make WS call, update member set data_updated_yn = 1
Yes, I had the same idea of using flags to set the status... my question is actually more about how to retrieve the data... 1. if we get n records at a time, then to process all records I need to make several database calls from my application, which is expensive... maybe I need to use more threads... 2. if we get the entire data at once, then the application takes huge memory... |
   
One
Hero Username: One
Post Number: 16009 Registered: 09-2008 Posted From: 31.211.154.169
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 02:13 pm: |
    |
Acf: we don't have control over the WS, and the WS doesn't have access to our DB. The web service can only take a MemberID... it won't take an array list or custom object
better explain this to the client... making millions of calls in seconds from the same server is a huge risk... for me, a security issue. If there is no option on this, then go with a batch process. |
   
Acf
Junior Artist Username: Acf
Post Number: 530 Registered: 01-2008 Posted From: 69.74.106.197
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 02:07 pm: |
    |
One: Multiple WS calls? Why burden the poor network, dear... create objects, put them in an ArrayList, pass them to the WS... get them back and process them
we don't have control over the WS, and the WS doesn't have access to our DB. The web service can only take a MemberID... it won't take an array list or a custom object. |
   
Khandada
Side Hero Username: Khandada
Post Number: 4322 Registered: 10-2008 Posted From: 64.79.135.151
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 02:07 pm: |
    |
Goonda: adding a column to the main table itself and running the process you described should be fine, I think
as it is, we do it this way across all our web service clients... select top 1000 * from member where data_updated_yn = 0, make the WS call, update member set data_updated_yn = 1. ---- If you want to account for WS call failures, add an update_error column; if you encounter an error during the WS call for some records, update member set update_error = 'msg'. This way your client program will not get stuck on the error record(s) and will skip to the next available one instead of going into a loop. You can schedule a SQL job to run every hour or so in the background to retry the error records: update member set update_error = null, data_updated_yn = 0 where data_updated_yn = 1 and update_error is not null. Also, the client program has to have error handling if it is going to die on failed network calls, lost DB connections, etc. Humpty Dumpty sat on a wall, Humpty Dumpty had a great fall |
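The flag-driven loop described above, sketched as an in-memory Java simulation: pick a batch of pending members, call the (hypothetical) web service for each, mark success or record the error and move on, so one bad record never blocks the rest. A real version would run the SELECT/UPDATE statements above against the member table instead of a map:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// In-memory simulation of the flag-driven batch loop: process up to
// batchSize pending members per pass, mark "OK" on success, record the
// error message on failure and keep going.
public class FlagBatchProcessor {
    // memberId -> null = pending, "OK" = done, anything else = error message
    public final Map<String, String> status = new LinkedHashMap<>();

    public void load(List<String> memberIds) {
        for (String id : memberIds) status.put(id, null);
    }

    // One pass over up to batchSize pending members; returns how many it touched.
    public int runBatch(int batchSize, Function<String, String> wsCall) {
        int handled = 0;
        for (Map.Entry<String, String> e : status.entrySet()) {
            if (handled == batchSize) break;
            if (e.getValue() != null) continue;     // done or errored already
            try {
                wsCall.apply(e.getKey());           // simulated WS round trip
                e.setValue("OK");
            } catch (RuntimeException ex) {
                e.setValue(ex.getMessage());        // log the error, move on
            }
            handled++;
        }
        return handled;
    }

    public static void main(String[] args) {
        FlagBatchProcessor p = new FlagBatchProcessor();
        p.load(List.of("M001", "M002", "M003"));
        p.runBatch(10, id -> "details-for-" + id);
        System.out.println(p.status);   // {M001=OK, M002=OK, M003=OK}
    }
}
```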
   
Twitter
Megastar Username: Twitter
Post Number: 23004 Registered: 10-2009 Posted From: 151.191.175.208
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 02:06 pm: |
    |
Acf: To retrieve that data, for simplicity, we need to retrieve all MemberIDs and pass each MemberID to the web service (and worst of all, we have to call the web service a million times, passing one MemberID at a time) to get additional member details.
oh my mad, why do you need additional member details? is there any human intervention needed here? are you serving the data to the client machine? |
   
Acf
Junior Artist Username: Acf
Post Number: 529 Registered: 01-2008 Posted From: 69.74.106.197
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 02:05 pm: |
    |
Jaankayalu_chips:
if I'm loading that data into some C# object, I can take care of that transaction within my object itself.
Goonda: iterate and get 100 records..
is this iterating through the list that contains the primary keys? You mean to say: get 100 primary keys from the list, build a SQL string, and call the database to get all the data for those 100 records, right? If we do it this way, we have to make several calls to the database, which is expensive, right?
Khandada: you need to account for the WS call... for each memberID... that is a network round trip (including a DB call on the service side too)
yes, we can't change that... that's how the web service is set up. |
   
One
Hero Username: One
Post Number: 16008 Registered: 09-2008 Posted From: 31.211.154.169
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 02:04 pm: |
    |
Jaankayalu_chips:so, you should have a processing indicator. you might need a trigger table to select the rows you want to process and mark them as pending. first job will populate the trigger table
+1 |
   
One
Hero Username: One
Post Number: 16007 Registered: 09-2008 Posted From: 31.211.154.169
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 02:00 pm: |
    |
Acf: To retrieve that data, for simplicity, we need to retrieve all MemberIDs and pass each MemberID to the web service (and worst of all, we have to call the web service a million times, passing one MemberID at a time) to get additional member details.
Multiple WS calls? Why burden the poor network, dear... create objects, put them in an ArrayList, pass them to the WS... get them back and process them. One more thing: when you get data from the DB, the best way is to get it from a stored procedure... convert the data into a hash map / collections / Map... then process the data... If there is nothing much in C# other than validation, do it in the DB itself. |
   
Aha
Side Hero Username: Aha
Post Number: 2023 Registered: 01-2011 Posted From: 8.19.13.19
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:58 pm: |
    |
Jaankayalu_chips:you can create a temp table with the primary key of Members + this new column which you populate with data from the web service call
+1 har ik gham tumhaara sahenge khushi se, karenge na shikwaa kabhi bhi kisi se (every sorrow of yours we will bear gladly; we will never complain to anyone) |
   
Khandada
Side Hero Username: Khandada
Post Number: 4321 Registered: 10-2008 Posted From: 64.79.135.151
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:58 pm: |
    |
Acf:
you need to account for the WS call... for each memberID... that is a network round trip (including a DB call on the service side too). If you can allow a "processing window" for the data update to be completed after the job is triggered from the UI, then you can keep it simple enough to break it down into chunks of 100 or 1000. |
   
Goonda
Megastar Username: Goonda
Post Number: 21864 Registered: 02-2007 Posted From: 199.82.243.103
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:57 pm: |
    |
Jaankayalu_chips:dude..you have to consider one more scenario. If the job aborts in the middle for any reason, you shouldnt process the records that you already processed. so, you should have a processing indicator. you might need a trigger table to select the rows you want to process and mark them as pending. first job will populate the trigger table your second job will pick the record from trigger table, make web service call and populate another table, mark the the record in trigger table as processed Goonda annai...pls validate
I don't think a temp table is needed... I was thinking it would be fine to add a column to the main table itself and run the process you described. Sasibabu: If TDP loses next elechens, i will donate 10% of my salary to TDP Skywalker: Bala chiru type kadu.....narasimha swamy avataram etti posani gadi pegulu medalo esukuntadu |
   
Jaankayalu_chips
Junior Artist Username: Jaankayalu_chips
Post Number: 893 Registered: 03-2012 Posted From: 168.244.164.254
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:56 pm: |
    |
Acf:first problem is loading and working on that data
dude... you have to consider one more scenario. If the job aborts in the middle for any reason, you shouldn't reprocess the records that were already processed. So you should have a processing indicator. You might need a trigger table to select the rows you want to process and mark them as pending. The first job will populate the trigger table; your second job will pick a record from the trigger table, make the web service call, populate another table, and mark the record in the trigger table as processed. Goonda bro... pls validate. Don't panic... I checked the chart |
   
Goonda
Megastar Username: Goonda
Post Number: 21863 Registered: 02-2007 Posted From: 199.82.243.103
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:54 pm: |
    |
Acf:not sure I understood this point... is this for pushing the data back to the DB?
iterate and get 100 records.. make a web service call and update back to the DB |
   
Goonda
Megastar Username: Goonda
Post Number: 21862 Registered: 02-2007 Posted From: 199.82.243.103
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:53 pm: |
    |
Acf:you mean million records of primary/unique keys at once?
yes. This means a smaller memory footprint. |
   
Acf
Junior Artist Username: Acf
Post Number: 528 Registered: 01-2008 Posted From: 69.74.106.197
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:52 pm: |
    |
Goonda:another way of doing is. get the primary/unique keys and store them in a list/map.
you mean million records of primary/unique keys at once?
Goonda: iterate through the list/map and build a SQL query using the primary/unique keys in where clause and process them.
not sure I understood this point... is this for pushing the data back to the DB? |
   
Acf
Junior Artist Username: Acf
Post Number: 527 Registered: 01-2008 Posted From: 69.74.106.197
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:49 pm: |
    |
Jaankayalu_chips: reading and writing to the same table? oh my mad, there will be write locks all the time and the job will run dead slow. You can create a temp table with the primary key of Members plus this new column, which you populate with data from the web service call
yes, that's my last headache... the first problem is loading and working on that data... |
   
Acf
Junior Artist Username: Acf
Post Number: 526 Registered: 01-2008 Posted From: 69.74.106.197
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:47 pm: |
    |
Twitter:then you have a filter here , for just a memberid you are expecting m's of data?
yes, there may be a million members... through the .txt file we get maybe a million member records with these columns: MemberID, MemberFirstName, MemberLastName, and some other member-related columns. To retrieve that data, for simplicity, we need to retrieve all MemberIDs and pass each MemberID to the web service (and worst of all, we have to call the web service a million times, passing one MemberID at a time) to get additional member details. |
   
Batthar_bindaas
Hero Username: Batthar_bindaas
Post Number: 12030 Registered: 12-2006 Posted From: 208.44.237.126
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:43 pm: |
    |
Acf:
 |
   
Jaankayalu_chips
Junior Artist Username: Jaankayalu_chips
Post Number: 891 Registered: 03-2012 Posted From: 168.244.164.254
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:43 pm: |
    |
Acf:
reading and writing to the same table? oh my mad, there will be write locks all the time and the job will run dead slow. You can create a temp table with the primary key of Members plus this new column, which you populate with data from the web service call. |
   
Acf
Junior Artist Username: Acf
Post Number: 525 Registered: 01-2008 Posted From: 69.74.106.197
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:42 pm: |
    |
Jaankayalu_chips:2. Usually, we read 100 records at a time and process them.
if we read 100 or, say, "n" records at a time, then there will be a lot of calls to the database, which makes the system very slow, right?
Twitter: if you have data paging requirements, like displaying data in a grid on a web page, then you can limit the output in the SQL query so that you load only a few pages first and load more as and when necessary, i.e. on-demand load.
I don't have to display anything on the UI. With this on-demand load, you have to make several calls to the database, right? How is the performance? So basically I need to figure out one of these approaches: 1. get small chunks of data at a time and process them (involves several calls to the database to process all records); 2. get all records at once, store them in memory, and push them back to the DB once I'm done. Not sure which would be appropriate... or looking for better ways... |
   
Goonda
Megastar Username: Goonda
Post Number: 21861 Registered: 02-2007 Posted From: 199.82.243.103
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:40 pm: |
    |
Acf: ok, to make it more clear, here is the actual process... (the full three-step description is quoted in the original post below)
follow the method I described... that way there is no overhead for your application. |
   
Twitter
Megastar Username: Twitter
Post Number: 23003 Registered: 10-2009 Posted From: 151.191.175.208
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:39 pm: |
    |
Acf: 2. Load the data from the "Members" table into a C# program and call a web service, passing the Member ID, to get other member-related information. We don't get this information in the tab-delimited file; we have to call the web service to get this data, no other way.
then you have a filter here; for just a memberid you are expecting millions of rows? |
   
Goonda
Megastar Username: Goonda
Post Number: 21857 Registered: 02-2007 Posted From: 199.82.243.103
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:34 pm: |
    |
Another way of doing it: get the primary/unique keys and store them in a list/map, then iterate through the list/map, build a SQL query using the primary/unique keys in the where clause, and process them. |
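A Java sketch of this key-list approach: keep only the primary keys in memory, split them into fixed-size chunks, and build one query per chunk. Table and column names are illustrative, and real code should bind parameters rather than concatenate strings:

```java
import java.util.ArrayList;
import java.util.List;

// Hold only the primary keys in memory, split them into fixed-size chunks,
// and build one SELECT per chunk instead of one call per record.
public class KeyChunker {

    // Split keys into consecutive chunks of at most chunkSize.
    public static List<List<String>> chunk(List<String> keys, int chunkSize) {
        List<List<String>> chunks = new ArrayList<>();
        for (int i = 0; i < keys.size(); i += chunkSize) {
            chunks.add(keys.subList(i, Math.min(i + chunkSize, keys.size())));
        }
        return chunks;
    }

    // One query per chunk. Table/column names are illustrative; production
    // code should use bound parameters instead of string concatenation.
    public static String buildQuery(List<String> chunk) {
        return "SELECT * FROM Members WHERE MemberID IN ('"
                + String.join("','", chunk) + "')";
    }

    public static void main(String[] args) {
        for (List<String> c : chunk(List.of("1", "2", "3", "4", "5"), 2)) {
            System.out.println(buildQuery(c));
        }
    }
}
```

A million keys fit comfortably in memory while a million full rows usually do not, which is the point of this approach.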
   
Acf
Junior Artist Username: Acf
Post Number: 524 Registered: 01-2008 Posted From: 69.74.106.197
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:34 pm: |
    |
ok, to make it more clear, here is the actual process... The requirement is that there should be some UI to trigger all the following processes. All these processes are sequential and may not be completed all at once, because there is a manual process involved between them. 1. There is a SQL Server job that takes a tab-delimited txt file and uploads the data to a SQL Server table called "Members"; from the UI we trigger the SQL Server job. 2. Load the data from the "Members" table into a C# program and call a web service, passing the Member ID, to get other member-related information. We don't get this information in the tab-delimited file; we have to call the web service to get this data, no other way. 3. There is no real manipulation of the data here; the only thing is we have to update the data we got from the web service back into the Members table. All these processes can be done in the background, but we need the UI only to trigger these events. Let me know if it is not clear. |
   
Twitter
Megastar Username: Twitter
Post Number: 23002 Registered: 10-2009 Posted From: 151.191.175.208
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:34 pm: |
    |
Acf:
bottom line: getting millions of records at once into memory is not recommended at all, whatever object you choose, unless the requirement is just reading from the DB and writing into files on the server itself (no round trips on the wire). |
   
Jaankayalu_chips
Junior Artist Username: Jaankayalu_chips
Post Number: 889 Registered: 03-2012 Posted From: 168.244.164.254
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:33 pm: |
    |
* I am assuming that processing all the million records is not part of one transaction. A few suggestions: 1. Consider your transaction requirements and divide the job. 2. Usually, we read 100 records at a time and process them. 3. Run your job in multiple threads. 4. Add a processing indicator column to your trigger table: pending, processed, failed. |
   
Twitter
Megastar Username: Twitter
Post Number: 23001 Registered: 10-2009 Posted From: 151.191.175.208
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:30 pm: |
    |
Acf: What is the best approach in this scenario: 1. Make a call to the SQL Server DB and get a dataset with millions of records. 2. Work on the data, including some web service calls. 3. Push the data back to SQL Server. If we get all million records at once into the application and store them in memory using either a DataSet or a generic collection (which one is better, DataSet or generic collection?), I know it takes huge memory...
I would suggest going with generic collections instead of DataSets. If you have data paging requirements, like displaying data in a grid on a web page, then you can limit the output in the SQL query so that you load only a few pages first and load more as and when necessary, i.e. on-demand loading. We did this in a previous project; it was ASP.NET and C# with Oracle. |
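The on-demand paging idea, sketched in Java over an in-memory list: compute just the rows for the requested page. In SQL Server the same idea is expressed with OFFSET ... FETCH (2012 and later) or a ROW_NUMBER() window function (2008); the names below are illustrative:

```java
import java.util.List;

// On-demand paging over an in-memory list: return only the rows belonging
// to the 1-based page pageNum.
public class Pager {

    public static <T> List<T> page(List<T> rows, int pageSize, int pageNum) {
        int from = (pageNum - 1) * pageSize;
        if (from >= rows.size()) return List.of();
        return rows.subList(from, Math.min(from + pageSize, rows.size()));
    }

    public static void main(String[] args) {
        List<Integer> rows = List.of(10, 20, 30, 40, 50);
        System.out.println(page(rows, 2, 3));   // [50]
    }
}
```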
   
Dma
Side Hero Username: Dma
Post Number: 6874 Registered: 11-2009 Posted From: 70.184.123.211
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:23 pm: |
    |
What do you mean by millions? Are you aggregating and creating reports? The result should not be that bad. Processing data from an uploaded file? That has to be designed to be processed in the backend. Creating an Excel dump? Is this static data, or dynamic based on parameters? If it is static, create an Excel file every night and provide them a link. If not, be careful about how you dump the data; if you do not do it right, you could cause the system to crash with out-of-memory problems. Another design consideration for the Excel dump: the legacy .xls format can only handle 64K (65,536) rows. |
   
Jalsa
Hero Username: Jalsa
Post Number: 16456 Registered: 02-2008 Posted From: 159.53.174.143
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:23 pm: |
    |
Goonda:
brother, i don't think he is talking about batch process (backend processing). if it's a back-end process, then ETL is a good fit. |
   
Emc2
Side Hero Username: Emc2
Post Number: 8835 Registered: 03-2008 Posted From: 71.246.229.243
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:22 pm: |
    |
Acf:
there is no simple solution... minimize the data sets and pump them on demand to the UI... creating temp tables on the fly and accessing the final data sets from those tables is another approach... you have to try multiple options like this. Cherapakura chedevu (don't spoil others, lest you be spoiled).
|
   
Jalsa
Hero Username: Jalsa
Post Number: 16455 Registered: 02-2008 Posted From: 159.53.174.143
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:21 pm: |
    |
Goonda:ETL
can you explain how is this a good fit in this scenario? |
   
Goonda
Megastar Username: Goonda
Post Number: 21852 Registered: 02-2007 Posted From: 199.82.243.103
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:21 pm: |
    |
I am guessing this is a batch process. I don't think a programming language alone will be helpful; it's better to use ETL tools like Informatica or DataStage. |
   
Acf
Junior Artist Username: Acf
Post Number: 523 Registered: 01-2008 Posted From: 69.74.106.197
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:20 pm: |
    |
Life_sucks: i don't think you'll display a million records on a page.. use pagination? i don't know MS technologies.
my mistake, I should have put this info in my initial post. I don't have to display all those records; all the manipulation will be in the background of a web app |
   
Jalsa
Hero Username: Jalsa
Post Number: 16454 Registered: 02-2008 Posted From: 159.53.174.143
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:20 pm: |
    |
I am assuming you need to show these records on a web page. Millions of records in web-server memory is not a good idea, unless you want it to act like a cache. Assume multiple users are running this query. How about changing the query to retrieve "n" records at a time and paginating accordingly on the UI? |
   
Life_sucks
Side Hero Username: Life_sucks
Post Number: 2722 Registered: 05-2008 Posted From: 168.230.130.248
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:18 pm: |
    |
I don't think you'll display a million records on a page.. use pagination? I don't know MS technologies. |
   
Acf
Junior Artist Username: Acf
Post Number: 522 Registered: 01-2008 Posted From: 69.74.106.197
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:18 pm: |
    |
Emc2: what tools/software do you have? what is your current environment...
Visual Studio 2010, SQL Server 2008, that's all...
Goonda:ETL
hmm, no idea about ETL or its integration with a .NET application. Any example? |
   
Emc2
Side Hero Username: Emc2
Post Number: 8834 Registered: 03-2008 Posted From: 71.246.229.243
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:15 pm: |
    |
Acf:
what tools/software do you have? what is your current environment...
|
   
Goonda
Megastar Username: Goonda
Post Number: 21849 Registered: 02-2007 Posted From: 199.82.243.103
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:13 pm: |
    |
ETL |
   
Acf
Junior Artist Username: Acf
Post Number: 521 Registered: 01-2008 Posted From: 69.74.106.197
Rating: N/A Votes: 0 (Vote!) | | Posted on Monday, October 08, 2012 - 01:11 pm: |
    |
I need a better way of handling a huge amount of data (millions of records) in a C# web application. I've never worked on such huge data, so I need suggestions. What is the best approach in this scenario: 1. Make a call to the SQL Server DB and get a dataset with millions of records. 2. Work on the data, including some web service calls. 3. Push the data back to SQL Server. If we get all million records at once into the application and store them in memory using either a DataSet or a generic collection (which one is better, DataSet or generic collection?), I know it takes huge memory... is there any better approach? TIA |