


Creativity & Learning

This morning I found a way to detach SharePoint databases from SQL Server 2008 Express Edition and then attach them all to our new SQL Server 2008 R2 Enterprise Edition. But then I ran into a problem when re-attaching the databases to the SQL Server Express edition. SQL Server Management Studio returned the error shown below.

TITLE: Microsoft SQL Server Management Studio
------------------------------
Cannot show requested dialog.
------------------------------
ADDITIONAL INFORMATION:
Parameter name: nColIndex
Actual value was -1. (Microsoft.SqlServer.GridControl)
------------------------------
BUTTONS: OK
------------------------------

After hours of searching, I finally found an article explaining that this is a known SQL Server Management Studio bug, and that the workaround is to attach the database with Transact-SQL instead of the Attach dialog. Below is the T-SQL script I used to re-attach my SharePoint database; it worked very well.

EXEC sp_attach_db @dbname = N'Your_SharePoint_DBName',
    @filename1 = N'[drive]:\DB_Attach_Location\SharePointDB_File.mdf',
    @filename2 = N'[drive]:\DB_Attach_Location\SharePointDB_Log.LDF';

Example:

EXEC sp_attach_db @dbname = N'SharePoint_AdminContent_cc5d5fea-c432-4746-930e-cfc2138cffd3',
    @filename1 = N'E:\Data\MSSQL10.SHAREPOINT\MSSQL\DATA\SharePoint_AdminContent_cc5d5fea-c432-4746-930e-cfc2138cffd3.mdf',
    @filename2 = N'E:\Data\MSSQL10.SHAREPOINT\MSSQL\DATA\SharePoint_AdminContent_cc5d5fea-c432-4746-930e-cfc2138cffd3_log.LDF';

Once you run the query, you should see that the database has been attached successfully. Hope it helps. Thanks!

Comments:

- Had the same problem! Any clue on why this happens? Reattaching the DB solved all of them.

- Hi Mario, the problem was probably security and privileges: when I attached the database using the Domain Admin account, after adding the Domain Admin as dbcreator on the SharePoint instance, I had no problem at all. Try adding your account as dbcreator on the SharePoint database server: go to Security > Server Roles and add your account to the dbcreator role. (Mei)

- Thank you so much for the great help! Brilliant!

- Run Management Studio as administrator by right-clicking its icon in the Start menu and selecting 'Run as administrator'. You need to do this even if you are an administrator on the PC.

- Thank you. This was a huge help.

- Thanks to you! It is really helpful to me. :D

- Thanks a lot, Mei! (Xavier)

- Hello Mei, how can you attach the T-SQL script to the SharePoint database, please?

- I solved this problem with my SQLEXPRESS instance. You need to run SSMS as Administrator (even if your account is in the Administrators group, you must use the Administrator account directly) or connect to the database as the SA user (which was disabled on my PC). Then you can manage your SQL Server and the Attach option should work; you can also enable the SA account or add your login to the sysadmin role if you want full access to the server.

- Consider this if you are stupid like me :) Cause: old versions of SSMS (SQL Server Management Studio) are not forward compatible. With SSMS 2012 you can connect to SQL Server 2005/2008 databases because it is backward compatible, but it looks like Microsoft will not fix forward compatibility. Solution: upgrade all your client tools to the latest version and you won't miss any functionality or get error messages.

- Lovely find, sir. Really very helpful.
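Since the failure occurs in the SSMS Attach Databases dialog (the GridControl) rather than in the database engine, any client that submits the attach statement directly avoids the bug. The sketch below is not from the original post: it is a minimal C# console program that issues the attach through System.Data.SqlClient, using CREATE DATABASE ... FOR ATTACH, which replaces the deprecated sp_attach_db procedure. The instance name, database name, and file paths are placeholders.

using System;
using System.Data.SqlClient;

class AttachDatabase
{
    static void Main()
    {
        // Placeholder instance name; the account used still needs the
        // dbcreator (or sysadmin) server role, as the comments above note.
        const string connectionString =
            @"Data Source=.\SHAREPOINT;Integrated Security=True";

        // CREATE DATABASE ... FOR ATTACH is the supported replacement for
        // the deprecated sp_attach_db stored procedure.
        const string attachSql = @"
CREATE DATABASE [Your_SharePoint_DBName]
ON (FILENAME = N'E:\Data\MSSQL10.SHAREPOINT\MSSQL\DATA\SharePointDB_File.mdf'),
   (FILENAME = N'E:\Data\MSSQL10.SHAREPOINT\MSSQL\DATA\SharePointDB_Log.LDF')
FOR ATTACH;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(attachSql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
            Console.WriteLine("Database attached.");
        }
    }
}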



• Updated 8/27/2011 with an explanation for the long execution times with cloud execution or cloud data storage.

I described my first tests with LoadStorm's Web application load-testing tools against the OakLeaf Systems Azure Table Services Sample Project in the Load-Testing the OakLeaf Systems Azure Table Services Sample Project with up to 25 LoadStorm Users post of 11/18/2010. Those tests were sufficiently promising to warrant instrumenting the project with Windows Azure Diagnostics.

Users usually follow a predictable pattern when opening the project for the first time. Only programmers and software testers would find that process of interest; few would find it entertaining.

Setting Up and Running Load Tests

Before you can run a load test, you must add a LoadStorm.html page to your site's root folder or a <!-- LoadStorm ##### --> HTML comment (where ##### is a number assigned by LoadStorm) to the site's default page.

LoadStorm lets you emulate user actions by executing links with HTTP GET operations (clicking Next Page, then First Page) and by clicking the Count Customers, Delete All Customers, and Create Customers buttons in sequence with HTTP POST operations. Step 6 is required to set the form up for the Create Customers step.

The load test started with 5 users and increased to 25 users over 10 minutes. The analysis of the test run showed that four errors occurred at about 6 and 9 minutes. The errors are probably the result of failing to complete the preceding task before invoking the next GET or POST method.

Execution Time Summary

Comparing execution times in seconds for the individual operations across three operational configurations shows that the project running in the Azure production fabric takes a surprisingly long time to perform iterative CRUD operations on tables in Azure production storage, compared with the same project running in the Azure development fabric against Azure production storage. The production affinity group for compute and storage is USA South Central. I plan to use the load tests and a diagnostic-file analyzer to see whether I can determine the source of the problem.

• Update 8/27/2011: The long execution times for CRUD operations with cloud execution or cloud storage were subsequently greatly reduced by changing the statements that issue them so that CRUD operations execute in a single batch; a representative sketch follows this section. The current UI includes a Batch Updates check box to enable comparisons.

LoadStorm's CSV Test Reports

Clicking the load test page's Download As CSV button exports a CSV file that opens in Excel 2010. All errors were HTTP 500 (Internal Server Error) responses, which Windows Azure Diagnostics should report.
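The before-and-after statements referenced in the 8/27/2011 update did not survive as text, so the following is a minimal sketch, not the sample project's actual code, of how table-storage CRUD operations are typically batched with the Windows Azure StorageClient library of that era. The Customers table name, the CustomerEntity type, and the use of development storage are illustrative assumptions; entity group transactions require all entities in a call to share a partition key and allow at most 100 entities per batch.

using System;
using System.Data.Services.Client;            // SaveChangesOptions
using Microsoft.WindowsAzure;                 // CloudStorageAccount
using Microsoft.WindowsAzure.StorageClient;   // CloudTableClient, TableServiceEntity

// Illustrative entity; the sample project's real customer type may differ.
public class CustomerEntity : TableServiceEntity
{
    public CustomerEntity(string partitionKey, string rowKey)
        : base(partitionKey, rowKey) { }
    public string CompanyName { get; set; }
}

class BatchInsertSketch
{
    static void Main()
    {
        // Development storage shown for simplicity; production uses a real account.
        CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
        CloudTableClient tableClient = account.CreateCloudTableClient();
        tableClient.CreateTableIfNotExist("Customers");

        TableServiceContext context = tableClient.GetDataServiceContext();
        for (int i = 0; i < 91; i++)
        {
            // All entities share one partition key so they can be batched.
            context.AddObject("Customers",
                new CustomerEntity("Customers", i.ToString("D5"))
                {
                    CompanyName = "Company " + i
                });
        }

        // Before: context.SaveChangesWithRetries();  one round trip per entity.
        // After: a single entity group transaction for the whole set.
        context.SaveChangesWithRetries(SaveChangesOptions.Batch);
    }
}

Batching replaces dozens of HTTP round trips with one, which is the likely source of the speedup the update describes.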
Setting up Windows Azure Diagnostics

The code added to the SampleWebCloudService.sln project to establish Windows Azure Diagnostics (WAD) follows the Microsoft Systems Center team's Monitoring and Diagnostic Guidance for Windows Azure-hosted Applications white paper of 6/21/2010 by Brian Copps. The Web Role's Web.config file gains two added sections: one that enables WAD and one that enables tracing of failed IIS requests. A representative sketch of the WAD startup code appears at the end of this post.

Verifying WAD with Cerebrata's Azure Diagnostics Manager

You need a log-viewing application to verify that all the WAD log files are being generated by the code you add and by your Web.config modifications. Cerebrata Software Pvt. Ltd. introduced its Azure Diagnostics Manager (ADM) in September 2010 and offers a 30-day free trial.

The Cerebrata ADM Dashboard shows % Processor Time for a repeat of the preceding test on 11/21/2010, together with a list of the 15 IIS Failed Request Logs for the HTTP 500 errors. The Compact View of the first IIS Failed Request Log tells you more than you are likely to want to know about these failures. The updated source code also contains multiple trace-log entry instructions that generate Cerebrata Trace Log items, and ADM can export them to a CSV file for viewing in Excel 2010.

As time permits, I'll report in future posts on using WAD to troubleshoot what appear to be performance issues in the sample project when adding and updating customer records.

Posted by Roger Jennings (--rj) at 10:28 AM. Labels: Azure, Azure Diagnostics, Cerebrata, Cloud Computing, LoadStorm, Windows Azure, Windows Azure Diagnostics.
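The WAD setup code itself did not survive as text, so the following is a minimal sketch based on the standard Azure SDK 1.x diagnostics API rather than the project's actual listing. It shows the kind of WebRole.OnStart configuration the guidance white paper describes: scheduled transfer of trace logs, file-based logs such as IIS failed-request traces, and a % Processor Time performance counter; the transfer periods and sample rate are assumptions.

using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Start from the SDK defaults, then opt in to the data sources needed here.
        DiagnosticMonitorConfiguration config =
            DiagnosticMonitor.GetDefaultInitialConfiguration();

        // Trace logs written through DiagnosticMonitorTraceListener.
        config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
        config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

        // File-based logs, including IIS logs and failed-request traces.
        config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

        // CPU counter of the kind shown on the Cerebrata ADM dashboard.
        config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration
        {
            CounterSpecifier = @"\Processor(_Total)\% Processor Time",
            SampleRate = TimeSpan.FromSeconds(5)
        });
        config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

        // Persist the collected data to the storage account named in the
        // service configuration.
        DiagnosticMonitor.Start("DiagnosticsConnectionString", config);
        return base.OnStart();
    }
}

The Web.config sections the post mentions would pair this with a DiagnosticMonitorTraceListener entry under <system.diagnostics> and a <traceFailedRequests> rule under <system.webServer><tracing>.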

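The individual trace-log entry instructions were not preserved either; they are presumably ordinary System.Diagnostics calls. Here is a hypothetical example of the pattern, with a made-up method name and message:

using System;
using System.Diagnostics;

static class CustomerOperations
{
    // Hypothetical helper; the sample project's real handlers differ.
    public static void ReportCount(int count, TimeSpan elapsed)
    {
        // Entries written this way flow through DiagnosticMonitorTraceListener
        // into the WADLogsTable and appear as Trace Log items in Cerebrata ADM.
        Trace.TraceInformation(
            "Count Customers returned {0} rows in {1:F0} ms",
            count, elapsed.TotalMilliseconds);
    }
}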



