Sunday, December 20, 2009

Remove Raid metadata from drive under linux

Following on from my previous post: the disk was initially part of a RAID group on an HP Embedded SATA RAID controller, then it was removed from the controller and used in another PC.
Since the dmraid -x command doesn't work, I ended up using the following command to remove the RAID metadata (this will destroy the old data!):

[root@localhost liveuser]# dd if=/dev/zero bs=1M seek=150000 skip=150000 of=/dev/sdc

(This is a 160GB drive, so I decided to wipe roughly the last 10GB, which will definitely include the RAID metadata.)
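A more targeted variation of the same idea, sketched below on the assumption that /dev/sdc is the disk to clean — it works out the disk size and only wipes the last ~100MB, where the DDF metadata lives (this still destroys whatever sits in the wiped region):

SECTORS=$(blockdev --getsz /dev/sdc)   # total size in 512-byte sectors
START=$((SECTORS - 204800))            # start of the last 100MB (204800 x 512-byte sectors)
dd if=/dev/zero of=/dev/sdc bs=512 seek=$START count=204800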

Friday, December 18, 2009

RAID metadata prevents Fedora from installing on SATA drive

I was trying to install Fedora 12 on my PC but the installation process didn't recognize the SATA disk at all...after spending some time searching, it turned out that the RAID metadata was causing the issue...

Below is the output:

[root@localhost george]# dmraid -r
/dev/sdc: ddf1, ".ddf1_disks", GROUP, ok, 312237856 sectors, data@ 0

[root@localhost george]# dmraid -x
ERROR: ddf1: wrong # of devices in RAID set "ddf1_System" [1/2] on /dev/sdc

About to delete RAID set ddf1_System

WARNING: The metadata stored on the raidset(s) will not be accessible after deletion
Do you want to continue ? [y/n] :y
ERROR: Raid set deletion is not supported in "ddf1" format
[root@localhost george]#

Still trying to figure out how to remove the RAID metadata without putting the drive back into the RAID controller (it is in another computer, the HP ML110 server I mentioned in my previous post).

Tuesday, December 15, 2009

Extend SharePoint Databases

search DBs:
http://www.mssqltips.com/tip.asp?tip=1794

For Content DBs:
http://www.mssqltips.com/tip.asp?tip=1777

moving tables between filegroups:
http://decipherinfosys.wordpress.com/2007/08/14/moving-tables-to-a-different-filegroup-in-sql-2005/

Search objects inside a filegroup (by John@sqlservercentral):


SELECT o.name AS TableOrIndex
FROM sysfiles f
JOIN dbo.sysfilegroups s
ON f.groupid = s.groupid
JOIN dbo.sysindexes i
ON i.groupid = s.groupid
JOIN dbo.sysobjects o
ON i.id = object_id(o.name)
AND i.indid in (0, 1)
WHERE f.name = 'My_Logical_File_Name'

UNION

SELECT i.name
FROM sysindexes i join sysfilegroups f ON i.groupid = f.groupid
JOIN sysfiles l ON f.groupid = l.groupid
WHERE l.name = 'My_Logical_File_Name'
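And a minimal sketch of the move itself — rebuilding a table's clustered index on another filegroup (table, index and filegroup names here are made up; the decipherinfosys link above has the full walkthrough):

CREATE UNIQUE CLUSTERED INDEX PK_MyTable
ON dbo.MyTable (MyTableID)
WITH (DROP_EXISTING = ON)
ON [SECONDARY]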

Wednesday, November 18, 2009

create Document Library based on user template

It took me a while to figure out that the Lists.asmx web service doesn't support creating a document library based on a user template; the only way to do it is via a customised web service.
The code for the web service can be found at http://blogs.msdn.com/sridhara/archive/2008/06/07/getwebtemplate-web-method-does-not-return-addlist-web-method-does-not-recognize-my-custom-list-templates.aspx

In my code I also need to check whether the document library already exists, which can be done via the standard Lists.asmx web service.
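For reference, a rough sketch of the kind of server-side logic such a custom web service wraps (site URL, template name and library title are hypothetical; the MSDN post above has the actual web service code):

using System;
using Microsoft.SharePoint;

public class DocLibCreator
{
    public static Guid CreateFromUserTemplate(string siteUrl, string templateName, string libraryTitle)
    {
        using (SPSite site = new SPSite(siteUrl))
        using (SPWeb web = site.OpenWeb())
        {
            // user templates (.stp) live in the list template gallery, not in web.ListTemplates
            SPListTemplateCollection customTemplates = site.GetCustomListTemplates(web);
            SPListTemplate template = customTemplates[templateName];
            return web.Lists.Add(libraryTitle, "Created from a user template", template);
        }
    }
}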

Tuesday, November 10, 2009

AD Distinguished Name and special character

I wrote a script to send out password expiration notifications a while ago; recently I was told that some users didn't receive their notification.
After looking into it, I realized that the OU structure was changed recently and some OU names contain special characters. In this case, if you use GetObject("LDAP://" & DN) to get the user object, the DN has to be encoded for special characters.
In my case, a '/' was causing the issue and had to be replaced with '\2F' (hex).
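A minimal VBScript illustration of that fix (the DN is made up, and only the '/' case from above is handled):

' escape the forward slash before binding
Dim strDN, objUser
strDN = "CN=John Smith,OU=Sales/Marketing,DC=example,DC=com"
strDN = Replace(strDN, "/", "\2F")
Set objUser = GetObject("LDAP://" & strDN)
WScript.Echo objUser.Name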

Monday, November 2, 2009

Phoenix Online Data Mining Utility

Here it is at http://69.197.142.200:8080/Phoenix/ , our group project for the cs9323 course. This web app basically does some data mining tasks online. Not really a brilliant idea to do data mining online, but that was what we were given.

Wednesday, October 21, 2009

Moving database file and renaming database file

exec sp_helpdb DatabaseName

detach:

exec sp_detach_db 'MyDatabase', 'true'

attach:

exec sp_attach_db 'MyDatabase',
'C:\Program Files\Microsoft SQL Server\MSSQL\data\MyDatabase_Data.MDF',
'C:\Program Files\Microsoft SQL Server\MSSQL\data_log\MyDatabase_Log.LDF'

Change logical file name:

alter database MyDatabase modify file (name='test', newname='MyDatabase')
alter database MyDatabase modify file (name='test_log', newname='MyDatabase_log')

source: Timothy Chen Allen
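To double-check the logical names and physical paths afterwards, sp_helpdb (above) or sp_helpfile will do:

USE MyDatabase
EXEC sp_helpfile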

Tuesday, October 20, 2009

A quick tutorial on using hibernate for webapp

Directly from Netbeans KB:
http://www.netbeans.org/kb/docs/web/hibernate-webapp.html
However, don't expect to learn much from this oversimplified tutorial.

Monday, October 19, 2009

Using Netbeans and Eclipse side by side with SVN

While I was doing the group assignment for comp9323, I thought about using NetBeans and Eclipse side by side. Initially I preferred NetBeans, but in the end I used Eclipse for the project.
After playing around with Subversion and NetBeans/Eclipse for a while, I realized that this can be set up so that NetBeans and Eclipse are used side by side on the same project. In my case, I created the source repository in SVN and synchronized it with Eclipse; later I used NetBeans to check out a copy and followed the instructions to create a NetBeans project. When it complains about the existing build folder, just remove the build folder and you can proceed.

Monday, September 28, 2009

"Media is Write Protected" in windows server 2008 shared drive

Just solved an issue with users at the Melbourne office; apparently something changed on their mapped network drive and they couldn't write to the shared drive.

Again, a TechNet article explained this and the solution:

You may see "Media is Write Protected" Error or VDS error 80070013 after bringing SAN disk online via Diskpart in Windows Server 2008
http://support.microsoft.com/kb/971436

Thursday, September 10, 2009

Epicor E4SE and eBackoffice Go Live

Finally, it is time.
E4SE 811 and eBackoffice 736 will replace our current 810b/735a environment, which has been with us for so many years.
Just got the new test environment up together with the training environment.
time to go sleep now.

Monday, September 7, 2009

setting up datasource in tomcat & glassfish appserv

Notes on setting up a MySQL datasource in Tomcat/Sun App Server (for my cs9323 assignment):

1. Copy mysql-connector-java-5.1.8-bin.jar to c:\tomcat6\lib or c:\sun\appserv\lib

2. Add the following section to c:\tomcat6\conf\context.xml:
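A sketch of the <Resource> element (database name, credentials and pool sizes are placeholders):

<Resource name="jdbc/mydb" auth="Container" type="javax.sql.DataSource"
          driverClassName="com.mysql.jdbc.Driver"
          url="jdbc:mysql://localhost:3306/mydb"
          username="dbuser" password="dbpass"
          maxActive="20" maxIdle="10" maxWait="10000" />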

For GlassFish, go to http://localhost:4848 and create a pool under 'Resources/Connection Pools', then create a JDBC resource called 'jdbc/mydb' under 'Resources/JDBC Resources'. (You need to specify host, port, user, password and db in the pool settings.)

3. Under WebContent\META-INF, edit context.xml to include the same <Resource> section as in step 2:



Alternatively, this can be done by adding the following section to WEB-INF/web.xml:
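Something along these lines (the res-ref-name has to match the resource name used above):

<resource-ref>
    <res-ref-name>jdbc/mydb</res-ref-name>
    <res-type>javax.sql.DataSource</res-type>
    <res-auth>Container</res-auth>
</resource-ref>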



4. For Tomcat, use the following code to do the lookup:
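A minimal sketch (assumes the jdbc/mydb resource name from step 2):

import java.sql.Connection;
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.sql.DataSource;

public class TomcatLookup {
    public static Connection getConnection() throws Exception {
        Context initCtx = new InitialContext();
        Context envCtx = (Context) initCtx.lookup("java:comp/env");  // Tomcat exposes app resources here
        DataSource ds = (DataSource) envCtx.lookup("jdbc/mydb");
        return ds.getConnection();
    }
}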


For GlassFish:
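Roughly the same, except the resource created in the admin console can be looked up directly by its JNDI name:

DataSource ds = (DataSource) new InitialContext().lookup("jdbc/mydb");
Connection conn = ds.getConnection();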

Tuesday, August 18, 2009

Who is Logged on at that Computer?

A quite useful script:

' WhoLogon.vbs
' Sample VBScript to discover which user is logged on
' Author Guy Thomas http://computerperformance.co.uk/
' Version 1.2 - December 2005
' --------------------------------------------------------------'
Option Explicit
Dim objWMIService, objComputer, colComputer, strComputer

strComputer = "."
Set objWMIService = GetObject("winmgmts:" _
& "{impersonationLevel=impersonate}!\\" _
& strComputer & "\root\cimv2")
Set colComputer = objWMIService.ExecQuery _
("Select * from Win32_ComputerSystem")

For Each objComputer in colComputer
Wscript.Echo objComputer.UserName & " is logged on"
Next

' End of Sample Script. Who is logged on?

Thursday, August 13, 2009

publish SQL Reporting service to internet

1. We have to use NTLM for authentication in our environment, which can be done by executing 'cscript adsutil.vbs set w3svc/1/root/NTAuthenticationProviders "NTLM"'.
2. Modify RSWebApplication.config and comment out the line
<ReportServerUrl></ReportServerUrl>

Tuesday, August 11, 2009

network performance issue

We had a significant network performance slowdown about a week ago, which we initially thought was caused by server performance.
After tuning up a few servers the situation seemed to improve, but system response was still notably slow.
Eventually we got to look at the firewall (a Cisco ASA which replaced the PIX firewall a week ago; the CRC errors seem to have been introduced at that time).
I did a ping from the Cisco ASA firewall and observed packet loss; the network guy later discovered CRC errors using the show interface command (I did the same thing but didn't notice the CRC errors!), which apparently was the root cause of the problem.
I have to admit though, by looking at the server side of things we did discover a few performance bottlenecks and improved application performance. I guess this will also be noticeable now that the network is back to normal.

Good lesson.

Tuesday, July 28, 2009

Change SQL Server Collation in a clustered environment

start /wait D:\Servers\setup.exe /qn VS=[VIRTUALSERVER] INSTANCENAME=[MSSQLSERVER] REINSTALL=SQL_Engine REBUILDDATABASE=1 ADMINPASSWORD=[CURRENT USERPASS] SAPWD=[sapassword] SQLCOLLATION=[NewCollation] SQLACCOUNT=[domain acct] SQLPASSWORD=[domainpass] AGTACCOUNT=[domainacct] AGTPASSWORD=[domainpss]

Make sure that ADMINPASSWORD is the current user's password and this user should have admin rights on all involved nodes.

Disable the loopback check

This is used to fix a logon issue with NTLM and IIS. The error message:

Logon Failure:
Reason: An error occurred during logon
User Name: [User Name]
Domain: [Domain]
Logon Type: 3
Logon Process: Èù®
Authentication Package: NTLM
Workstation Name: [Workstation Name]
Status code: 0xC000006D
Substatus code: 0x0
Caller User Name: -
Caller Domain: -
Caller Logon ID: -
Caller Process ID: -
Transited Services: -
Source Network Address: 127.0.0.1
Source Port: 1298


For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

the following steps will fix the issue:

1. Click Start, click Run, type regedit, and then click OK.
2. In Registry Editor, locate and then click the following registry key:
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa
3. Right-click Lsa, point to New, and then click DWORD Value.
4. Type DisableLoopbackCheck, and then press ENTER.
5. Right-click DisableLoopbackCheck, and then click Modify.
6. In the Value data box, type 1, and then click OK.
7. Quit Registry Editor, and then restart your computer.

sourced from http://support.microsoft.com/kb/896861
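For scripted deployments, the same value can also be set from an elevated command prompt (one line):

reg add HKLM\SYSTEM\CurrentControlSet\Control\Lsa /v DisableLoopbackCheck /t REG_DWORD /d 1 /f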

Wednesday, July 22, 2009

Foreign Key vs. Trigger Referential Integrity in SQL Server

I personally think foreign key actions are better than triggers; the following article backs up my idea:

http://www.mssqltips.com/tip.asp?tip=1508

Monday, July 13, 2009

Adding cluster disk to sql

went through the practice again and realized that I actually forgot something I did before :-)

Also, one mistake: the LUN was presented to me already partitioned, so when I tried to do the SAN disk alignment it gave me an error message. I happened to find a TechNet article talking about a different issue with the same symptom, which led to a waste of time. (Error msg: DiskPart was unable to create the specified partition.)

Anyway, I finally figured this out, deleted the old partition, and then things went smoothly.

Finally, the performance project is nearly there; the newly purchased test server has been put online and a significant performance increase was observed (previously the revenue recognition process took 50+ minutes, now only 1+ minute!).

Thursday, July 9, 2009

Web Server Farm

Sometimes different IIS configurations inside a web server farm can cause serious issues.
The following command can be quite helpful:
iiscnfg /copy /ts websrv08 /tu domain\george.wen /tp xxxx

Tuesday, June 30, 2009

Nginx with FastCgi

I haven't been playing with Linux web servers for a while. I tried Nginx and it seems to be impressively fast:

Red Hat / CentOS Install nginx PHP5 FastCGI Webserver

How do I install and configure an nginx FastCGI PHP5 HTTP / web server under Red Hat / RHEL / Fedora / CentOS Linux?

Nginx (engine x) is an HTTP(S) server, reverse proxy and IMAP/POP3 proxy server written by Igor Sysoev. It is known for its high performance, stability, rich feature set, simple configuration, and low resource consumption.
Step # 1: Enable EPEL repo

nginx is not included in the base system. Turn on the EPEL repo to install the nginx stable release:
# rpm -Uvh http://download.fedora.redhat.com/pub/epel/5/$(uname -m)/epel-release-5-3.noarch.rpm
Step # 2: Install nginx

Type the following command at a shell prompt:
# yum install nginx
Sample output:

Loaded plugins: downloadonly, fastestmirror, priorities, protectbase
Loading mirror speeds from cached hostfile
* epel: archive.linux.duke.edu
* base: ftp.linux.ncsu.edu
* updates: centos.mirror.nac.net
* addons: mirror.cs.vt.edu
* extras: centos.mirror.nac.net
0 packages excluded due to repository protections
Setting up Install Process
Parsing package install arguments
Resolving Dependencies
--> Running transaction check
---> Package nginx.x86_64 0:0.6.34-1.el5 set to be updated
--> Finished Dependency Resolution

Dependencies Resolved

==============================================================================================================================================================
Package Arch Version Repository Size
==============================================================================================================================================================
Installing:
nginx x86_64 0.6.34-1.el5 epel 319 k

Transaction Summary
==============================================================================================================================================================
Install 1 Package(s)
Update 0 Package(s)
Remove 0 Package(s)

Total size: 319 k
Is this ok [y/N]: y
Downloading Packages:
Running rpm_check_debug
Running Transaction Test
Finished Transaction Test
Transaction Test Succeeded
Running Transaction
Installing : nginx [1/1]

Installed: nginx.x86_64 0:0.6.34-1.el5
Complete!

nginx configuration file

* Default config file: /etc/nginx/nginx.conf
* Default SSL config file: /etc/nginx/conf.d/ssl.conf
* Default virtual hosting config file: /etc/nginx/conf.d/virtual.conf
* Default documentroot: /usr/share/nginx/html

Configure PHP As FastCGI

Type the following to install php5 with other modules:
# yum install php-pear-Net-Socket php-pear php-common php-gd php-devel php php-mbstring php-pear-Mail php-cli php-imap php-snmp php-pdo php-xml php-pear-Auth-SASL php-ldap php-pear-Net-SMTP php-mysql
Install spawn-fcgi simple program for spawning FastCGI processes

Type the following command:
# yum install spawn-fcgi
Next, download spawn-fcgi init.d shell script:
# wget http://bash.cyberciti.biz/dl/419.sh.zip
# unzip 419.sh.zip
# mv 419.sh /etc/init.d/php_cgi
# chmod +x /etc/init.d/php_cgi
Start php app server, enter:
# /etc/init.d/php_cgi start
# netstat -tulpn | grep :9000
Sample output:

tcp 0 0 127.0.0.1:9000 0.0.0.0:* LISTEN 14294/php-cgi

By default php server listens on 127.0.0.1:9000 port. Finally, update /etc/nginx/nginx.conf as follows:
# vi /etc/nginx/nginx.conf
Modify / append as follows:


location ~ \.php$ {
root html;
fastcgi_pass 127.0.0.1:9000;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME /usr/share/nginx/html$fastcgi_script_name;
include fastcgi_params;
}


Save and close the file. Restart nginx:
# service nginx restart
Create /usr/share/nginx/html/test.php as follows:


<?php
phpinfo();
?>

Original Post at http://www.cyberciti.biz/faq/rhel-fedora-install-configure-nginx-php5/

Sunday, June 28, 2009

Tip: Fighting OS-Level Fragmentation

SQL Server Magazine UPDATE TIP OF THE WEEK
This SQL Server tip is brought to you by SQL Server Magazine.

Tip: Fighting OS-Level Fragmentation
by Brian Moran

Q. I've read about how fragmentation at the SQL Server level can hurt my queries' performance. I understand how to look for fragmentation within my SQL Server tables by using the DBCC SHOWCONTIG command. However, should I also worry about fragmentation at the OS level?

A. Fragmentation exists at both the SQL Server level and the file level within the OS. It sounds like you already use DBCC SHOWCONTIG to combat SQL Server-level fragmentation (for a DBCC SHOWCONTIG primer, see SQL Server Books Online—BOL). So, let's look at how and when OS-level defragmentation can speed up your SQL Server.

Remember that SQL Server can report 0 percent fragmentation even when the on-disk files are horribly fragmented. SQL Server doesn't know or need to know how the OS physically lays out the bits on disk; it's the OS's job to manage physical bits on disk. However, because SQL Server doesn't know how the bits are laid out on disk, SQL Server has no direct way to report about file fragmentation. Imagine you're performing a table scan and SQL Server reports 0 percent fragmentation but the file that contains the table is scattered all over your disk. In this case, performing an OS-level defragmentation could help performance by making the files more contiguous on disk. However, defragmenting at the OS level doesn't always have the effect that you might expect. SQL Server pre-allocates space when you create a file. For example, if you create a 100MB file on a disk that's been defragmented recently, SQL Server creates the file in contiguous space. And SQL Server can read and write to that file forever, and the file won't fragment at an OS level (unless you stop SQL Server and perform a file-copy operation that puts the file into noncontiguous space). But if you create that 100MB file on a disk that's heavily fragmented, the file will also be fragmented.

What happens if SQL Server creates the 100MB file in contiguous space, but auto-grow operations increase the size of the file to 200MB? In this case, the new space added to the SQL Server file might fragment as the file grows. An OS-level defragmentation will improve performance if the files become fragmented and you're performing table-scan operations within SQL Server that look at ranges of data. So, I suggest scheduling a regular OS-level defragmentation. Given that file defragmentation is integrated in the operating system, defragmenting open files is perfectly safe. However, I/O activity generated must be considered if continuous high performance is mandatory. SQL Server slow time might not be a problem for small shops, but larger enterprise shops trying to maintain 4 or 5 nines (99.99% to 99.999%) of availability will be hard-pressed to find time for I/O intensive OS-level defragmentation. In such cases, many third-party defragmenters have advanced features to minimize or eliminate the I/O overhead of disk defragmentation. The best practice for highly available SQL Servers is to create your files on contiguous space that's already been defragmented, planning ahead of time for data growth and building that growth into the size of the initial files.

Send your technical questions to Brian Moran.


Additional Reference:
Microsoft SQL Server 2000 Index Defragmentation Best Practices
http://technet.microsoft.com/en-us/library/cc966523.aspx

Sunday, June 21, 2009

Factors influencing affective commitment


(Meyer & Tropoloynsky)

 

• participation in decision-making
• receptiveness to employees' ideas
• discretion over activities
• task autonomy
• role clarity & freedom from conflict
• positive feedback
• job scope, job challenge
• supportive leadership
• transactional & transformational leadership
• training & mentoring opportunities
• performance-contingent rewards
• job security
• opportunities for growth & development

 

"High Commitment" HRM

• move from control-based management practices to those based on commitment
• policies that promote mutuality
• mutual goals, influence, mutual respect, rewards, responsibility
• reduction in management hierarchy and division of labour, teamwork, manager as facilitator (less hands-on supervision)
• according to Walton, policies of mutuality will elicit greater commitment
• for successful work organisation, consider factors influencing job satisfaction, productivity & quality of work, innovation

 

 

Drivers of engagement (Institute of Employment Studies)

 

• Involvement in decision making
• Freedom to voice ideas
• Feeling enabled to perform well
• Having opportunities to develop on the job
• Feeling the organisation is concerned for employees' health & well-being

 

Importance of line managers and front-line supervisors

 

Above is a slide I took from a management course, which gave me a lot of insight into how an individual employee can be motivated and fully engaged in the organisation and his/her job.



Wednesday, June 10, 2009

Comparing Table Variables with Temporary Tables



Tuesday, May 26, 2009

Best Practices , Design and Development guidelines for Microsoft SQL Server

Madhu K Nair, Jul 24, 2008

There are many resources available on the net, but here I have a list of best practices, design guidelines and general guidelines for database development and design, specifically for SQL Server.

Best Practices

1. Use stored procedures: Benefits are as follows:
(a) Code reusability
(b) Access control: you can control permissions on the sp
(c) Execution plan reusability: though an ad hoc query also creates and reuses a plan, the plan is reused only when the query is a textual match and the datatypes match the previous call. If a datatype differs, or there is even an extra space in the query, a new plan is created.
Eg.

Select  Sal from Employee where sal=$10  --Money
--And
Select  Sal from Employee where sal=10  -- Int 

The above statements will create different execution plans because the datatypes of the values are different.
(d) Prevents SQL injection
(e) Procedures give more readability and manageability.

2. Use fully qualified names for objects: This is very significant. Use the fully qualified name whenever you refer to an object in SQL Server, i.e. SchemaName.ObjectName. When the query engine prepares the execution plan, it has to resolve the object's existence during the binding process; specifying the fully qualified name makes object resolution easier for the engine, and it is also more readable.

3. Avoid using scalar functions in SELECT statements: I recently ran into this issue, so I want to emphasise the point. Never use a scalar function inside a query that returns a large number of rows — the scalar function behaves like a cursor in that case. Change the scalar function to an inline or multi-statement table-valued function, or a view.

4. Avoid mixing DML and DDL statements on a temp table inside an sp: This is very important. If you CREATE a temp table (#table) and later ALTER the same temp table in the same stored procedure, this DDL/DML mix-up causes the stored procedure to be recompiled. So if a stored procedure is getting recompiled on each call, check this point.

5. Select only the required columns: Select only the required columns in the SELECT statement. Using SELECT * can prevent the available indexes on the table from being used, and selecting more data means doing more IO. In short, we should limit the IO.

6. Avoid usage of hints: Hints prevent the query engine's automated optimisation. You may find that a hint gives you better performance in a particular scenario, but it may behave differently as the data grows or the scenario changes. Check this article on hints: http://msdn.microsoft.com/en-us/library/ms187713.aspx

7. Use table variables and temp tables as far as possible: Use a table variable (@TableName) or temp table (#TableName) for intermediate storage of records in a procedure, but avoid table variables for large record sets. There are pros and cons to each; in general, if the record set is small, go for a table variable.

8. Use SET NOCOUNT ON: Basically, you should reduce the data transferred over the network. The database engine returns the number of rows affected by each statement to the client, which is unnecessary; you can avoid it with this statement. It is a must in all stored procedures.

9. Do not change SET options on the connection: Changing SET options during the connection, or anywhere else, will cause stored procedures to recompile. Refer to this article for more info: http://www.microsoft.com/technet/prodtechnol/sql/2005/recomp.mspx

10. EXISTS vs IN: The IN operator can usually be replaced by EXISTS, which is better optimised for correlated queries, though you may find IN fine for small tables.
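A small illustration of the two forms (Customers and Orders are made-up tables):

-- IN version
SELECT c.CustomerID
FROM dbo.Customers c
WHERE c.CustomerID IN (SELECT o.CustomerID FROM dbo.Orders o)

-- EXISTS version (correlated)
SELECT c.CustomerID
FROM dbo.Customers c
WHERE EXISTS (SELECT 1 FROM dbo.Orders o WHERE o.CustomerID = c.CustomerID)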

11. Keep transactions as short as possible: Deadlocks are an outcome of ill-formed queries. Keep transactions as short as possible to avoid deadlocks, and also reference objects in the same order inside transactions.

12. Avoid user input inside a transaction: Do not wait for user input while a transaction is open.

13. Avoid doing front-end work in the database: String processing, ordering, conversion etc. can easily be done on the client side, and you should avoid doing this kind of work in the database. The point here is that you should have a clear segregation of tasks between the data layer, data access layer (DAL) and business layer. For example, you can easily number your rows on the client side, whereas doing it in the database requires a function or an extra join that can be avoided.

14. Avoid functions on columns in queries: Using functions like CONVERT(), CAST or ISNULL on a column (particularly in the WHERE clause) can prevent the indexes available on the table from being used.

15. Do not use EXEC('string'), use sp_executesql: As far as possible, avoid dynamic SQL. If there is no other option, use sp_executesql and do not use EXEC('string'): EXEC is prone to SQL injection and is not a parameterised query, so it cannot reuse the execution plan.
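A sketch of the parameterised form (table and column names are made up):

DECLARE @sql nvarchar(200)
SET @sql = N'SELECT * FROM dbo.Customers WHERE City = @City'
EXEC sp_executesql @sql, N'@City nvarchar(50)', @City = N'Sydney'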

16. Use the proper size for input parameters: This is one of the steps to avoid SQL injection, and it also reduces memory usage.

17. Do not prefix stored procedure names with sp_: Naming conventions are important. Do not name stored procedures with an sp_ prefix (e.g. sp_somespname), because this naming convention is used for system stored procedures in MS SQL Server.

18. Use a WHERE condition wherever possible: Basically, you should limit the rows fetched by the query.

19. Avoid negative operators: Avoid <>, NOT IN and NOT EXISTS style operators where you can, because they tend to cause a table scan; the query engine has to make sure there is no matching data right up to the last row.

20. Avoid cursors/loops: With set-based operations, looping can generally be avoided.

21. Avoid using LIKE '%...%': If you use % on both sides of the search value, the query will do a table scan, which should be avoided. If the application is heavy on text searching, go for a full-text index.

22. Do not use WITH RECOMPILE: WITH RECOMPILE causes the procedure to recompile on every call. You should avoid this option.

23. JOIN considerations: When you join two tables, consider these points:
(a) avoid negative operators (<>, NOT IN) in the join condition
(b) avoid the LIKE operator in the join condition


Design Guidelines

1. Create covering indexes: A covering index has all the data required by the query at the leaf level itself — it contains all the columns used in the SELECT, WHERE, ORDER BY, JOIN etc.
Eg.

Select Col1,Col2 From YourTableName Where Col3=1 Order by Col4.

The covering index for the above query would be
Col1 + Col2 + Col3 + Col4. (Note: the most selective column should come first in the index creation statement.)
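In SQL Server 2005 the non-key columns can also be pushed into an INCLUDE clause instead of the key, which keeps the key small while still covering the query — one possible shape for the example above:

CREATE NONCLUSTERED INDEX IX_YourTableName_Col3
ON YourTableName (Col3)
INCLUDE (Col1, Col2, Col4)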

2. Remove unwanted indexes: In SQL Server 2005 it is very easy to find unused indexes. Too many or too few indexes on a table are equally bad: unwanted/unused indexes hurt INSERT/UPDATE performance, and as we all know, indexes consume space.

3. Make the most selective column the first column in the index: Index creation should be done after proper analysis; create the index with the most selective column first, and so on.

4. Format stored procedures and queries consistently: Have a format/template for each object (sp/function/view) and have everyone on the dev team stick to it. Queries should also be formatted well so that they are more readable.

5. Use an identity column as the clustered index on INSERT-heavy tables to avoid page splits: This is a design and data modelling issue. If a table is mostly inserts (some kind of logging table), go for an ever-increasing identity column as the clustered index; this helps avoid page splits. There can be a hotspot issue (all transactions contending for the same page of the table), but I have never run into it.

6. Use a proper fill factor for indexes: Very important to avoid page splits. In general, transactional tables can be kept at an 80-90 fill factor.

7. Balance normalisation and de-normalisation: There is a trade-off between normalisation and de-normalisation; at times de-normalisation can give you better performance at the cost of data redundancy.

8. Primary key size and composite primary keys: Limit the size of the PK because, in a relational database, foreign keys will refer to this primary key, so a wide or multi-column (composite) PK knowingly or unknowingly increases space usage. If a composite PK would contain more than 3 columns, consider a surrogate key such as an identity column as the PK.

9. Do not alter system objects: If your application requires tweaking of system objects, you are in trouble — the structure of system objects can be changed by Microsoft in any release or patch. Avoid such modelling.


Guidelines for Datatype Selection

As a database architect I believe in the significance of proper datatype selection when designing tables. If you do a proper analysis of the data and then select the datatype, you can control the row, page and table size and hence increase the overall performance of the database. Consider the following points when you design a table:

1. If your database is to support a web-based application, it is better to go for Unicode for the scalability of the application. (Unicode types (nchar, nvarchar) take 2 bytes per character, whereas ASCII types (char, varchar) take 1 byte per character.)

2. If your application is multi-lingual go for UNICODE.

3. If you are planning to use CLR datatypes (SQL Server 2005) in the database, go for Unicode datatypes, because if a CLR datatype is going to consume the data, it must be Unicode.

4. For numeric columns, find the range the column is going to hold and then choose the datatype. For example, you have a table called Department and DepartmentID is the primary key column, and you know the table will hold at most 20-30 rows; in such cases it is recommended to choose the TINYINT datatype for this column. Making every numeric column INT without analysing the range it needs to support is not recommended from a storage perspective.

5. Description/Comments/Remarks sorts of columns may or may not have data for all rows, so it is better to go for variable-length datatypes like varchar or nvarchar.

6. If you know the column is not nullable and will contain more or less the same size of data, then go for a fixed-width datatype like CHAR or NCHAR. Keep in mind that if you choose a fixed datatype and the column is nullable, the column consumes the full space even when it holds no data (NULL).

7. If the size of the column is less than 20 characters, use fixed-width datatypes like NCHAR or CHAR.

8. I have seen many applications use DECIMAL to store currency-type data even though the precision they need is covered by MONEY. My point here is: use the MONEY datatype if you only need 4 decimal places of precision.

9. Use a UNIQUEIDENTIFIER column as the PK and/or clustered index only when it is unavoidable, because UNIQUEIDENTIFIER takes 16 bytes of space.



General Guidelines

1. Write ANSI-standard code: Write standard code so that your application scales and migration to the next version will not be an issue. Do not use deprecated features. For example, there are DBCC commands to rebuild an index, but in SQL Server 2005, in order to standardise things, the ALTER INDEX command does the same thing.

2. Do not return error messages that expose your architecture to the front end: I have seen folks giving very detailed error messages along the lines of "table blah blah does not have these rows in database blah blah", which can be an invitation to a hacker.

3. Use the proper isolation level required by the application: This is very significant. Before going for any isolation level, you must know the implications. Not every application can afford the READ UNCOMMITTED isolation level, since it can cause data consistency issues like dirty reads, phantom reads, lost updates etc. The WITH (NOLOCK) hint is nothing but the READ UNCOMMITTED isolation level.

4. Keep database object scripts in source control: We are all aware of this point but generally ignore it. It is important for fault tolerance and management when multiple developers are working on the same project.

Last edited Aug 4 2008 at 9:27 PM by madhuottapalam, version 10

 



Sunday, May 24, 2009

SQL Server Support Policy for Failover Clustering and Virtualization gets an update...

According to the following article, SQL 2005/2008 clustering under Windows 2008 using VMware/Hyper-V is now officially supported by Microsoft.

 

http://blogs.msdn.com/psssql/archive/2009/05/19/sql-server-support-policy-for-failover-clustering-and-virtualization-gets-an-update.aspx

 



Thursday, May 14, 2009

Calling an external web service from within SQL Server 2005

I am trying to establish a way to call an external web service from within SQL Server 2005. Previously we used a JavaScript file and a SQL Agent job to execute the script; however, it seems that SQL 2005 has a problem with that approach.

The external consultant proposed another way, which is to use OLE Automation; while we managed to make it work occasionally, it is not reliable at all. The cause of this is still unclear; I will look into it again once I get more time.

After all, we know that the native way to call an external web service is to use the CLR and create an assembly. Since I hadn't done this before, it was quite interesting to figure out how it works. So far, I have managed to create the CLR assembly and make it work. Not sure why the other two methods didn't work, but this final approach works pretty well.
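For reference, a minimal sketch of wiring up such an assembly (the DLL path, assembly name and method names are hypothetical; calling out to a web service needs EXTERNAL_ACCESS):

-- enable the CLR on the instance
EXEC sp_configure 'clr enabled', 1
RECONFIGURE
GO
-- register the compiled assembly
-- (the database also needs TRUSTWORTHY ON, or the assembly signed, for EXTERNAL_ACCESS)
CREATE ASSEMBLY WebServiceCaller
FROM 'C:\CLR\WebServiceCaller.dll'
WITH PERMISSION_SET = EXTERNAL_ACCESS
GO
-- expose one of its methods as a stored procedure
CREATE PROCEDURE dbo.usp_CallWebService
AS EXTERNAL NAME WebServiceCaller.[StoredProcedures].CallWebService
GO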



Tuesday, May 12, 2009

Does upgrading to SQL Server 2008 fit your business?



Don Jones, Contributor
03.04.2009

SQL Server 2008 has been around long enough to be considered both a known, stable quantity and a big question mark for many businesses. That question is often, "Should we migrate?"

First off, if you are implementing a new SQL Server installation and do not need a specific older version for compatibility reasons, you should buy SQL Server 2008. If you still use SQL Server 2000 or an older version and don't face compatibility hurdles, you should migrate. But the question becomes more complicated for businesses running SQL Server 2005 that have no particular technical barriers that absolutely prevent a migration. What should they do?

It depends on the applications you run
Businesses use SQL Server in two distinct scenarios. The first is to support third-party applications that require SQL Server, and the second is to support internally developed applications. In the first instance, moving to SQL Server 2008 is advisable only if a few conditions hold true. First and foremost, confirm that your application vendor supports SQL Server 2008. Second, migrating is useful only when you have enough control over the application to take advantage of SQL Server 2008's important new features.

SQL Server 2008 isn't so much of an improvement over SQL Server 2005 that you will experience a huge benefit just from migrating. To derive a business benefit, you need to leverage its new features. With many third-party applications, you have zero control over things like the database schema, file groups, index deployment and so forth. Unless the vendor releases a version of its application that leverages SQL Server 2008, there isn't much point in putting yourself through the migration process.

An exception, perhaps, is when you deploy a new SQL Server computer while also moving from an existing 32-bit SQL Server installation to a new x64-based machine. Moving from 32- to 64-bit offers major performance advantages, and unless you specifically need SQL Server 2005, you might as well upgrade to the newest version.

If you are in control of applications
If you use SQL Server to support a custom application, or if you exercise significant control over a third-party application (such as making schema changes and so forth), migrating from SQL Server 2005 to SQL Server 2008 can offer performance advantages, enable new capabilities, and make life easier on your developers. Some of the key new features include the following (though the Microsoft site offers a complete list of new features in SQL Server 2008):

  • Using data compression to save disk space. Granted, these days this capability has become less urgent but in some instances -- such as servers in a branch office -- disk space can still be precious.
  • Support for the ADO.NET Entity Framework and Entity Data Model can facilitate work with data in an object-oriented fashion. This, however, is more important to new applications than to existing ones, which aren't likely to be re-engineered to the degree necessary to take advantage of these features.
  • Some important limitations in the embedded .NET Common Language Runtime (CLR) have been overcome, such as limits on user-defined aggregates and user-defined types.
  • New data types support spatial computing, such as the geography data type, geometry data type, and Virtual Earth integration.
  • FILESTREAM data types allow large blocks of data -- such as photos or files -- to be moved outside the main database pages and into file system files that are faster to access and help trim the main database files .

Reasons to migrate, no matter what
There are a few additional reasons to upgrade to SQL Server 2008, all of which involve server-level improvements that aren't necessarily performance-related. SQL Server 2008 offers certain reliability and availability improvements, such as the following:

  • SQL Server 2008's database mirroring is more robust and, in my view, easier to configure and manage.
  • Policy-based management can be a tremendous feature for improving the consistency of SQL Server's configuration and ongoing management, especially with compliance or other restrictive requirements.
  • A new data auditing feature that provides more granular auditing of SQL Server activity.
  • Transparent Data Encryption (TDE) lets you turn on encryption without requiring massive application-level changes and can help meet specific security and compliance requirements.
  • The data collector helps make performance management and tuning easier and more centralized.
  • The Resource Governor gives administrators central control over CPU and memory allocation, which is useful for servers that host multiple databases and experience significant resource contention between those applications.
  • If you manage multiple servers, Server Group management enables you to run T-SQL queries against multiple servers simultaneously from a central management server. This makes server farm management easier and more consistent.

The bottom line: Is an upgrade worth it?
If some of SQL Server 2008's new features appear compelling or solve your IT shop's specific problems and if migrating your applications doesn't pose compatibility issues, the move to SQL Server 2008 is worth it. If you're already on SQL Server 2005 and do not consider the new features necessities, there may not be a justification to move at this time.

By 2010 or 2011, Microsoft will likely release another version of SQL Server (this prediction is based on Microsoft's history, not any inside knowledge), so an upgrade to SQL Server 2008 may be optional if you're not going to use its new features to solve business problems.

 

 



Monday, April 6, 2009

vmdk to vhd conversion

Google makes finding a solution too easy, well, sometimes...

http://www.aaronpalermo.com/wordpress/archives/32
http://social.technet.microsoft.com/Forums/en-US/winserverhyperv/thread/ef8c12f7-c45d-442e-9a30-c43cd87df3b3
http://stateless.geek.nz/category/virtualisation/xen/

Sunday, April 5, 2009

Editorial - The Disappearing DBA


During a "difficult period", a developer tends to sink from view. When delivery dates have slipped, and testers are hammering them with bugs they've found in their 'perfect' code, the developer will sit quietly in the corner, unnoticed by most, doggedly bashing away at their keyboard.
However, when things are going well, when an awesome new feature has been completed or particularly knotty programming problem solved, everyone knows about it. Suddenly they are the gunslinger, the hotshot; rightly proud of their achievement they swagger around the office telling all to anyone who will listen.
The DBA often looks on wistfully. For him, the "fame trajectory" seems to work in reverse. When a server is down or performance is suffering, customers are quick to complain, and managers are eager to point the finger of blame. During difficult periods, the DBA is the centre of attention. The harder the DBA works, and the smoother everything is running, the more invisible the DBA becomes.
In short, it is the unfortunate lot of the DBA to work their way towards absence. In a recent blog, Andy Leonard notes that DBAs think differently to developers, and it's true. The DBA role requires a completely different mindset, and a very different perception of what "success" means.
Some DBAs, perhaps envious of the high profile developers, make a point of highlighting their achievements, sending regular emails that report uptime on the company servers, improvements made to database processes and so on. This isn't such a bad idea, but most seem content with a working life spent keeping deliberately out of the limelight. Plaudits and esteem are all well and good, but they don't trade for much at the bank. What DBAs lack in public profile they tend to make up in personal remuneration. They accept the occasional public lambasting with good grace, perhaps consoling themselves using Phil Factor's old trick of calculating, on the hourly rate, how much they are being paid to receive it.
One cannot change human nature, and so it may be better to aim for an image of quiet efficiency rather than gun-slinging hero. Whereas the developer can bring a spring to his step from thought of the recognition of the quality of his work and skills, the DBA can cheer himself from the thought that it has been a nice, quiet month; and now it's payday.
Cheers,
Tony.
Tony Davis from SQLServerCentral.com

Saturday, April 4, 2009

Recover sysadmin account for SQL 2005/2008

I accidentally removed the sysadmin account from the development machine and was stuck for a while; however, the solution is quite simple (someone else took the credit):
start SQL in single-user mode by appending ';-m' to the start-up parameters through Configuration Manager;
stop all other SQL-related services other than the DB engine;
run sqlcmd -E
run CREATE LOGIN [CONTOSO\administrator] FROM WINDOWS;
run EXEC sp_addsrvrolemember 'CONTOSO\administrator', 'sysadmin';
remove ;-m
restart the SQL DB engine
done!!!

Tuesday, March 31, 2009

HP ML110 G4 with Xeon X3220/Q6600?

Ever since I installed VMware ESX Server on my HP ML110 G4, I haven't had much chance to play with it and it has been lying on the floor for a while...what a waste!
Recently I decided to convert it into my workstation PC and installed Win2008 on it. So far I have managed to add a USB sound card and install a TV tuner (Leadtek DTV 2000H), which started working after I installed the missing BDA component.
Now I am not quite satisfied with the CPU speed (Pentium D 810, 2.8GHz), and the worst thing is that since it doesn't have hardware virtualisation (Intel VT) I can't add Hyper-V to the server.
After doing some research, I found that the server's documentation says it supports Xeon 3000-series dual-core and Pentium D CPUs, with no quad-cores listed; I also found that similar motherboards with the same chipset (Intel 3000/ICH7R) actually support Xeon 3200-series quad-core CPUs.
After reading a post somewhere (in Japanese!) saying that a Q6600 works with this server, together with the fact that the Q6600 is essentially identical to the Xeon X3220, I guess I will try to get one and see how it goes. For what it's worth, the benchmark scores I got from a website show the Pentium D 2.8GHz at 710 while the X3220 gets 2844!

Tuesday, March 10, 2009

Moving Database Files to different LUN under a SQL cluster environment

After managing to move the quorum and DTC storage to different LUNs, there is also a need to move the database files around because we need to do a SAN reconfiguration.
The highly summarised steps are as follows:

> create new LUNs and present them to the cluster nodes;
> initialize the new disks but don't partition them yet; use diskpart to create the partition with an offset so that the partition is aligned (see the diskpart sketch after this list);
> format the new drives, assign drive letters, take the SQL instance offline (leave the disk resources online), and use xcopy to copy the database files to the new drive;
> remove the drive letter from the old drive and assign it to the new drive;
> using the cluster administration tool, re-create the disk resource, modify resource dependencies...
> failover and test etc.
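The diskpart alignment step looks roughly like this (disk number, alignment value and drive letter are examples):

diskpart
DISKPART> list disk
DISKPART> select disk 5
DISKPART> create partition primary align=64
DISKPART> assign letter=S
DISKPART> exit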

Thursday, March 5, 2009

HOWTO: Migrating Quorum and DTC drive to different LUNs for SQL Cluster


1. Create two new LUNs for Quorum and DTC drives and present them to ESX Host;
2. Adding the new LUNs as raw disk mapping to cluster nodes using the same SCSI ID on all nodes;
3. Go to computer management and initialize the disks, create new partition and format them;
4. Move the quorum drive to the DTC drive temporarily via cluster administration;
5. Go to disk management and remove the drive letter ‘Q:’, assign drive letter ‘Q:’ to the new disk that mapped to the new LUN, do this on all nodes;
6. Go back to cluster administration, add the new disk into ‘Cluster Group’ then assign the new drive as Quorum Resource;
7. The old quorum disk is now ready to be removed;
8. To move DTC storage, remove the DTC resource first, then delete the old DTC drive from DTC Group;
9. Remove the drive letter from old DTC Drive and assign it to the new DTC Drive;
10. Add the new DTC drive to DTC group;
11. Create DTC Resource and bind it to the new drive;
12. The old DTC Drive is now ready to be removed.

Notes: these are the actual steps I took to move the quorum/DTC drives inside our environment; please be reminded that the quorum and DTC sit on different drives here.

SQL Server 2005 Cluster - Moving quorum/DTC drive

This exercise was done within VMWare ESX environment.

Notes:

-- this part relates to building the test environment

The system drive (C:\) and the shared drives need to sit on different SCSI adapters.

The system drive needs to sit on a local disk resource (host storage).

Map the LUNs on all nodes in the same order and use the same SCSI IDs.

Enable network DTC.

Run 'msdtc.exe -install' if things go wrong with MSDTC.

-- this part relates to moving quorum/dtc drive

> Break the existing LUN into smaller LUNs: unmap the LUN from the controller first, then remove it, create the new LUNs, and map the new LUNs to the controller.
> Check the ESX storage controller and make sure the new LUNs are presented to the host.

--> add the new LUNs to all nodes via raw disk mapping, making sure you use the same SCSI ID on all nodes;

--> move the quorum drive to the DTC drive temporarily, release the drive letter from the old quorum drive and assign it to the new drive; this needs to be done on all nodes;

--> move the quorum to the new disk;
--> to move the DTC drive, you will need to remove DTC first, then add it back with the new drive;

Monday, March 2, 2009

Moving quorum/MSDTC drive in a SQL server cluster environment

22 July from external source
Moving a Cluster to a New SAN
A fairly common scenario for a cluster administrator is to move a cluster from one SAN to another as SAN equipment is replaced with newer/faster SANs or the old SAN's lease is up and a new one is being brought in.
The easiest way that I have found to do this is to use these steps (this is from memory, let me know if I missed one or two):
Super High Level Steps:

1. Put the new array in the same fabric as the existing array
2. Create new LUNs on the new array and make sure they are visible to the nodes
3. Map the new LUNs to the old drive letters
4. Copy data from the old drive to the new drive
5. Move quorum and MSDTC

Slightly More Detailed Steps:

1. Carve the new LUNs on the new array
2. Add the new array and its LUNs to the same switch as the existing array
3. Configure the LUN masking on the switch to expose the new LUNs to NodeA and NodeB
4. Use the disk management tools in Windows to rescan the drives
5. Use the active node to partition and format the disks
6. Use Cluster Administrator to create the new physical disk resources and put them into their proper cluster groups
7. Move the Quorum using the GUI to a temp location
1. In Cluster Administrator, right click the cluster name
2. Select Properties
3. Select the Quorum tab
4. Use the drop down box to select a temp location for the quorum
8. Delete the existing MSDTC folder (if any)
1. Stop the MSDTC resource
2. Copy the MSDTC folder from Q: to the final quorum disk target location
3. Stop the Q: resource (remember, the quorum isn't there anymore)
4. Delete the MSDTC resource
9. Move the quorum to its final location
1. Go into disk management and change the Q: name to another letter
2. Use disk management and name the final quorum drive to Q:
3. Repeat steps 1-4 to move the quorum to its final destination
10. Recreate the MSDTC resource
1. Create a new MSDTC resource with the clustername network name resource and the new Q: as dependencies
2. Bring the MSDTC resource online
11. Stop the cluster service and the application cluster groups (you can just stop the application resources if you want to move app data one app at a time)
12. Move the data from the old disks to the new ones
13. Re-letter the old disks to something outside the current range, but do not remove them yet - you might need to use them in your back out plan
14. Re-letter the new disks to the same drive letter as the old ones (no, you do not have to worry about disk signatures as applications don't understand disk signatures and don't care about anything other than drive letters)
15. Verify the dependencies and update them as needed
16. Restart the cluster service
17. Make sure the new drive letters and disk resources are showing up properly in cluster administrator
18. Bring everything back online

Again, these are basic steps. Some of the individual steps will require lots of work. I have done this now several times and am very happy with the results.

http://support.microsoft.com/default.aspx?scid=kb;en-us;280353

From: http://msmvps.com/blogs/clusterhelp/archive/2005/08/05/moving-a-cluster-to-a-new-san-original-posted-jul-22-2005.aspx

Tuesday, February 24, 2009

Good Regular Expression to Enforce Password Complexity in ASP.NET Applications

Regular expression to enforce password complexity in your applications:

^.*(?=.{10,})(?=.*\d)(?=.*[a-z])(?=.*[A-Z])(?=.*[@#$%^&+=]).*$

The regular expression enforces the following rules:

  • Must be at least 10 characters
  • Must contain at least one lower case letter, one upper case letter, one digit and one special character
  • Valid special characters (which are configurable) are - @#$%^&+=
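To sanity-check the pattern outside of ASP.NET, something like this works (the sample password is just a placeholder):

using System;
using System.Text.RegularExpressions;

class PasswordCheck
{
    static void Main()
    {
        string pattern = @"^.*(?=.{10,})(?=.*\d)(?=.*[a-z])(?=.*[A-Z])(?=.*[@#$%^&+=]).*$";
        Console.WriteLine(Regex.IsMatch("Sample@Pass1", pattern));   // prints True
    }
}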

Originally from Anil John's post

Wednesday, February 18, 2009

Debug Distributed Transactions Error in SQL Server

In Windows 2003, security was tightened on the MSDTC service. You will get the following error if you start a distributed transaction against a linked server:

Server: Msg 7391, Level 16, State 1, Line 2
The operation could not be performed because the OLE DB provider 'SQLOLEDB' was unable to begin a distributed transaction.
[OLE/DB provider returned message: New transaction cannot enlist in the specified transaction coordinator. ]
OLE DB error trace [OLE/DB Provider 'SQLOLEDB' ITransactionJoin::JoinTransaction returned 0x8004d00a].

To enable distributed transactions against linked SQL server or in ad-hoc queries, please follow these steps:

1) Enable network DTC access:
Go to Control Panel --> Add or Remove Programs and click Add/Remove Windows Components. In the popup window, highlight Application Server and click the "Details..." button. In the popup window, check the "Enable network DTC access" check box if it is not selected.

You need to reboot your computer for this change to take effect.

2) Set up MSDTC security
Go to Administrative Tools --> Component Services, expand Component Services --> Computers, right-click "My Computer" --> Properties --> MSDTC tab --> "Security Configuration...". In the popup window:
Check Network DTC Access;
Check Allow Inbound and Allow Outbound;
Make sure the DTC Logon Account is NT AUTHORITY\NetworkService


Save the change; MSDTC will restart.

3) If a firewall exists between the involved servers:
DTC uses Remote Procedure Call (RPC) dynamic port allocation. By default, RPC dynamic port allocation randomly selects port numbers above 1024. By modifying the registry, you can control which ports RPC dynamically allocates for incoming communication. You can then configure your firewall to confine incoming external communication to only those ports and port 135 (the RPC Endpoint Mapper port).
Reference MS article http://support.microsoft.com/?id=250367 to configure the ports RPC uses and firewall.

4) If your servers are running in different domains, or if you can run queries against the linked server but get the same error messages when a transaction is used, you need to change the authentication mode of MSDTC security.

In the same UI as in step 2), select the "No Authentication Required" option. This is equivalent to turning off RPC security. You can achieve the same effect by changing the registry key value
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSDTC\TurnOffRpcSecurity to 1. If the key does not exist, you need to add it manually.

The "Incoming Caller Authentication Required" option in the UI only authenticates incoming RPC calls. It can be achieved with the registry key value HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSDTC\FallbackToUnsecureRPCIfNecessary set to 1.

If you manually change the registry, you need to restart the MSDTC service.


Reference MS article: http://support.microsoft.com/?id=899191

BEGIN TRAN
SELECT * FROM LinkedServername.msdb.dbo.sysjobs
COMMIT TRAN
GO

The above query should succeed.

From: http://p-sql.spaces.live.com/?_c11_blogpart_blogpart=blogview&_c=blogpart&partqs=amonth%3d2%26ayear%3d2007



How to change SQL Server parameters in a clustered environment when SQL Server is not online

INTRODUCTION

When you use Microsoft SQL Server 2008 Configuration Manager, SQL Server 2005 Configuration Manager, SQL Server 2000 Enterprise Manager, or SQL Server 2000 Setup to change SQL Server parameters in a clustered environment, you have to make changes on the active node while the SQL Server cluster resource is online. If SQL Server is not online, you have to bring SQL Server online first. However, in some circumstances, you may be unable to bring SQL Server online.

This article describes how to change SQL Server parameters in a clustered environment when SQL Server is not online or when you cannot bring SQL Server online.


MORE INFORMATION

Important This section, method, or task contains steps that tell you how to modify the registry. However, serious problems might occur if you modify the registry incorrectly. Therefore, make sure that you follow these steps carefully. For added protection, back up the registry before you modify it. Then, you can restore the registry if a problem occurs. For more information about how to back up and restore the registry, click the following article number to view the article in the Microsoft Knowledge Base:

322756 (http://kbalertz.com/Feedback.aspx?kbNumber=322756/ ) How to back up and restore the registry in Windows

To change SQL Server parameters in a clustered environment when SQL Server is not online, use one of the following methods.


Method 1

Note We recommend that you try to use this method first.

  1. Click Start, click Run, type regedit, and then click OK.
  2. Locate the quorum disk. To do this, follow these steps:
    a. Locate the following registry key:

HKEY_LOCAL_MACHINE\Cluster\Quorum

    b. The Path entry contains the path of the quorum disk. For example, the Path entry contains the following path:

<QuorumDrive>:\MSCS

  3. Locate the GUID of the SQL Server cluster resource. To do this, follow these steps:
    a. Locate the following registry key:

HKEY_LOCAL_MACHINE\Cluster\Resources

    b. Examine the Name column of the registry entries.

Note Several registry entries include "GUID" in the name of the entry.

    c. For the default instance, locate the SQL Server cluster resource that includes "SQL Server" in the Name column.

      For named instances, locate the SQL Server cluster resources that include "SQL Server (<InstanceName>)" in the Name column.

  4. Locate the checkpoint file name. To do this, follow these steps:
    a. Locate the following registry key:

HKEY_LOCAL_MACHINE\Cluster\Resources\{GUID}\RegSync

    b. In the details pane, view the checkpoint registry hives and the corresponding numbers that resemble the following:

For the default instance
00000004 SOFTWARE\Microsoft\Microsoft SQL Server\MSSQLSERVER

For a named instance
00000004 SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.X\MSSQLSERVER

Note For a named instance, X corresponds to the instance ID.

The number is the checkpoint file name. In this example, the checkpoint file name is 00000004.cpt.

  5. In Registry Editor, click HKEY_LOCAL_MACHINE.
  6. On the File menu, click Load Hive.
  7. In the <QuorumDrive>:\<GUID> folder, locate the checkpoint file that you found in step 4.
  8. In the Key Name box, type 1, and then click OK.
  9. Locate the following registry key to correct the invalid checkpoint registry key value:

HKEY_LOCAL_MACHINE\1\<YourRegistryKey>

Note The following examples correct the MSSQLSERVER checkpoint registry key:

    • Example 1
      To correct the invalid path of the Master.mdf file, follow these steps:
      a. Locate the following registry key:

HKEY_LOCAL_MACHINE\1\Parameters

      b. Correct the SQLArg0 key.

    • Example 2
      To disable the incorrectly enabled VIA protocol, follow these steps:
      a. Locate the following registry key:

HKEY_LOCAL_MACHINE\1\SuperSocketNetLib\Via

      b. Change the value of the Enabled entry from 1 to 0.

  10. After you correct the registry key, click HKEY_LOCAL_MACHINE\1, click the File menu, and then click Unload Hive.

Note After you follow these steps, this checkpoint is fixed and is replicated to the specific node automatically during failover. You can bring the instance of SQL Server online.
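
If you would rather script steps 3 and 4 than hunt through regedit, a rough Python sketch like the one below can list the clustered SQL Server resources, their GUIDs and their checkpoint hives. This is my own helper, not part of the KB article; run it on a cluster node and treat the key paths as assumptions to verify on your system.

import winreg

def list_sql_checkpoints():
    """Print each clustered SQL Server resource, its GUID and its RegSync checkpoints."""
    resources = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, r"Cluster\Resources")
    index = 0
    while True:
        try:
            guid = winreg.EnumKey(resources, index)
        except OSError:
            break  # no more resources
        index += 1
        try:
            name, _ = winreg.QueryValueEx(winreg.OpenKey(resources, guid), "Name")
        except OSError:
            continue
        if "SQL Server" not in name:
            continue
        print("%s  (GUID %s)" % (name, guid))
        try:
            regsync = winreg.OpenKey(resources, guid + r"\RegSync")
        except OSError:
            continue  # this resource has no registry checkpoints
        value_index = 0
        while True:
            try:
                number, checkpoint_key, _ = winreg.EnumValue(regsync, value_index)
            except OSError:
                break
            print("  checkpoint file %s.cpt -> %s" % (number, checkpoint_key))
            value_index += 1

list_sql_checkpoints()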


Method 2

Note Do not perform SQL cluster group failover between step 1 and step 3.

  1. At a command prompt, run one of the following commands to disable the cluster checkpoint for the specific registry subkey:
    • For an instance of SQL Server 2008, run the following command:

cluster . resource "SQL Server (<InstanceName>)" /removecheckpoints:"HKEY_LOCAL_MACHINE\Software\Microsoft\Microsoft SQL Server\MSSQL.x\MSSQLSERVER"

Note In this command, MSSQL.x is a placeholder for the instance ID of the instance of SQL Server. You can determine the corresponding value for the system from the value of the MSSQLSERVER registry entry in the following registry subkey (a small lookup script is sketched after this method's steps):

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\Instance Names\SQL\

    • For an instance of SQL Server 2005, run the following command:

cluster res "SQL Server (<InstanceName>)" /removecheck: "Software\Microsoft\Microsoft SQL Server\ MSSQL.x \MSSQLSERVER"

    • For the default instance of SQL Server 2000, run the following commands:

cluster res "SQL Server" /removecheck: "Software\Microsoft\MSSQLServer\MSSQLSERVER"

cluster res "SQL Server" /removecheck: "Software\Microsoft\MSSQLServer\Cluster"

Note You have to run the second command only when you add or remove one or more IP addresses on which SQL Server 2000 listens.

    • For a named instance of SQL Server 2000, run the following commands:

cluster res "SQL Server (<InstanceName>)" /removecheck: "SOFTWARE\Microsoft\Microsoft SQL Server\Instance_Name\MSSQLSERVER"

cluster res "SQL Server (<InstanceName>)" /removecheck: "SOFTWARE\Microsoft\Microsoft SQL Server\Instance_Name\Cluster"

Note In these commands, Instance_Name is a placeholder for the name of the instance of SQL Server 2000. Additionally, you have to run the second command only when you add or remove one or more IP addresses on which SQL Server 2000 listens.

  2. Change the parameter for the clustered instance of SQL Server on all nodes.
  3. At a command prompt, run one of the following commands to enable the cluster checkpoint for the specific registry subkey:
    • For an instance of SQL Server 2008, run the following command:

cluster . resource "SQL Server (<InstanceName>)" /addcheckpoints:"HKEY_LOCAL_MACHINE\Software\Microsoft\Microsoft SQL Server\MSSQL.x\MSSQLSERVER"

    • For an instance of SQL Server 2005, run the following command:

cluster res "SQL Server (<InstanceName>)" /addcheck: "Software\Microsoft\Microsoft SQL Server\MSSQL.x\MSSQLSERVER"

    • For the default instance of SQL Server 2000, run the following commands:

cluster res "SQL Server" /addcheck: "Software\Microsoft\MSSQLServer\MSSQLSERVER"

cluster res "SQL Server" /addcheck: "Software\Microsoft\MSSQLServer\Cluster"

    • For a named instance of SQL Server 2000, run the following commands:

cluster res "SQL Server (<InstanceName>)" /addcheck: "SOFTWARE\Microsoft\Microsoft SQL Server\Instance_Name\MSSQLSERVER"

cluster res "SQL Server (<InstanceName>)" /addcheck: "SOFTWARE\Microsoft\Microsoft SQL Server\Instance_Name\Cluster"

  Note The resource name "SQL Server (<InstanceName>)" may be different in your case. To confirm the resource name, start Cluster Administrator, click the SQL group, locate the SQL Server resource properties, and then determine the exact name of the resource.
  4. Bring the instance of SQL Server online.
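
The MSSQL.x instance ID mentioned in the notes above can also be read programmatically. This is a small sketch of my own (not part of the KB article) that reads the Instance Names key referenced in step 1; the sample instance name in the commented line is hypothetical.

import winreg

def sql_instance_id(instance_name="MSSQLSERVER"):
    """Return the MSSQL.x style instance ID for a SQL Server instance name."""
    # On 64-bit Windows, use a 64-bit Python (or pass winreg.KEY_WOW64_64KEY)
    # so you read the native view of the key rather than WOW6432Node.
    key = winreg.OpenKey(
        winreg.HKEY_LOCAL_MACHINE,
        r"SOFTWARE\Microsoft\Microsoft SQL Server\Instance Names\SQL")
    value, _ = winreg.QueryValueEx(key, instance_name)
    return value

print(sql_instance_id())            # default instance, e.g. MSSQL.1
# print(sql_instance_id("INST01"))  # for a named instance called INST01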


REFERENCES

For more information about how to configure server startup options in SQL Server Configuration Manager, visit the following Microsoft Developer Network (MSDN) Web site:

http://msdn2.microsoft.com/en-us/library/ms345416.aspx (http://msdn2.microsoft.com/en-us/library/ms345416.aspx)

For more information, click the following article numbers to view the articles in the Microsoft Knowledge Base:

912397 (http://kbalertz.com/Feedback.aspx?kbNumber=912397/ ) The SQL Server service cannot start when you change a startup parameter for a clustered instance of SQL Server 2000 or of SQL Server 2005 to a value that is not valid

244980 (http://kbalertz.com/Feedback.aspx?kbNumber=244980/ ) How to change the network IP addresses of SQL Server failover cluster instances



Monday, February 16, 2009

Things I Wish I Had Known

By Tim Mitchell | 02-11-2009 | Filed under: Career

It's OK to make mistakes. It's also OK to admit it.
There are two types of people in this world: those who make mistakes and fess up, and those who make mistakes and try to cover them up. Everyone - everyone - makes mistakes in their jobs, and in life in general. While there are some high-profile mistakes that are notoriously noteworthy (airline disasters, medical errors), the vast majority of us are permitted, and even expected, to make a few errors here and there. As long as you can learn from your errors and not continue to make the same mistakes, it can be filed under the "Valuable Lessons Learned" heading. Further, those whom you work with and for will have more respect for you if you admit your errors up front, especially when you deliver a plan for resolution.
Technical skills are not enough
When I started out, I had my mind set on simply learning a craft, fine-tuning my skills to become one of the best in the field, and keeping my head down and working hard for the next 40 years. While it is important to grow and learn hard skills, today's economy is not friendly to the stereotypical technical geek working in a dark basement and slinging code (or monitoring logs, or building desktops, or whatever). To be successful, you've got to break out of your technical world every now and again and interface with nontechnical people. Spend half a day shadowing one of your end users to see how the systems you build/support help them to do their jobs. Spend time with senior management and executives to find out what big-picture goals they have, and how you help get the organization there. Have coffee with someone unfamiliar with technology and learn how you can ease them into the digital age. Most importantly, get to know the overall goals of your organization/clients/customers - you'll be far more successful in the long run.
You are responsible for your own career development
Technical careers require constant learning. Training, college, and conferences/trade shows are great ways to learn and network, but many companies can't or won't fork over the cash for these career development events. The bottom line is, it's up to you to take charge of your career. A few thousand dollars for conference fees can be painful, but it could be argued that you'll easily recoup that investment over the course of your career. For those who truly are budget strapped, there are tons of free career resources; user group events, online references (even videos!), libraries, and volunteer opportunities are all cost conscious ways to build your network and skillset.
Don't try to be an expert in everything
There are generalists, and there are specialists; nobody can be an expert in all things technical. Find something that you enjoy doing (that statement alone should be a bullet point), and become an expert in that thing. You don't have to specialize to the point that you're a niche player, but you can limit your scope such that you can be known as an authority in your chosen area.
Don't take things too seriously
I almost didn't type this last one for fear that I would portray myself as having a lackadaisical attitude toward my career; nothing could be further from the truth. We're all human (see #1 above) and are limited by a number of factors, including emotions, limited energy, family commitments, and natural abilities. Don't be a stickler for absolute perfection; accept that some things are part of life and unchangeable. When obstacles block your path, don't freak out or become the voice of negativity; take a breath, smile, and know that, if it was easy, everybody would be doing it and our skills would be far less valuable. Address problems and deadlines with a sense of urgency, but don't let your commitments consume you to the point that you spend all of your time working.



Friday, February 13, 2009

Java MSN Live Messenger

The Java MSN Messenger Library (JML) is quite interesting; I discovered it while looking for a Java-based Messenger client.
I managed to get a sample client, ran it, and it turned out to be quite powerful...
With JML, you can do things like creating your own MSN robot, implementing group chat, etc.

http://jml.blathersource.org/
