Benefits of using SharePoint Server
Collaborate efficiently with others. For example, a calendar on the portal or on a team site can be used to track the events related to the team's work; a document library can be used to store the documents of groups, departments, and the organization as a whole; and specific topics can be discussed using blogs.
Create personal sites: any employee in the organization can create a personal site that lets them share information with others. On this site, the employee can view and manage their documents, tasks, hyperlinks, events calendar, and many other kinds of personal information, all from a single, central location.
Easily find people, search knowledge in all its forms, or search the data of the organization's other business applications, such as the HR system. For example, people with particular expertise or interests can be found through the search engine, which is available from each employee's personal site. Specific data can likewise be found across the organization's various databases and business applications, and you can also search documents, records, images, and audio and video files, etc.
Manage documents, records, and web content. For example, an organization can design a disposition plan that destroys stored documents or records after a scheduled period of time.
Use, design, and program input forms that integrate with databases or other business applications and rely primarily on XML technologies. For example, an organization can design an input form for customers to request maintenance for equipment they purchased, over the Internet or an internal network (intranet or extranet); once submitted, the request can automatically flow through a workflow engine.
Easily publish reports, lists, and key performance indicators (KPIs) by integrating with databases such as MS SQL Server 2005.
Emad Adel Hanna
Sharepoint Developer
it.emadadel@hotmail.com
http://emadadel.wordpress.com
www.facebook.com/groups/Sharepoint.Egypt
- Edited by Emad Adel Sunday, April 8, 2012 12:17 AM
User Profile Synchronization Issues (Event ID 6050)
Hi All-
I am trying to configure User Profiles on SP 2010 and having a really hard time making it work.
Env: User Profiles Service running on an application server (Win 2008 R2, SP2010 RTM) + Central Admin
Steps:
- Configured and started the User Profile Service & User Profile Synchronization Service (followed instructions from Harbar's blog: http://www.harbar.net/articles/sp2010ups.aspx )
- Farm account (domain\farm) is a local Admin and has "Allow log on locally" rights. I am logged on with the farm account while configuring. The AD import account (domain\adimport) has been given read-only access to AD.
Note: We have not assigned "Replicate Directory Changes" permissions to this account because the client does not allow it.
- Up to this point, everything looked good. No errors. I went back to User Profiles Service Application > Start Profile Synchronization > Full crawl > OK
- Status is "Synchronizing". It sits there for a while and then becomes "Idle". No profiles have been imported. The event log is full of event 6050.
Event Viewer
Log Name: Application
Source: FIMSynchronizationService
Date: 8/6/2010 4:43:01 PM
Event ID: 6050
Task Category: Management Agent Run Profile
Level: Error
Keywords: Classic
User: N/A
Computer: server name
Description:
The management agent "MOSSAD-AD SPS Connection" failed on run profile "DS_FULLIMPORT" because of connectivity issues.
Additional Information
Discovery Errors : "0"
Synchronization Errors : "0"
Metaverse Retry Errors : "0"
Export Errors : "0"
Warnings : "0"
User Action
View the management agent run history for details.
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
<System>
<Provider Name="FIMSynchronizationService" />
<EventID Qualifiers="49152">6050</EventID>
<Level>2</Level>
<Task>1</Task>
<Keywords>0x80000000000000</Keywords>
<TimeCreated SystemTime="2010-08-06T20:43:01.000000000Z" />
<EventRecordID>3193</EventRecordID>
<Channel>Application</Channel>
<Computer>Server Name</Computer>
<Security />
</System>
<EventData>
<Data>MOSSAD-AD SPS Connection</Data>
<Data>DS_FULLIMPORT</Data>
<Data>0</Data>
<Data>0</Data>
<Data>0</Data>
<Data>0</Data>
<Data>0</Data>
</EventData>
</Event>
ULS Log: Unexpected UserProfile Synchronization: Encountered unexpected step result: stopped-connectivity. b72350be-53a7-4774-93e0-3da925e49bb0
The Forefront Identity Manager Service & Forefront Identity Manager Synchronization Service are running under the domain\farm account. My gut feeling is that it has to do with Forefront Identity Manager.
I will really appreciate any suggestions you may have. I have already spent about 2 days trying to fix this without success.
FYI - I have tried this too http://blog.jussipalo.com/2010/02/sp2010-fimsynchronizationservice-errors.html
-Sam
- Edited by SamKhan79 (Microsoft employee) Friday, August 6, 2010 9:16 PM (deleting server name)
Reply:
Hi,
Did you try any suggestions from Almero Steyn? You can find some info here: http://social.technet.microsoft.com/Forums/en-US/identitylifecyclemanager/thread/2ea8a1e9-c2a9-4105-964c-239412b6716b
Regards
------------------------------------
Reply:
Hi Sam,
>> Note: We have not assigned "Replicate Directory Changes" permissions to this account because the client does not allow it.
It will never work without it. It's in the documentation; read it.
good luck.
rob
------------------------------------
Reply:
Thanks Rob. I will try giving the "Replicate" permission and will let you guys know if that resolves the problem.
-sam
------------------------------------
Reply:
Unfortunately, I was not able to convince the client to give the account the correct permissions, so I cannot say for sure whether this will fix the problem. However, all symptoms point to the Replicate Directory Changes permission. I can verify the error by checking the log in Synchronization Service Manager:
\Program Files\Microsoft Office Servers\14.0\Synchronization Service\UIShell
Click the line items whose status is "stopped-connectivity".
Under Connection Status, find the failed-search hyperlink and click it. You will find the error below.
failed-Search servername:389 Replication Access was denied Error code:8453
So no user profiles :)
Thank you guys for your help.
------------------------------------
Reply:
Bummer.
You do have profiles & mysites, just no syncing with AD.
Implement My Sites and make it mandatory for everybody who uses the portal to keep their profile data in sync with their AD record. Let's see how long that takes. ;-)
rob
------------------------------------
Reply:
I can confirm that it was due to the "Replicate Directory Changes" permission. The import account MUST have this permission. After assigning it to the import account, I am able to import user profiles.
------------------------------------
Reply:
For profile synchronization to work, the service account used by UPS must have the "Replicate Directory Changes" permission on the domain.
This right allows querying for changes in the directory; it does not allow the account to make any changes. Refer: http://technet.microsoft.com/en-us/library/hh296982.aspx#RDCdomain
So, here are the steps to fix it:
- Open the Active Directory Users and Computers snap-in.
- On the View menu, click Advanced Features.
- Right-click the domain object, such as "company.com", and then click Properties.
- On the Security tab, if the desired user account is not listed, click Add; if it is already listed, skip ahead to selecting the account.
- In the Select Users, Computers, or Groups dialog box, select the desired user account, and then click Add.
- Click OK to return to the Properties dialog box.
- Click the desired user account.
- Click to select the "Replicating Directory Changes" check box from the list.
- Click Apply, and then click OK.
After that, start UPS full import again, and the issue will get fixed!
More info: http://salaudeen.blogspot.com/2011/10/user-profile-sync-not-importing-ad.html
------------------------------------
Can we install SQL Server 2008 and SQL Server 2008 R2 on the same machine?
Can anyone help me with how to do this, if it is possible?
Thank you
- Edited by v60 Saturday, April 7, 2012 6:18 AM
Reply:
Hello,
Yes, you can, but take the following into consideration:
http://msdn.microsoft.com/en-us/library/ee210714.aspx
Hope this helps.
Regards,
Alberto Morillo
SQLCoffee.com
------------------------------------
Reply:
Hi Morillo, I have already installed SQL Server 2008 and now I want to install SQL Server 2008 R2, with both of them working. In this case, does the same guidance from the URL apply, or is it different?
Please clarify.
------------------------------------
Reply:
Same.
Prashant [MSFT] -- This posting is provided "AS IS" with no warranties, and confers no rights.
------------------------------------
Reply:
Hello,
As Prashant mentioned, your scenario is the same as the article.
Hope this helps.
Regards,
Alberto Morillo
SQLCoffee.com
------------------------------------
LIST DEFINITION AND EVENT RECEIVER
While doing an exercise on list definitions and event receivers, I came across the following code in Elements.xml:
<!-- Start Add as part of SPCHOL300 Ex1 -->
<ContentType
ID="0x010089E3E6DB8C9B4B3FBB980447E313CE94"
Name="Bug Item"
Group="Custom Content Types"
Description="Bug item content type."
Version="0">
<FieldRefs>
<FieldRef ID="{fa564e0f-0c70-4ab9-b863-0177e6ddd247}" />
<FieldRef ID="{cb55bba1-81a9-47b6-8e6c-6a7da1d25602}" />
<FieldRef ID="{0248c82f-9136-4b3a-b802-d0b77280b3bc}" />
<FieldRef ID="{aa4a82dd-5b32-4507-9874-4e1c7bca3279}" />
</FieldRefs>
</ContentType>
<!-- End Add as part of SPCHOL300 Ex1 -->
I couldn't work out what the FieldRef IDs are used for. Can anyone help?
Reply:
Each FieldRef in the example above references a column used by the custom content type that this list definition is based on.
These fields must have been defined before being referenced in the content type.
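To make that concrete, here is a sketch (the GUID and field name are invented for illustration, not taken from the SPCHOL300 lab) of how a FieldRef ID lines up with a site column's ID:

```xml
<!-- Hypothetical site column definition, declared elsewhere in Elements.xml -->
<Field ID="{11111111-2222-3333-4444-555555555555}"
       Name="BugSeverity"
       DisplayName="Bug Severity"
       Type="Choice"
       Group="Custom Columns" />

<!-- Inside the ContentType, a FieldRef pulls that column in by the same GUID -->
<FieldRef ID="{11111111-2222-3333-4444-555555555555}" Name="BugSeverity" />
```

The GUIDs in the exercise play the same role: each identifies an existing field (some may be built-in SharePoint fields, others fields defined earlier in the lab) that the Bug Item content type includes.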
Amit
------------------------------------
SharePoint Admin certification_70-668
Hi,
Can anyone share a link to the SharePoint certification (70-668) dump?
Thanks David Kenneth
- Changed type Rock Wang (MSFT) Monday, April 9, 2012 6:31 AM
Reply:
Hello David,
The more suited forum for your query would be http://social.microsoft.com/Forums/en/CertGeneral/threads
You can find the learning guide from http://www.microsoft.com/learning/en/us/exam.aspx?ID=70-668&locale=en-us#tab2 and start browsing the excellent documentation from http://technet.microsoft.com/en-us/library/ee428287.aspx.
-Sangeetha
------------------------------------
Adobe Flex on Windows Azure...A Great Interoperability
Hello ,
Although Microsoft has not yet officially released interoperability support for Adobe Flex on the Windows Azure platform, I have developed a Flex application that is compatible with Azure, and it runs on the local App Fabric as well as live on Azure. So, do you think Microsoft should officially release Flex interoperability on the Azure platform now?
Jayesh Maduskar
Jayesh
Reply:
Hi Jayesh,
After a quick search, it seems that the feature/idea you mentioned has not been submitted at http://www.mygreatwindowsazureidea.com/forums/34192-windows-azure-feature-voting. Please go ahead and submit your idea; Microsoft gathers ideas, feedback, and feature requests on that site.
Thanks for your feedback.
Thanks,
Wengchao Zeng
Please mark the replies as answers if they help or unmark if not.
If you have any feedback about my replies, please contact msdnmg@microsoft.com.
Microsoft One Code Framework
------------------------------------
Reply:
I haven't posted here for a while, but I have worked with Flex and Azure, just accessing blobs (this works). I am still asking myself why there is no answer, because for me AppFabric messaging would be interesting, as well as access to queues.
One reason could be that Flex is a client technology and therefore should not use the full spectrum of Azure, only those parts where you do not publish your secret Azure account key to the SWF.
Another reason is that Flex/Flash has had a hard time over the last 2 years, missing the iPad (now possible through native code without Flash Builder) and facing the HTML5 hype. This made Flex/Flash unsexy. Personally, I see many advantages in Flex over HTML, especially if you have limited resources, but this will probably change as HTML5 and its tooling evolve.
Lastly, Microsoft invested a lot in Silverlight for a few years to create an alternative to Flex/Flash. Even though Microsoft has become much more open in recent years, especially with Azure, I do not think Microsoft wants to actively support competing technologies when it has a product to protect.
If you need to use a cloud technology now, I suggest you check the REST API of Azure: http://msdn.microsoft.com/en-us/library/windowsazure/dd179355.aspx. As far as I know, you are limited there to GET/POST HTTP statements (I never understood why this is the case).
Another way would be to use a server supporting remote invocation (Flash Media Server, Fluorine, and WebORB are the ones I know) and combine it with Azure.
Lastly, you could use Amazon Web Services, which seems to have an AWS SDK library for ActionScript 3.0.
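To illustrate the Azure REST route mentioned above: listing the blobs in a container with public read access is a single unauthenticated GET against the Blob service endpoint. A sketch in Python (account and container names are hypothetical):

```python
import urllib.request

# Hypothetical names: replace with a real storage account / container.
account = "myaccount"
container = "public"    # assumed to allow anonymous (public) read access

# The "List Blobs" operation of the Blob service REST API is a plain GET;
# no Authorization header is needed when the container permits anonymous access.
url = (f"https://{account}.blob.core.windows.net/{container}"
       "?restype=container&comp=list")
request = urllib.request.Request(url, method="GET")

# urllib.request.urlopen(request) would return an XML listing of the blobs.
print(request.get_method(), request.full_url)
```

The same URL shape works from any HTTP-capable client, which is why a Flex SWF can call it directly as long as no account key has to be embedded.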
Cheers,
Marc
Edit:
Check out these links: http://www.adobe.com/devnet/flex/flex_java.html in combination with http://www.windowsazure4j.org/
And http://stackoverflow.com/questions/4848739/how-to-access-azure-blob-storage-via-flex-application for accessing the Azure Storage REST API.
- Edited by Marc Loeb Thursday, March 22, 2012 7:12 PM
------------------------------------
Reply:
Thanks for this wonderful reply. It got me thinking about other ways to set up my client, and I came across the new MVC4 beta, which includes a Web API that relays database access through the REST API inside an HTML5 wrapper, which is exactly what I needed!
I knew there had to be a solution; they're just catching up on their promises. I will be patient, and I think this will work great!
Thanks again !
------------------------------------
Reply:
Glad I could help - good luck with your plans!
------------------------------------
Hi All
HAPPY EASTER, everyone!!
Drew
Drew MS Partner / MS Beta Tester / Pres. Computer Issues www.drewsci.com
DFSReplication simple hourly test using VBScript
I am mostly new to scripting. Using VBScript, the goal is to test DFS Replication hourly and send email if the test fails. A small txt file named with the date and time is created on the reference server. The file is hopefully replicated out to 5 servers in the US and 2 in the UK. A "check for the existence" looks for the file from 1 hour ago on all servers. If the file does not exist on a server, an email is generated identifying the failed server.
A task is scheduled to run the script every hour on the reference server. The test files are created and replicated, but pausing replication on a server does not generate an alert. Feeling a little dumb here. What am I missing?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
' Location of test files
strTestFolder = "\\FxxRomSrv01\DFS_Projects\DfsrTest\"
Dim testArray(7)
testArray(0) = "\\FxxLovSrv01\DFS_Projects\DfsrTest\"
testArray(1) = "\\FxxhvSrv01\DFS_Projects\DfsrTest"
testArray(2) = "\\FxxaApolSrv01\DFS_Projects\DfsrTest\"
testArray(3) = "\\FxxaDonSrv02\DFS_Projects\DfsrTest\"
testArray(4) = "\\FxxLynSrv01\DFS_Projects\DfsrTest\"
testArray(5) = "\\FxxMinSrv01\DFS_Projects\DfsrTest\"
testArray(6) = "\\BxxStlSrv01\DFS_Projects\DfsrTest\"
' Do some housecleaning, delete all files older than 7 days from test folder
strDaysToKeep = 7
Set fso = CreateObject("Scripting.FileSystemObject")
Set f = fso.GetFolder(strTestFolder)
Set fc = f.Files
For Each f1 in fc
If DateDiff("d", f1.DateCreated, Date) > strDaysToKeep Then
fso.DeleteFile(f1)
End If
Next
' Create a new test file with the current date & time as a name
MyDate = Now()
strHour = Hour(MyDate)
If Len(strHour) = 1 Then strHour = "0" & Hour(MyDate)
strDay = Day(MyDate)
If Len(strDay) = 1 Then strDay = "0" & Day(MyDate)
strMonth = Month(MyDate)
If Len(strMonth) = 1 Then strMonth = "0" & Month(MyDate)
strYear = Year(MyDate)
strFileName = strYear & strMonth & strDay & strHour & ".txt"
Set txtFile = fso.CreateTextFile(strTestFolder & strFileName, True)
txtFile.WriteLine(Date)
txtFile.WriteLine(Time)
txtFile.Close
' Check for the existence of the file from 1 hour ago
MyDate = DateAdd("h", -1, Now())
strHour = Hour(MyDate)
If Len(strHour) = 1 Then strHour = "0" & Hour(MyDate)
strDay = Day(MyDate)
If Len(strDay) = 1 Then strDay = "0" & Day(MyDate)
strMonth = Month(MyDate)
If Len(strMonth) = 1 Then strMonth = "0" & Month(MyDate)
strYear = Year(MyDate)
strFileName = strYear & strMonth & strDay & strHour & ".txt"
strReport = ""
boolFailed = False
For Each strServerPath in testArray
If fso.FileExists(strTestFolder & strFileName) Then
strReport = strReport & strServerPath & " - Replication OK<BR>" & vbCrLf
Else
strReport = strReport & strServerPath & " - Replication FAILED<BR>" & vbCrLf
boolFailed = True
End If
Next
' Send e-mail with report if there are any failures
If boolFailed Then
strReport = "<B>Replication has failed for at least one server in the DFS Projects replication group.</B><BR><BR>Details are included below. All failures are reported from the perspective of FxxRomSrv01, so if all servers show failed, it may be due to a failure in Romeoville.<BR><BR><HR><BR>" & strReport
Set objMessage = CreateObject("CDO.Message")
objMessage.Subject = "DFS Replication Failure Alert"
objMessage.From = "Administrator@FxxRomSrv01.org"
objMessage.To = "ITHelp@Fxxa.com"
objMessage.HTMLBody = strReport
objMessage.Configuration.Fields.Item _
("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2
objMessage.Configuration.Fields.Item _
("http://schemas.microsoft.com/cdo/configuration/smtpserver") = "SMTPrelay.xxxx.org"
objMessage.Configuration.Fields.Item _
("http://schemas.microsoft.com/cdo/configuration/smtpserverport") = 25
objMessage.Configuration.Fields.Update
objMessage.Send
End If
Wscript.Quit
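Incidentally, the manual zero-padding above amounts to a fixed-width date format; for comparison, a quick sketch in Python (not part of the original script) builds the same hourly filename with a single format string:

```python
from datetime import datetime, timedelta

def hourly_filename(when):
    """Return the yyyymmddhh.txt name the VBScript builds by hand."""
    return when.strftime("%Y%m%d%H") + ".txt"

created = hourly_filename(datetime(2010, 8, 6, 4))                       # this hour's file
checked = hourly_filename(datetime(2010, 8, 6, 4) - timedelta(hours=1))  # file from 1 hour ago
print(created, checked)  # 2010080604.txt 2010080603.txt
```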
Reply:
Wouldn't it be easier to just define an event in the event log that sends an email if a replication failure occurs? Windows knows when replication has failed.
When you say it does not generate an alert, what does that mean? Are you saying you do not get an email message?
Some recommendations: don't keep deleting the file; just write to it. On the receiving end, monitor the received file and check its LastWriteTime. No need to open the file.
You should have one process that 'touches' the file and a separate process that looks for the updated files. Stopping the 'touch' process would test the detection script. Until this is working, the detection script should log to a file each time it runs. This would let you know if the script is actually doing what you want. You should also detect errors and log them.
For testing, you could cycle the files every 5 minutes to save time, then increase it to 60 minutes once you have everything working.
I would just monitor the eventlog.
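That "touch one file, then check its age" idea can be sketched as follows (the replica paths are placeholders, and the one-hour window is an assumption to match the original schedule):

```python
import os
import time

# Placeholder replica paths; the real UNC paths would go here.
REPLICAS = [
    r"\\server1\DFS_Projects\DfsrTest\testfile.txt",
    r"\\server2\DFS_Projects\DfsrTest\testfile.txt",
]

def stale_copies(paths, max_age_seconds=3600, now=None):
    """Return paths whose test file is missing or not touched within the window."""
    now = time.time() if now is None else now
    failed = []
    for path in paths:
        try:
            age = now - os.path.getmtime(path)
        except OSError:              # missing file or unreachable share
            failed.append(path)
            continue
        if age > max_age_seconds:    # replication never delivered a fresh copy
            failed.append(path)
    return failed

# Anything returned by stale_copies(REPLICAS) would go into the alert email.
```

Because the check reads only the file's modification time, it never has to open or delete anything on the replicas.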
¯\_(ツ)_/¯
- Edited by jrv Saturday, April 7, 2012 3:22 AM
------------------------------------
Reply:
I just noticed that your file check is for every 7 days, not once an hour.
¯\_(ツ)_/¯
------------------------------------
Reply:
It may have been easier to approach this from the event log, but I got this harebrained idea started and it's just bugging me. Yes, the email is "not" sent on failure. But the email "is" sent if the test folder is empty: the alert identifying the reference server as failing. So that part works.
------------------------------------
Reply:
Here is the script to touch the files periodically. Its timing is set by the scheduler. The script writes its errors to the event log.
' Location of test files
testfiles = Array("\\FxxLovSrv01\DFS_Projects\DfsrTest\", _
    "\\FxxhvSrv01\DFS_Projects\DfsrTest", _
    "\\FxxaApolSrv01\DFS_Projects\DfsrTest\", _
    "\\FxxaDonSrv02\DFS_Projects\DfsrTest\", _
    "\\FxxLynSrv01\DFS_Projects\DfsrTest\", _
    "\\FxxMinSrv01\DFS_Projects\DfsrTest\", _
    "\\BxxStlSrv01\DFS_Projects\DfsrTest\")
' Create or update a test file
Set fso = CreateObject("Scripting.FileSystemObject")
For Each filename In testfiles
    On Error Resume Next
    Set file = fso.OpenTextFile(filename, 2, True) ' open for write and create if it doesn't exist.
    If Err.Number <> 0 Then
        Set sh = CreateObject("WScript.Shell")
        sh.LogEvent 1, "error opening file:" & filename & " Error:" & Err.Description
        WScript.Quit 1
    End If
    file.WriteLine Now
    file.Close
Next
¯\_(ツ)_/¯
------------------------------------
Reply:
i think i like it. i will set it up and report back with results. (but my failure still just bugs me)
thx
------------------------------------
Reply:
' Location of test files
testfiles = Array("\\FxxLovSrv01\DFS_Projects\DfsrTest\", _
    "\\FxxhvSrv01\DFS_Projects\DfsrTest", _
    "\\FxxaApolSrv01\DFS_Projects\DfsrTest\", _
    "\\FxxaDonSrv02\DFS_Projects\DfsrTest\", _
    "\\FxxLynSrv01\DFS_Projects\DfsrTest\", _
    "\\FxxMinSrv01\DFS_Projects\DfsrTest\", _
    "\\BxxStlSrv01\DFS_Projects\DfsrTest\")
Set fso = CreateObject("Scripting.FileSystemObject")
Set logfile = fso.OpenTextFile("c:\rep.log", 8, True)
messageBody = ""
For Each filename in testfiles
    logfile.Write "Processing:" & filename
    Set file = fso.GetFile(filename)
    If Not (file.DateLastModified > Now - 1) Then
        messageBody = messageBody & "<b>[" & Now & "]File not updated:</b>" & filename & vbCrLf
    End If
Next
' Send e-mail with report if there are any failures
If messageBody <> "" Then
    logfile.Write "Sending mail:" & messageBody
    Set objMessage = CreateObject("CDO.Message")
    objMessage.Subject = "DFS Replication Failure Alert"
    objMessage.From = "Administrator@FxxRomSrv01.org"
    objMessage.To = "ITHelp@Fxxa.com"
    objMessage.HTMLBody = messageBody
    objMessage.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2
    objMessage.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/smtpserver") = "SMTPrelay.xxxx.org"
    objMessage.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/smtpserverport") = 25
    objMessage.Configuration.Fields.Update
    On Error Resume Next
    objMessage.Send
    If Err.Number <> 0 Then
        logfile.Write "Send mail Error:" & Err.Description
    End If
    On Error GoTo 0
End If
logfile.Write "Processing complete"
logfile.Close
This is approximately what you have to do to check the files.
¯\_(ツ)_/¯
------------------------------------
Reply:
Your logic was way too complicated for the task.
The time test is just a dummy. You need to calculate the difference by hour:
DateDiff("h", filetime, Now) > 1
Run one on the hour and the second 10 minutes after the hour, checking the file's LastWriteTime.
This way you do not need to delete files, and you do not need to name them. Just create the files, use them, and check the timestamp.
¯\_(ツ)_/¯
------------------------------------
Reply:
The line for creating the test file should be:
Set file = fso.OpenTextFile(filename & "testfile.txt",2,True)
The line for testing the file date should be:
Set file = fso.GetFile(filename & "testfile.txt")
If DateDiff("h", file.DateLastModified, Now) < 1 Then ' it was changed within the last hour
¯\_(ツ)_/¯
------------------------------------
Reply:
Sorry, but I missed it. The first script should only write to the hub server. The second script should only read from the remote servers.
Here is a modified script
strTestFile = "\\FxxLovSrv01\DFS_Projects\DfsrTest\testfile.txt"
' Create or update a test file
Set fso = CreateObject("Scripting.FileSystemObject")
On Error Resume Next
Set file = fso.OpenTextFile(strTestFile, 2, True) ' open for write and create if it doesn't exist.
If Err.Number <> 0 Then
    Set sh = CreateObject("WScript.Shell")
    sh.LogEvent 1, "error opening file:" & strTestFile & " Error:" & Err.Description
    WScript.Quit 1
End If
file.WriteLine Now
file.Close
¯\_(ツ)_/¯
- Edited by jrv Saturday, April 7, 2012 4:18 AM
------------------------------------
Reply:
error opening file:\\FxxLovSrv01\DFS_Projects\DfsrTest\ Error:Bad file name or number
error also returns same for first server in array if changed
------------------------------------
Reply:
event log error =error opening file:\\FxxLovSrv01\DFS_Projects\DfsrTest\ Error:Bad file name or number
error also returns same for first server in array if changed
So now you know that error logging works. Now read the rest of my posts to see that I already sent you the answer to that.
¯\_(ツ)_/¯
------------------------------------
Outgoing mail limit
Hi,
I have a customer who wants to send outbound mail for 3,000 users across different domains, and those mails should not be blocked by FOPE. Is that possible? Please help me out.
Delshi
Reply:
Hi
Thanks for your reply!
I would like to know the procedure to prevent the mails from getting blocked. I would also like to know the limit for sending outgoing mail for a standalone domain, along with an article, so that I can help my customer more efficiently.
Delshi
------------------------------------
Reply:
First of all, 3,000 emails is not that much for FOPE; we send that many in an hour. Second, are you saying 3,000 emails to 3,000 different domains, or 3,000 emails to one domain? If 3,000 to the same domain, then FOPE is not your concern but rather the receiving domain. I know that Yahoo will throttle you if you attempt to shoot 3,000 messages at it.
- Edited by G-r-e-g Wednesday, April 4, 2012 10:14 PM
------------------------------------
Reply:
Hello
It depends on how you are trying to send the email. Various factors are involved, and there are predefined thresholds set in FOPE. We need to consider the FOPE subscription, compare the thresholds, and meet the requirement.
As you try to achieve this, there is a high possibility that the sender's email address might end up on Microsoft's Banned Sender list or come under the internal IP blocklist tracking used by FOPE.
Please get in touch with the FOPE Technical Support Team for further updates.
- Edited by Resolver1988 Thursday, July 19, 2012 12:57 AM
------------------------------------
Procedure for moving mailbox database/telling Outlook which CAS Server to use -
I want to test database switchover/failover in my environment. My environment is fairly simple. I will be shipping one of my servers out to another site, but I want to confirm users have no downtime while this server is down. Both my servers are Hyper-V hosts with several virtual machines. The VMs are: a CAS server; a file server with AD, DHCP, and DNS; a mailbox server; and an RD Gateway server.
Reading this post:
and my own post:
http://social.technet.microsoft.com/Forums/en-US/exchange2010/thread/cb970af0-ac2b-4a82-b74f-6504223fa27d
My understanding / the steps are:
- confirm the queue copy length is 0 for databases before move
- initiate a switchover, either by EMC or EMS. Move my databases (already in a DAG) to my other server
- watch the switchover/databases and verify they are moved by checking 'MailboxDatabaseCopyStatus', and 'get-MailboxDatabase'
- after the databases have been switched over, change/update the RPCClientAccess value using 'Set-MailboxDatabase' to the other server
- down network interfaces
- flush dns*
- test logging on/connectivity
After doing these steps, users from the CAS server I am switching away from will be connecting to the right CAS until I bring it back online and switch the databases back. Is this correct? I put flush DNS in there because it seemed logical to me. Do I need to do this, or will Autodiscover pick right up and make changes as necessary? I'm looking for any and all advice from people who have done this before.
Thanks in advance :)
Reply:
You mentioned that you are shipping your server to another site - do you mean physical site or different site in your AD
------------------------------------
Reply:
Sukh828:
I've been able to point this CAS server successfully, but I feel like I've missed a step or am having an error that I'm unsure how to resolve. When I run Get-OABVirtualDirectory on my CAS server that handles everything now, I get an error: 'An IIS directory entry couldn't be created. The error message is Access is denied.'
How do I go about fixing this? All the other solutions I've seen result in making sure the Exchange Trusted Subsystem is a member of the Administrators group; which it already is. Any suggestions?
------------------------------------
Reply:
Sukh:
Quick question. I've got my server at my other site up and running. My databases are still mounted at my East Coast site. Am I able to just do a switchover and have the databases mount at my other site, or will that cause split-brain? I assume split-brain only occurs when your WAN link goes down and you have active databases on both sides. In my case, only one server was serving clients; the other server, before I downed it, didn't have any databases mounted but held my witness server. Is it as simple as a switchover again, or do I have to reseed my databases?
------------------------------------
Reply:
Yes, you can do a switchover. Split-brain is what you described.
Just to make sure, enable DAC mode if you haven't done so already.
Sukh
------------------------------------
Server 2008 R2: MUP event 139
Hello,
I have a problem that began very recently. We have a new Dell PowerEdge R515 server that is a domain controller. The backup fires off every morning at 5:00 AM (and the server has the network to itself at this time of the morning). The server is running Symantec BackupExec 2010 R3, with all the latest Symantec updates installed, and it is also running Symantec Endpoint protection 11.06A (in default configuration).
The server has dual Broadcom 5709E NetXtreme II NICs installed (with the TOE and iSCSI offload option, although we are not using iSCSI at all on this particular server), with the latest firmware and drivers from Dell. It is plugged into a Cisco Catalyst 2960 switch, with an MTU of 9260 set. The Catalyst 2960 communicates with an identical unit in the next building via LC fiber, and with the NAS device (an iOMega PX12-350r, running firmware version 3.2.3.9273). Most of the backup goes smoothly, until the backup hits the Active Directory backup. At that point, we get a "Delayed write failed" popup on the server console, and a MUP event 139 in the system event log:
{Delayed Write Failed} Windows was unable to save all the data for the file \\px12-sec\backups\DOM1-2\IMG000008\pdi.txt; the data has been lost. This error may be caused by network connectivity issues. Please try to save this file elsewhere.
The MTU on the NAS device is set to 9000, as is the MTU on the network interface on the Dell (via the Broadcom Advanced Control Suite). Once the delayed write fails, the server loses network connectivity (via SMB) to the NAS, and it cannot be restored, despite the fact that the device can be pinged by FQDN, IP address, or NetBIOS name. A reconnect from the command line using Net Share gets either an error 53 or an error 64. All options on the Broadcom NIC are factory default, except the MTU size. A ping -l 9000 <nas name> reveals no errors related to jumbo frames. The same backup succeeds to a different NAS device (iOMega ix4-200d).
- Moved by Tiger Li (Microsoft employee) Thursday, March 22, 2012 7:55 AM (From: Platform Networking)
- Changed type Tiger Li (Microsoft employee) Monday, March 26, 2012 8:56 AM
Reply:
Oops, scratch "Net Share" above and replace with "Net Use" :)
------------------------------------
Reply:
Hi,
Thanks for posting here.
Could you first disable the antivirus on this DC and try again?
However, if the backup succeeds to another destination NAS device with the same settings, I'd suspect the first device (the Iomega PX12-350r) and would double-check its functionality first.
Otherwise, I think we may need to trace this using some utilities, like Microsoft Network Monitor:
Delayed Write Failure Trace Study
http://blogs.technet.com/b/netmon/archive/2009/09/21/delayed-write-failure-trace-study.aspx
Thanks.
Tiger Li
TechNet Community Support
------------------------------------
Reply:
Disabling antivirus has no effect.
------------------------------------
Documentation for data mining stored procedures available on TechNet Wiki site for SSAS
I've uploaded a technical reference for the data mining system stored procedures to the SSAS community site on TechNet.
This information comes a bit late, I know, but many people had commented that these stored procedures, while enormously helpful, are not documented (by design) and thus information can be hard to find.
So in this guide I've documented the parameters to the best of my knowledge and collected some links to the many helpful samples of how to use them that were published by the data mining development team over the years.
The guide is not complete and is an "unofficial" community article -- so please add links, examples, and above all make corrections!
Happy data mining.
SQL Server UE, Data Mining
CM1312 Driver CD
Reply:
Is this the driver you downloaded? Make sure you turn off your anti-virus and have the printer unplugged when you start the driver installation.
------------------------------------
Reply:
No, the download is:
www2.hp.com/bizsupport/TechSupport/SoftwareIndex.jsp?lang=en&cc=us&prodNameId=3562006&prodTypeId=18972&prodSeriesId=3558902&swLang=8&taskId=135&swEnvOID=228
I will try your link. Thanks.
------------------------------------
Reply:
It really baffles me that people will come to a Microsoft website when looking for third party drivers, such as HP.
Had you either done an internet search for CM1312 or gone to the HP Support web site and checked, you would have been taken to a page that lists the two different models of the CM1312: http://h20000.www2.hp.com/bizsupport/TechSupport/ProductList.jsp?lang=en&cc=us&taskId=135&prodTypeId=18972&prodSeriesId=3558902
This page lists all the current drivers available for the CM1312: http://h20000.www2.hp.com/bizsupport/TechSupport/DriverDownload.jsp?prodNameId=3558903&lang=en&cc=us&taskId=135&prodTypeId=18972&prodSeriesId=3558902
Microsoft neither writes nor publishes the drivers for any third party hardware, so why would anyone expect to find third party drivers on a Microsoft website? The manufacturer of the hardware should be the first place to look for drivers.
Please remember to click "Mark as Answer" on the post that helps you, and to click "Unmark as Answer" if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread.
------------------------------------
Standalone Hyper-V Host?
I have read that DPM 2012 offers faster backups of VMs on a standalone Hyper-V host than DPM 2010 does.
What is meant by "standalone" Hyper-V host?
If we have a Windows 2008 R2 Datacenter server (full install, not Core) hosting several VMs on local storage (with no redundancy or resilience), is this a standalone Hyper-V host?
Bruce.
Reply:
Hi,
Correct, we mean it is not a clustered Hyper-V server. You should see much faster guest backups because we now track .vhd file changes at the block level and can bring those changed blocks over without having to do a block-by-block compare like we used to do, or still have to do on Windows 2008 clustered Hyper-V servers.
Regards, Mike J. [MSFT] This posting is provided "AS IS" with no warranties, and confers no rights.
------------------------------------
Reply:
I think I understand what you're saying...
At the moment every block in the latest backup of the VHD on DPM is compared with the VHD on the protected Hyper-V server, and any blocks that have changed are transferred to DPM to form a new snapshot.
But with DPM2012 either the Hyper-V server or DPM will have tracked what blocks have changed since the last backup and will only transfer those blocks?
In the former scenario, would I be correct in assuming that DPM checks for changed blocks by getting the protected server to create a hash of each block, then comparing this with a hash of the same block on the DPM server to determine whether it had changed?
I.e. this avoids the requirement for DPM 2010 to transfer each block from the protected server to DPM for comparison (thus creating a huge load on the network, which I haven't observed)?
This tallies with my observation of VM backups, where DPM reports that with a VM Express Full backup only a few MB/GB is transferred to DPM (maybe around 5% of the size of the VHD), but it seems to take considerably longer to do this than it would take if it were simply transferring that amount of data?
For my backups the VMs are backed up each night and the time taken is acceptable (finishes by 3am). But with the more efficient way DPM 2012 backs up VMs, might there be potential to do Express Full backups/snapshots during the day as well?
Regards,
Bruce.
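The two transfer schemes being discussed can be sketched roughly like this. This is a simplified illustration only, not DPM's actual implementation: the block size, the choice of SHA-256, and the dirty-block bitmap are all assumptions made for the example.

```python
import hashlib

BLOCK_SIZE = 4  # tiny blocks for illustration; real products use far larger ones

def split_blocks(data):
    """Split a byte string into fixed-size blocks."""
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]

def sync_by_hash(source, replica):
    """DPM 2010 style: hash every block on both sides and compare.
    Hashes for ALL blocks cross the wire even when nothing changed,
    which is why two back-to-back backups still transfer data."""
    src, dst = split_blocks(source), split_blocks(replica)
    return [i for i, (a, b) in enumerate(zip(src, dst))
            if hashlib.sha256(a).digest() != hashlib.sha256(b).digest()]

def sync_by_tracking(source, dirty_bitmap):
    """DPM 2012 style: the host already tracked which blocks were written,
    so only those blocks are read and sent; no per-block comparison."""
    return sorted(dirty_bitmap)

old = b"AAAABBBBCCCCDDDD"
new = b"AAAAXXXXCCCCDDDD"  # only the second block changed
print(sync_by_hash(new, old))      # hash comparison discovers block 1
print(sync_by_tracking(new, {1}))  # change tracking already knew it was block 1
```

Both approaches end up transferring the same changed blocks; the difference is that the hash scheme pays a fixed per-block comparison cost on every backup, which matches the observation above that a small Express Full still takes longer than the transferred volume alone would suggest.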
------------------------------------
Reply:
Hi Bruce,
You are correct on all counts, except the hash blocks sent between the PS and DPM are not that great. In fact, if you place a VM into a saved state and then perform two back-to-back backups, the amount of data transferred will be identical and is strictly the hash blocks, since obviously no changes were made to the saved .vhds.
If you have secondary protection set up on DPM 2010, for Hyper-V workloads we do track block-level changes on the primary DPM server, so you can see the huge benefit: backups are much, much quicker on a secondary DPM server versus the primary DPM server.
Regards, Mike J. [MSFT] This posting is provided "AS IS" with no warranties, and confers no rights.
------------------------------------
Looking to change a user's last name
I was looking for some feedback on the best way to change a user's last name and reflect changes everywhere:
1. Rename the Last Name and Display Name in Active Directory.
2. Add the new email address to his account
3. Add the changes in GAL and exchange.
Please suggest how I can build a runbook to automate this.
Thanks
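One way to start is to separate the "what changes" logic from the activities that apply it. The sketch below builds the set of directory attribute changes for steps 1 and 2; a runbook activity (or an LDAP/PowerShell step) would then apply them. The attribute names (sn, displayName, proxyAddresses) are the standard AD/Exchange ones, but the address format, the sample domain, and the helper itself are assumptions for illustration, not part of any product.

```python
def build_name_change(given_name, old_last, new_last, domain="contoso.com"):
    """Return the directory attribute changes for a last-name change.

    Hypothetical helper: the firstname.lastname@domain address format
    is an assumed convention, not something the runbook mandates.
    """
    new_smtp = f"{given_name.lower()}.{new_last.lower()}@{domain}"
    old_smtp = f"{given_name.lower()}.{old_last.lower()}@{domain}"
    return {
        "sn": new_last,
        "displayName": f"{given_name} {new_last}",
        # Keep the old address as a secondary so mail sent to it still
        # arrives; the capital "SMTP:" prefix marks the primary address
        # (the usual Exchange proxyAddresses convention).
        "proxyAddresses": [f"SMTP:{new_smtp}", f"smtp:{old_smtp}"],
    }

changes = build_name_change("Jane", "Smith", "Jones")
print(changes["displayName"])  # Jane Jones
```

The GAL (step 3) generally picks up the new displayName and addresses once the directory change replicates, so the runbook mainly needs the AD write plus a wait/verify step.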
------------------------------------
Connecting to a cluster virtual instance using a DNS alias
Hi all,
I've a SQL Server 2005 cluster with 2 instances (active/active cluster); each instance has its own virtual IP address and its own virtual name. I want to create two DNS aliases for my instances' virtual names. Can I use those DNS aliases in my applications' connection strings?
I have to move to a new cluster, so my plan is to use the DNS aliases. When the new cluster is ready and the databases are synchronized between the old one and the new one:
1. stop SQL Server on the old cluster
2. stop SQL Server on the new cluster
3. change the DNS alias
4. start SQL Server on the new cluster
correct?
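The idea behind the cutover is that applications only ever reference the alias, so repointing the CNAME moves everyone without touching any config files. A minimal sketch (the alias name is hypothetical, and the connection-string format is the common SQL Server one):

```python
import socket

ALIAS = "sql-prod.corp.example.com"  # hypothetical CNAME for the virtual name

def connection_string(alias, database):
    # Apps keep this string unchanged across the migration, because it
    # names the alias rather than either cluster's virtual name or IP.
    return f"Server={alias};Database={database};Integrated Security=SSPI;"

def current_target(alias):
    # Resolving the alias shows which cluster it currently points at;
    # this changes from the old cluster's IP to the new one's at step 3.
    return socket.gethostbyname(alias)

print(connection_string(ALIAS, "Sales"))
```

Two caveats worth checking in advance: clients may cache DNS answers, so a short TTL on the alias record before the cutover shortens the switchover window; and when connecting by an alias, Kerberos authentication can fall back to NTLM unless an SPN exists for the alias name as well.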
Reply:
Yes, I have used DNS aliases in the past without issues. You can also have multiple alias names pointing to the same SQL instance, so different apps/databases can use separate aliases.
------------------------------------
SQL MemoryClerk_SQLBufferPool
Hi everyone,
I am looking at the server standard reports Memory Allocation and I saw something which I am not sure about.
For MemoryClerk_SQLBufferPool I have the following columns values:
- Allocated MemoryClerk_SQLBufferPool is 400 KB
- VirtualMemoryReserved = 8 516 576 KB
- VirtualMemoryCommitted = 3 929 472 KB
In general all values of MemoryClerk_* are higher than the ones in the Allocated column.
Could you please advise what the function of the memory clerk is, and what the possible reasons are for this difference in the column values?
Thanks in advance
Kal
------------------------------------
How to best configure a Poweredge C6100
I'll start this off by saying that I'm a developer who got roped into IT, so consider me a newb. We bought a refurbed PowerEdge C6100, which basically gives me 3 drive bays. I'd like to install W2k8 R2 Server as the host, and then run two Hyper-V VMs, one each for SQL 2008 and 2012.
We currently have three 600 GB 15k SAS drives, and I'm having trouble figuring out the best way to configure them.
We'd like to have as much disk space as possible for the VMs, but we also need to leave enough space to back them up on the physical server before shipping them out.
I currently have them RAIDed as 1E, which is giving me ~900GB of space, but I'm afraid if I have to leave enough space to fully snapshot one of the VMs on the host, I'm not going to be left with very much space at all, especially at the price we paid for the 3 drives.
It seems most recommendations are to RAID1 the OS, and then RAID5 the data drives. Unfortunately with only 3 drives, this is not possible.
Any help would be greatly appreciated, I just don't really know what to do.
Thanks!
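The capacity trade-offs for the three layouts in play here (the current RAID 1E, plus the two options a 3-drive bay allows) work out with quick arithmetic, ignoring formatting overhead:

```python
def usable_gb(raid_level, drives, size_gb):
    """Approximate usable capacity for a few simple RAID layouts."""
    if raid_level == "RAID1E":        # mirrored striping: half of raw capacity
        return drives * size_gb / 2
    if raid_level == "RAID5":         # one drive's worth of capacity lost to parity
        return (drives - 1) * size_gb
    if raid_level == "RAID1+spare":   # two-disk mirror plus a hot spare
        return size_gb
    raise ValueError(f"unknown level: {raid_level}")

for level in ("RAID1E", "RAID5", "RAID1+spare"):
    print(level, usable_gb(level, 3, 600))
```

With three 600 GB drives this gives 900 GB for RAID 1E (matching the ~900 GB observed above), 1200 GB for RAID 5, and only 600 GB for a mirror with a hot spare, which is why RAID 5 is the space-maximizing choice if the controller (or software RAID) supports it.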
Reply:
ShawnTracy,
You have a limited config for the RAID setup, and I only see two different options: RAID-1 with a hot spare, or RAID-5. Is this going to be a production server, or just something used in a test/dev scenario? Since you only have the 3 drives and a single RAID controller, splitting the OS and data drives would not really give you much benefit. Give me a little more info about the amount of data these SQL servers are going to have and how much space you want to give the VMs. You can use dynamic disks for the VM VHDs, and those only use as much physical disk as the actual data inside the VM is using, so the unused space does not count against you.
------------------------------------
Reply:
Thanks for the quick response!
This is a development box, but it's a very important development box as it will house all of our work in progress (we are a software development agency). As we work on many projects, our SQL space tends to fill up quickly. We basically want to be able to set it and forget it for a couple of years, as we're currently working with a 100GB VM and almost weekly have to do disk space cleanup on it to keep it running. I would expect that we'd wind up running over 300GB per VM (the 2008 and the 2012) within 12 months or so, if not quicker. My hope is to not have to worry about anything on this server through 2014.
The server does not have a RAID5 hardware controller, and cursory research has us priced out of that game. I tried to set up RAID5 through Windows, and was unsuccessful for some reason - it wouldn't let me convert the main OS disk to Dynamic, I'm assuming because it had the little 100MB recovery partition it automatically sets up.
Thanks again for any advice.
------------------------------------
The Word is out Game Over for Technological Industry
Reply:
Hello,
What do you mean by that? Can you please elaborate more.
------------------------------------
SCCM OS deployment without adding a computer in the Computer Association node
Hi,
I want to know: is there any option to deploy an OS through SCCM without adding a computer in the Computer Association node?
Example: If I have 100 brand-new computers and I want to deploy an OS on them using SCCM, do I need to add all the computers (GUID, MAC address, etc.) in the Computer Association node, or can I deploy the OS without this step?
Thanks in advance!
Regards Rushikesh..
Reply:
You can still deploy them. Use R3 to add unknown computer support, then advertise a "Deploy OS" task sequence to the Unknown Computers collection; you can simply PXE boot those computers and they will see that task sequence and can be imaged.
cheers
niall
Step by Step ConfigMgr 2007 Guides | Step by Step ConfigMgr 2012 Guides | I'm on Twitter > ncbrady
------------------------------------
Reply:
Thanks for the reply Niall.
I will install R3 and check; I will post my update here.
Regards Rushikesh..
------------------------------------
Reply:
Hi,
I have done the above step, but it's not working.
The errors are as follows:
On the client system: "No response from Windows Deployment Services"
In smspxe.log on the server: "Device not found in the database"
Is there any additional step I need to follow to get this working?
Regards Rushikesh..
------------------------------------
Reply:
Hi,
My problem has been resolved.
I have only one DP in my network.
1) I removed everything from the Operating System Deployment node
2) Removed the PXE service point
3) Uninstalled WDS and restarted the server
4) Installed WDS and the PXE service point
5) Added everything back in the Operating System Deployment node
That resolved my problem.
Thanks, all of you, for your replies!
Regards Rushikesh..
------------------------------------