Wednesday, March 2, 2022

Accessibility problems on the MSDN gallery

Hello all,

I figured I would report this now so that I don't forget.  I have noticed some accessibility problems on the gallery that I believe should be addressed immediately. I went to search for some contributions demonstrating ADO.NET, to be exact, and on one I found, I hoped to give the author a 5-star rating.  Well, I'd like to tell you that the rating section is not labeled, so when a user using a screen reader finds it, he/she cannot read the stars to ensure that the right one was clicked, since all the screen reader will say is "blank." This is not good for blind users who want to rate authors.  Right now, a user who cannot see the screen and who has to rely on output from a screen reader can only go on chance, the way the rating system over there is set up.  I'm wondering if you could please label each star in the rating system appropriately, so that a user with a screen reader knows which star they are clicking, no misrating occurs, and they can give the rating they intended.  Thanks for listening to this.

Gmail IMAP in Outlook 2010 delete issue

I am trying to set up my Gmail account in Outlook 2010 using IMAP.  In previous versions of Outlook, when I hit "delete" in Outlook, the message left the “Inbox,” but was still in the “All Mail” folder (I believe it removed the label "Inbox" from the message).  However, in Outlook 2010, the message is moved to “Deleted Items” and is no longer in “Inbox” or “All Mail.”  So there is no longer a copy archived to "All Mail."  I have read the other posts concerning this issue, none of which work.  If there is a solution, please help.

 

Jeremy




Operating system (e.g. WinXP): Win 7 Ultimate

Program and version you use to access Gmail (e.g. Internet Explorer 9 or Outlook 2003):  Outlook 2010

Your antivirus software (e.g. Norton 2007): Kaspersky Internet Security 2012

 


Reply:
When you delete IMAP mail, the messages are marked for deletion and not purged until you purge the folder. Outlook 2010 offers additional options - go to File, Account Settings, double-click on the account, then More Settings. Check the Deleted Items tab.
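The mark-then-purge model described above can be sketched with Python's standard imaplib; this is only an illustration of the IMAP semantics, not of Outlook's internals, and the mailbox name and connection are placeholders:

```python
import imaplib


def delete_and_purge(conn, mailbox, uid):
    """Delete a message the IMAP way: flag it \\Deleted, then expunge.

    The STORE command only *marks* the message for deletion; it stays in
    the folder until EXPUNGE purges every flagged message. `conn` is an
    imaplib.IMAP4-compatible connection that is already logged in.
    """
    conn.select(mailbox)
    conn.uid("STORE", uid, "+FLAGS", r"(\Deleted)")  # mark only
    conn.expunge()                                   # purge flagged messages
```

A client that never expunges leaves the marked messages in place, which is why the purge options under More Settings matter.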

Diane Poremsky [MVP - Outlook]
Outlook Daily Tips | Outlook & Exchange Solutions Center
Subscribe to Exchange Messaging Outlook weekly newsletter

------------------------------------
Reply:

Diane,

I have played with the Deleted Items settings but I am still having the same issue.  When I receive a new email, it appears in 2 places (Inbox and All Mail).  In previous versions of Outlook, when I hit the delete button, mail would be removed from Inbox but remain in All Mail.  Google described the process as removing the label "Inbox" from the message rather than deleting it.  I believe it was set up to be an archive for every message, both received and sent. So I do not know if my problem is a settings issue or if the configuration was changed when I upgraded to Outlook 2010.

Thank you,

JermUF


------------------------------------

Can't update MS-DaRT Standalone System Sweeper- Error code 0x8007051a

"Error found: Code 0x8007051a. Indicates two revision levels are incompatible."

This has just started happening in the past couple of weeks, both on newly created DART images as well as older ones that had updated fine before. This is not a scratch space issue, since I have used the workaround described in another recent post to increase scratch space in PE.

I've built a brand new DART image, and it does correctly download the latest definitions installer during the build process, but it doesn't unpack correctly within DART due to the above error. I've unpacked and manually placed the definitions in the meantime, but this is not an appropriate long-term solution, since I can't realistically dedicate the time to build all-new DART images every day.

I suspect, as others have, that there is a problem with the definition package installer but for the moment I'm powerless to fix it myself. Anyone have any ideas?

  • Edited by Tony Dumas Monday, January 30, 2012 8:31 PM
  • Changed type Arthur Xie Wednesday, February 1, 2012 5:50 AM

Reply:
Between my own posts and others, and some calls to MS Support, it seems that it may be working again...

------------------------------------

Error using reflection from a BizTalk orchestration

Hello.

I have the following static method inside a class contained in a .NET class library.

public static AbstractLog GetObjetoLog()
{
    AbstractLog objLog = null;
    Type tipo = null;

    // Obtain the class type from its assembly-qualified name
    tipo = Type.GetType("Xunta.IFRT.ModulosComunesESB.Log.Facade.FachadaLog, Xunta.IFRT.ModulosComunesESB, Version=1.0.0.0, Culture=neutral, PublicKeyToken=a8e38e545b78a460");

    // Construct an object using reflection with its default constructor
    objLog = (AbstractLog)tipo.GetConstructor(Type.EmptyTypes).Invoke(null);

    return objLog;
}

If I execute this method outside a BizTalk orchestration, in a Visual Studio 2010 unit test, it works fine. However, if I execute this code inside an orchestration (inside an Expression shape), the following exception is thrown:

Activator.CreateInstance (tipo)    'Activator.CreateInstance (tipo)' threw an exception of type 'System.Reflection.TargetInvocationException'    object {System.Reflection.TargetInvocationException}

I'm using BizTalk Server 2010 and .NET Framework 4.0. The assembly Xunta.IFRT.ModulosComunesESB.dll is registered in the GAC.

Can someone help me? I can't instantiate this object using reflection from a BizTalk orchestration.

Thanks,

Best regards.

Reply:

Did you use the Microsoft.BizTalk.Reflection assembly?

See below:

http://weblogs.asp.net/gsusx/archive/2004/08/28/222060.aspx

http://weblogs.asp.net/gsusx/archive/2005/01/08/349177.aspx

They might be useful.


If this post answers your question, please mark it as such. If this post is helpful, click 'Vote as helpful'.

------------------------------------

FAQ - Maximum number of items in a SharePoint 2010 List

A SharePoint 2010 list can have more than 50,000 items.

Central Administration provides a feature to throttle how many records are viewed by users.


Amalaraja Fernando,
SharePoint Architect
This post is provided "AS IS" with no warranties and confers no rights.

Reply:

The maximum number of items in a list is 30,000,000.

To limit the number of items in a view you will need to change the Item Limit in the List Settings.


Martin Dobbie, Silversands Ltd (MCTS, MCITP)

------------------------------------

Optimizing a company's IT, head office with 7 branch offices, charity (low budget)

We have a customer with 7 branch offices. The head office comprises one server and 15 clients, each branch office comprises three clients. The branch offices are connected via VPN and are in different nets. Sometimes employees work at different branch offices for some time. Certain employees work two days in one branch office and three days in another branch office. The current file situation is chaotic with some employees working directly on the server, some solely local and some local with user-timed backup-scripts (to server).

This year we will migrate their IT to Small Business Server 2011 Standard and Windows 7 with Office 2010. They will also probably upgrade their internet bandwidth to a minimum of 16 MBit/s down and 4 MBit/s up. The branch offices run at 16/1 MBit/s. We also decided to set up a company policy for how files have to be handled. Since our customer is a charity, they have already reached their financial limit, so there is no more room for hardware and software. Considering these facts, I am still thinking about an optimized solution for my customer's branch offices.

The solutions I was thinking about are:

  1. Clients must save company-related files on the server. Since the current Office file format produces very small files, the delay is minimized and has to be lived with. In exchange, all important files are backed up centrally and can be accessed from every workstation in the company. Since the policy says to save all files immediately on the server, roaming profiles are not needed and will not slow down the internet connection.
  2. Clients are allowed to save files locally if they either work solely with them, or manage to coordinate file access amongst themselves. They have to synchronize their files to the server with a provided script. I am not happy with this solution, but my customer might want it, as his employees are already complaining about slow file access when files are on the server.
  3. Clients work with offline files on their home drive, offering cached access and automatic file synchronisation. On shared drives they must work on the server. Since the policy says to save all files immediately on the server, roaming profiles are not needed and will not slow down the internet connection.
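A minimal sketch of the kind of synchronization script mentioned in option 2 - a one-way copy of new or newer files up to the server share. The paths are placeholders, and a real deployment would also need conflict handling and logging:

```python
import os
import shutil


def sync_to_server(local_root, server_root):
    """One-way sync: copy files that are missing or newer than the server copy.

    Walks local_root, mirrors its directory layout under server_root, and
    copies a file only when the server has no copy or the local one has a
    later modification time. Returns the list of destination paths copied.
    """
    copied = []
    for dirpath, _dirnames, filenames in os.walk(local_root):
        rel = os.path.relpath(dirpath, local_root)
        dest_dir = os.path.join(server_root, rel)
        os.makedirs(dest_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(dest_dir, name)
            if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
                shutil.copy2(src, dst)  # copy2 preserves the modification time
                copied.append(dst)
    return copied
```

Because copy2 preserves timestamps, running the script twice copies nothing the second time, so it is cheap to schedule frequently.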

Do you have any ideas, improvements or did you implement a similar IT strategy? I would be glad if you would write them down.

ϻοϰsϯεr


Reply:
You do not say what the current solution is at the main office, but if it is a server of any type that can be repurposed with Windows Server 2003 or later, use it as a terminal server.  Not-for-profit licenses are extremely favorable in price, but I am not certain that RDS/TS licenses can be acquired this way.  If not, it will have some financial consequences, but it is still the best way to ensure that files stay in the main office.  Plus, you can keep the current generation of systems in the remote offices longer, and eventually replace them with thin clients.  Also, look at Windows Multipoint Server.
Larry Struckmeyer[SBS-MVP]

------------------------------------
Reply:

HQ with a 'baby' Dell server (T310/410) doing Hyper-V with 1) SBS11 and 2) Windows Multipoint Server as virtual machines, maybe 3) the Premium Add-On (for SQL).

TechSoup (US) pricing is probably very similar to DonorTech (AU) for a registered charity.

SBS11(Standard) AU$55

WMS (Premium, for Domain Join) AU$48

SBS11 PAO AU$81

Depending on how you do it, you'll need various CALs: SBS CALs, WMS CALs (functionally an RDS CAL), and Premium CALs, each being AU$5-6.

 

The remote offices would work primarily in WMS but could be configured to operate locally with Outlook Anywhere, saving files to their local PC (really, this is only in case the link to HQ is down; all 'real work' would happen in WMS).

The only problem I see is that WMS Premium is 20 users, and 7*3 is 21. A second copy of WMS could be purchased and the remote users evenly divided between the two.

The hardware to do this on is _pretty well_ exactly the same hardware as would be required to run SBS, with a bit more RAM and decent disk IO required.

Company related work would automatically be stored on the server(s) at HQ.

No silly scripts to be run copying files across the WAN.

No need for Offline Files.


------------------------------------

Create sharepoint list field with internal name specified

Hi all

I just created a script to create a SharePoint list field with the internal name specified. If you are interested, enjoy it:

Function CreateSPField
{
    param
    (
        $siteUrl="",
        $listName = "",
        $newFieldXml = "<Field Type=`"Text`" Name=`"AreaTestTest`" StaticName=`"AreaTest`" DisplayName = `"Area Title`"></Field>"
    )

    Add-Pssnapin microsoft.sharepoint.powershell
    $site = get-spsite $siteUrl
    $web = $site.OpenWeb()
    $list = $web.Lists[$listName]
    $fields = $list.Fields

    $op = [Microsoft.SharePoint.SPAddFieldOptions]::AddFieldInternalNameHint
    $bindingFlags = [Reflection.BindingFlags] "Static,GetProperty,NonPublic,InvokeMethod,Instance,Public"
    $param1Type = [System.String] -as [Type]
    $param2Type = [System.Boolean] -as [Type]
    $param3Type = [Microsoft.SharePoint.SPAddFieldOptions] -as [Type]
    $methodtype = [type[]] @($param1Type,$param2Type,$param3Type)
    $method = $fields.GetType().GetMethod("AddFieldAsXmlInternal",$bindingFlags,$null,$methodtype,$null)

    $parameters = @($newFieldXml, $false,$op)
    $method.Invoke($fields,$parameters)
}


SharePoint 2010 PowerShell

Refresh XP machine with MDT - Failed capturing user data !!!!????

OK, I'm trying to do a refresh of an XP PC to Windows 7 Professional. When I run the LiteTouch script from the XP PC, it gives me this error:

please help....



Reply:

Hello,

As this forum is related to Directory Services, please use the MDT forum http://social.technet.microsoft.com/Forums/en-US/mdt/threads/ or, as another option, the Windows XP or Windows 7 forum at: http://answers.microsoft.com/en-us


Best regards Meinolf Weber Disclaimer: This posting is provided "AS IS" with no warranties or guarantees , and confers no rights.

------------------------------------

reusing the cube & dimensions

HI Team,

Currently I have an SSAS solution which was already created and deployed in production.
Can I use the same solution with a different data source and a different destination, without changing any of the cube and dimension designs?
In that case, what are the steps I need to follow to reuse the cube?
I also want to add some roles on top of that reused cube.
Is that possible?
Thanks in advance

baskar k


Reply:

Yes it is possible.

Follow the steps below:

  1. Open the solution in BIDS.
  2. Open the data source and point it to the new server/DB.
  3. Open the data source view, right-click on an open area and choose Refresh; this lets you see whether there are any changes in the new source compared to the old one.
  4. To change the destination, right-click on the solution in Solution Explorer, choose Properties, and change the server name as per the new destination.
  5. You can add or modify any other objects, like Roles etc.

~Chandra

Please mark as answer if this answers your query

 


------------------------------------
Reply:

HI Chandra,

Thanks for the information provided. :D


I want to know some more details about the partitions in the cube and the dimension measures in that case.
When I was working with this kind of solution, I executed the same steps you mentioned and tried to deploy the cube,
but it was throwing a partition error when I tried to deploy it. Yes, there were some other errors, but I managed to solve them with the help of the forums.
What happened in the above-mentioned scenario: I copied the cube from the other solution, deleted the partition, created a new partition and selected the option to create aggregations later. So now I have created a new partition and successfully deployed it, but when I browse the cube it is not returning any results for me.
I checked the following things:
1] Explore data in the cube - working properly
2] Checked the relations - everything is correct.
3] Checked the calculations tab - it was throwing an error; I replaced two .dll files and it is working fine.
But when I browse, the data is not coming for me.
Can you please guide me on why I am not seeing any data when I browse?

Is there any cube partition issue related to the SQL Server version? Please tell me how to overcome that one.

thanks in advance

baskark


------------------------------------
Reply:

How did you create your partition - is it query based? If yes, execute that query in SSMS and see if it returns any results.

In your above statement you mentioned

i have created a new partition and successfully deployed it.but when i browse the cube it is not returning any results for me.

and

1] explore data in the cube- working properly

Aren't those 2 statements contradicting?

-C


------------------------------------
Reply:

HI Prakash,

Thanks for your immediate reply again.
1] I opened the cube -> opened the partition tab -> selected each partition -> deleted it.
2] I created a new partition using the following steps:
    open cube -> new partition hyperlink and created it -> after the partition was completed -> selected the 'aggregation later' check box option.

This is the way I created the partition.

When I open the cube in designer mode -> go to cube structure -> select any fact table -> right click -> explore data,

data is returned in a separate popup window. But after deploying the cube, when I browse it through the browse tab, no data is returned.

thanks in advance

baskar k


   


------------------------------------
Reply:

Hmm.. you shouldn't have deleted the first / base partition. By default, when a cube is designed, it creates a table-based partition.

You have yet to answer my question whether you designed the partition based on a query or a table.

when i open the cube in designer mode -> go to cube structure -> select any fact table -> right click -> explore data.

By doing the above you are probably looking at stale data (your changes not applied yet).

But after processing, as you have deleted the base partition, it is not showing any data.

Solution : Create query based partitions and reprocess the cube.

-C


------------------------------------
Reply:

HI Prakash,

You have yet to answer my question whether you designed partition based on query or table?

 

It is based on a table, not on a query.

thanks

baskar k


------------------------------------
Reply:

Baskar,

Please use the partition wizard in BIDS to create a new partition based on the fact table. Then deploy and process.

Thanks

-C


------------------------------------

Testing a Runbook/Activity through an external script/code, i.e. without using the Runbook Designer/UI/console

Hi,

Does someone have any idea about testing a runbook/workflow or an activity (in a runbook) via external code (any scripting language would suffice)?

Basically, I do not wish to run or test the runbook from the GUI provided with Opalis/SCO.

Any details about any web service/API exposed by Opalis to the external world for communicating with a runbook/activity?

I would highly appreciate it if anyone could provide an early resolution to my problem.

Thanks in advance.

regards,

Nitin

 


Reply:

Hello Nitin,

No, there's actually no external method to test Policies/Runbooks outside of Opalis/Orchestrator, and to be honest I have seen no need for it until now.

You know the Testing Console / Runbook Tester can run workflows step by step?

Regards,

Stefan

 


------------------------------------
Reply:

System Center 2012 Orchestrator includes a REST-based web service as part of the web components install. You can submit runbooks and get the results from the activities via this interface. You can also create reports using PowerPivot.

 

Opalis has a similar web service but it is SOAP based.

 

The documentation for both is on TechNet.
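As a rough sketch of how a script might talk to the 2012 REST (OData) web service: the host, port, and service path below are assumptions based on the default web components install, so adjust them to your environment. Starting a runbook then amounts to POSTing a Job entry that references the runbook's ID.

```python
from urllib.parse import quote

# Assumed default endpoint of the Orchestrator 2012 web service;
# replace host and port with your own install's values.
BASE = "http://scorch-server:81/Orchestrator2012/Orchestrator.svc"


def runbook_query_url(name):
    """Build the OData URL that looks up a runbook by its display name."""
    return "%s/Runbooks?$filter=Name eq '%s'" % (BASE, quote(name))

# A GET on runbook_query_url("My Runbook"), sent with Windows
# authentication, returns an Atom feed; the runbook Id found in that
# feed is what a POST to BASE + "/Jobs" would reference to start it.
```

This only constructs the query URL; the actual GET/POST calls need an HTTP client that supports your authentication scheme.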


------------------------------------
Reply:

Hi,

Thanks to both for a prompt reply...

@James: I browsed through the technet forum and found following link

http://technet.microsoft.com/en-us/library/hh420377.aspx which mentions

"The Orchestrator web service is a Representational State Transfer (REST)-based service that enables custom applications to connect to Orchestrator to start and stop runbooks, and retrieve information about operations by using custom applications or scripts. The Orchestration console uses this web service to interact with Orchestrator."

But I could not find any docs/reference on how to use the web service to start/stop runbooks or fetch activity data from them. If possible, could you point me to a sample script/web service client or any documentation?

I need to explore some test automation framework which can test runbooks without using the SCO/orchestrator GUI console. So any help would be highly appreciated.

@Stefan: Yes, I know of the Runbook Designer / Runbook Tester, but as I mentioned, I need some way to call these runbooks outside of the console/GUI interface. So any help would be highly appreciated.

Regards,

Nitin

 

 

 


------------------------------------
Reply:

Hello,

Maybe this helps for your intent:

http://blogs.technet.com/b/scorch/archive/2011/05/25/getting-ready-for-orchestrator-2012-accessing-rest-web-services-using-powershell.aspx

http://blogs.technet.com/b/scorch/archive/2011/06/17/fun-with-the-orchestrator-2012-beta-the-web-service-and-powershell.aspx

Edit: I think it's also discussed here http://social.technet.microsoft.com/Forums/en-AU/scogeneral/thread/8e434424-e9ff-4c5c-826a-586f5c6e9884 which links to http://jmattivi.wordpress.com/

In my first reply I may have misunderstood you as intending to test single activities outside Orchestrator. For starting and stopping runbooks outside Orchestrator, these PowerShell examples should work.

 

Regards,

Stefan

 




------------------------------------
Reply:

Thx so much..

I'm in the process of trying these samples, but I could not locate many details about the Orchestrator REST-based web service itself, like what APIs it exposes, or any documentation on the same. Any pointers in that direction would be really helpful for me.

Regards,

Nitin

 

 

 

 

 


------------------------------------
Reply:

Hi,

 

Request if anyone could reply, please.

 

Regards,

Nitin


------------------------------------

We receive 'An internal error has occurred 50331655. For more information....' error when trying to connect via RWW to a PC

Hi All,

We receive the above error. RWW is set up, i.e. port 4125 is forwarded etc.; we've also installed the certificate on the PC trying to access the 'remote PC'.

Please could anyone advise us.

Thank you.

Noaman

 

 


Reply:
Did you enable remote desktop on the computer you are trying to connect to?
regards Robert Maijen

------------------------------------
Reply:

Hi,

Does the issue still exist? Looking forward to your feedback.

 


Technology changes life……

------------------------------------
Reply:
I had this problem as well. It turns out the self-issued security certificate was out of date; run the re-issue wizard.

------------------------------------
Reply:
Where is the re-issue wizard in SBS 2008? Sorry, NEWBIE :-0

------------------------------------

MS PowerPivot Limitation

This is a PowerPivot + SharePoint for Internet Sites Enterprise Edition case. The customer is evaluating PowerPivot as an alternative to Oracle for their reporting requirements.

Feedback from the customer: We have come across a critical issue while doing the PowerPivot POC. During our analysis, we found that there is a limitation of only 60 columns/attributes which can be dragged in to build the data model for the purpose of ad-hoc data browsing. Please let us know if this is a limitation with PowerPivot and, in the event our understanding is correct, please let us know if there is some workaround to address this issue? Thanks

Reply:

Hi,

According to your description, this issue is more related to PowerPivot for SharePoint. In order to get an answer effectively, it is recommended to post a new thread in the Microsoft PowerPivot for SharePoint forum for further discussion.

http://social.msdn.microsoft.com/Forums/en-US/sqlkjpowerpointforsharepoint/threads

The reason why we recommend posting appropriately is that you will get the most qualified pool of respondents, and other partners who read the forums regularly can either share their knowledge or learn from your interaction with us.

Best Regards.


William Zhou

TechNet Community Support


------------------------------------
Reply:
Thanks William. I'll post a new thread in Microsoft PowerPivot for SharePoint Forum for further discussion.
Thanks, Faisal

------------------------------------

FIM Synchronization starts successfully but FIM Service will not start

At issue: SharePoint User Profile Synchronization

I'm positive that the crux is in this error message:

A Kerberos Error Message was received:

on logon session

Client Time:

Server Time: 21:13:18.0000 1/17/2012 Z

Error Code: 0x7 KDC_ERR_S_PRINCIPAL_UNKNOWN

Extended Error: 0xc0000035 KLIN(0)

Client Realm:

Client Name:

Server Realm: XXXX.COM

Server Name: MSSQLSvc/MMMM.XXXX.com:1433

 

Target Name: MSSQLSvc/MMMM.XXXX.com:1433@RMLS.COM

Error Text:

File: 9

Line: f09

Error Data is in record data.

 

I've gone through validating that SPNs exist and are mapped to the http service.  Delegation exists between the servers for the farm account that is running this server. 

Please provide some additional direction.  I've looked all over the net and found many a wonderful article but...

Any help would be fantastic!

 

What I see:

the FIM Synchronization Service shows Starting => Started

the FIM Service shows Starting => Disabled

the FIM Synchronization Service then changes to Disabled

^Thanks!


AT

Reply:

It seems like your service account for the FIM Service does not have sufficient permissions to run as a service.

Try verifying your GPOs and allow 'log on as a service' for the FIM service account.

Verify that this account is linked to the disabled service and make sure you have the correct password.

 

//Christian


------------------------------------
Reply:

Thanks for the feedback, Christian.  I went down that same path. The domain user (service account) has been granted service activation; I've even tried activating remote service activation.  The results after making the change allowed the FIM Synchronization Service to transition from Disabled (Autostart) => Starting => Started.  The FIM Synchronization Service remains "Started" until the FIM Service attempts to start, then fails.  The failure of the FIM Service also kills the FIM Synchronization Service (no surprise).  There is a significant delay before the FIM Service attempts to start after the Synchronization Service is up and running.  Both services were quite delayed when the UPS service was fired up from SharePoint, but the delay went away after changing the permission on the account that runs the service to allow it activation rights.  My suspicion is that there is something sideways in the way the account is configured in AD DS.  The account has more than sufficient rights to the database.

 

AT

------------------------------------
Reply:

AT,

 

Based on the stack trace above, I would surmise the problem is with the SPN(s) for your SQL service account. The error

Error Code: 0x7 KDC_ERR_S_PRINCIPAL_UNKNOWN

typically occurs when you have either a missing or a bad SPN for the account that is running the service. If SQL is on a machine other than your FIM portal and service, this has to be correct, since client machines would be using a 'double hop' to get to the portal and then to SQL. Possibly you could have duplicate SPNs as well.

If SQL is using a domain account that is not an administrator, it won't create the SPN for you; it would need to be assigned manually.


------------------------------------
Reply:

My first idea is also the SPN.

Check if the service account has an SPN set. Use SETSPN -X to search for duplicates (I have had this a few times).

Check the SPNs for the service account with SETSPN -L <accountname>.

Check in Users and Computers whether the service account has delegation configured:

  • The fim-service service account should be configured for delegation to fimservice/fimsvr
  • The app-pool service account should be configured for delegation to fimservice/fimsvr

If your SPNs are OK, then it must be a GPO issue.

//Christian


------------------------------------

Calling SharePoint Online web services from a web application

Hello,

I am trying to connect to the web services of my SharePoint Online site or public site, via code, from a web application.
This is the code I am using:

--
var webs = new sharepoint_webs.Webs();
webs.UseDefaultCredentials = false;
webs.Url = "http://mydomain-web.sharepoint.com/_vti_bin/webs.asmx";
webs.Credentials = new NetworkCredential("login", "password");
XmlNodeList xmlnodes = webs.GetWebCollection().ChildNodes;
--

This code is found all over the web, but in my web application it always results in the following error:

Server was unable to process request. ---> Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))

I read this article on claims authentication: http://msdn.microsoft.com/en-us/library/hh147177.aspx . However, I am trying to connect automatically from a web application; that code uses a WebBrowser object and pops up a window for a user to enter a login and password.

Is there any way that anyone knows of to use the SharePoint Online web services by just passing a username and password? I thought that web services were supposed to be easily accessible and so on?

Is it possible to do this from a web application?

------------------------------------

Forum FAQ: After a full backup, the transaction log files are not removed. How to troubleshoot the issue?

Question:

After a full backup, the transaction log files are not removed. How to troubleshoot the issue?

Answer:

The issue may occur for several causes:

1. One of the databases is not mounted in the Storage Group.

2. If possible, please use the NTBackup tool to back up the Storage Group to check the issue. If the log files are truncated after backing up with the NTBackup tool, the issue may relate to the third-party backup tool.

3. If it is Exchange 2007, please check whether the storage group has SCR or CCR enabled. You can refer to the "Log Truncation" section of the following whitepaper:

White Paper: Continuous Replication Deep Dive

http://technet.microsoft.com/en-us/library/cc535020(EXCHG.80).aspx#LogTrunc

4. If the issue still persists, you need to check the Application log for any clues.


Reply:
If it's E2k10, then one of the database copies might not be in a healthy state.
Dominic Savio | MVP – Exchange Server (2008) | MCITP | MCSA: M | www.ExchangeServerInfo.Net

------------------------------------

DNS Warning Event ID 4013, slow boot

Sorry to bother again, but someone moved my original thread to the Active Directory forum and my question is about DNS :-(

I have a standalone DNS server with 3 zones. The machine worked well.

Yesterday I installed AD on that machine. I did not change the DNS settings, so DNS is still standalone, not Active Directory integrated.

However, the DNS service does not start up after the AD installation. The error is 4013.

As far as I understand, DNS is waiting for AD to start up so it can read AD-integrated zones. Still, there are no AD-integrated zones at all.

How can I make DNS start up normally?

I do know by now that defining a different DNS server instead of the local machine in the IP configuration settings does solve the startup issue, so there is a workaround. However, I would appreciate learning a way to configure DNS so that it just starts up when the server is started.

/Klaus




Reply:

What are you trying to achieve? DNS is an integral part of Active Directory, and in the combination of AD and DNS, pointing the DNS setting in the NIC properties to another DNS server saves the functionality of DNS, but the AD is meaningless.

Would you provide more detailed plans of the target functionality of your system?

IMHO the combination of your current DNS and AD is not a good idea.

Regards

Milos


------------------------------------
Reply:

Milos,

DNS is not an integral part of Active Directory; DNS is a standalone role Windows Server 2008 knows about. And yes, you are right, pointing to another system is not a very good idea. I am actually very frustrated about this issue. The target functionality of this server is DNS. The AD implementation is temporary only. I was not aware that installing AD would break the DNS functionality.

/Klaus

------------------------------------
Reply:

Hello,

have you seen http://support.microsoft.com/kb/2001093 ?

Please post an unedited ipconfig /all from the DC/DNS server.


Best regards Meinolf Weber Disclaimer: This posting is provided "AS IS" with no warranties or guarantees, and confers no rights.

    ------------------------------------
    Reply:

    Meinolf,

    the article 2001093 refers to AD integrated zones, however, there are no ad integrated zones at all.

    I am currently not at this customer's site, so I can't post the ipconfig output right now. Still, what useful information do you expect to get from it?

    /Klaus


    ------------------------------------

    Applications Attachmate Infoconnect and Google Chrome Conflict

    Hello, I am running Windows 7 64-bit with SP1, Attachmate Infoconnect 9.0 with SP1, and Google Chrome 16.0.912.77. When both applications (Infoconnect and Chrome) are running at the same time, sooner or later Infoconnect hangs and stops responding. The only way to get it working again is to close Chrome; Infoconnect then responds immediately. Once Chrome is closed, Infoconnect responds properly, and Chrome can be started and used again. However, Infoconnect hangs again sooner or later (within several minutes or hours). It seems there is some resource conflict between these applications. I attached a screenshot from Resource Monitor which shows some relation between the Infoconnect and Chrome processes.
    Resource monitor screenshot link: http://img29.imageshack.us/img29/2761/captureny.png
    I would appreciate your suggestions as to whether this is a problem with one of the applications or with the OS, and how to resolve it.
    • Changed type Niki Han Wednesday, February 1, 2012 1:54 AM

    Reply:

    Hi,

     

    It seems these two applications conflict with each other. It is not an OS problem; I suggest contacting Chrome or Attachmate Infoconnect support to get professional assistance. Sorry for the inconvenience.


    Niki Han

    TechNet Community Support


    • Edited by Niki Han Tuesday, January 31, 2012 6:13 AM

    ------------------------------------
    Reply:

    OK, I'll contact the applications support. Thank you for your reply.

    ==

    Update: Attachmate provided a hotfix for Infoconnect 9.1. I've been running the two applications all day today without a problem. It seems to be fixed.

    • Edited by Andrej Cepko Wednesday, February 1, 2012 3:25 PM

    ------------------------------------

    Information

    Type the word "information" in a Microsoft Word document, then run the thesaurus on the word. See what pops up.

    Reply:
    Do you have a question?
    Kind Regards, Rich ... http://greatcirclelearning.com

    ------------------------------------

    Security hole in "Software Restrictions" policy

    Back in September, I posted a thread about a security hole in the "Software Restrictions" policy. Through a simple workaround, non-administrators could circumvent policy restrictions created by system administrators.

    Juke Chou was very helpful in reproducing this problem and reporting it to the appropriate group at Microsoft.  However, I have never heard anything back. 

    I realize that with today's operating systems, fixes can't happen overnight. But 4+ months seems a long time to spend deciding what to do about a security violation/elevation issue.

    Either a "yes we're going to patch this for W7" or a "this will be addressed in W8" or even a "yes we see this security hole but we don't intend to repair it" would be appreciated.
    • Changed type Niki Han Wednesday, February 1, 2012 2:54 AM

    Reply:

    Hi,

     

    I am sorry there has been no response since the issue was reported. I have no idea whether the development team will release a hotfix for Windows 7 or just address the issue in Windows 8.

     

    I notice you have a workaround GPO in a previous post; please set the following GPO instead as a test.

     

    User Account Control: Behavior of the elevation prompt for standard users

    http://technet.microsoft.com/en-us/library/dd851602.aspx
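    For what it's worth, this policy is commonly documented as mapping to the ConsentPromptBehaviorUser registry value. A minimal sketch of setting it directly follows (0 means "Automatically deny elevation requests"); verify the path and value against your environment before relying on it:

    ```powershell
    # "User Account Control: Behavior of the elevation prompt for standard users"
    # maps to ConsentPromptBehaviorUser under the system policy key.
    reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" /v ConsentPromptBehaviorUser /t REG_DWORD /d 0 /f
    ```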

     

    Niki

    TechNet Subscriber Support

    If you are a TechNet Subscription user and have any feedback on our support quality, please send your feedback here.


    Niki Han

    TechNet Community Support


    ------------------------------------
    Reply:
    A fix will likely not be made for Windows XP due to its age (i.e., it is in extended support), nor for Windows 7 (as SRP has been replaced by AppLocker, which is superior to SRP).
    Blogging about Windows for IT pros at www.theexperienceblog.com


    ------------------------------------
    Reply:

    Hey Niki, thanks for the response.

    That page specifically states that:

    Applies To: Windows Server 2008 R2

    Since my clients are running on W7 Pro, I wasn't optimistic. Still, I tried it:

    • Changed the policy
    • On the client: GPUpdate /force
    • On the client: Reboot
    • Run the test case

    As expected, it didn't help. Using RAA (Run as Administrator) still allows the app to run. Good thought, though.


    ------------------------------------
    Reply:
    Likely a fix will not be made for Windows XP
    Not an issue for me as I am currently (almost) all W7.  Also, since UAC was a factor, I wouldn't expect to see this behavior in XP.
    nor Windows 7 (as SRP is replaced with AppLocker which is superior SRP).

    Whether AppLocker is actually superior can be debated. However, since it doesn't run on W7 Pro (which is what my client machines run), it's a moot point for me.

    Also, consider this:

    The fact that just clicking the app produces the (expected) error message while RAA (for non-admins) does not means that there must be two separate code paths for launching EXEs. And clearly, one is deficient. Now, what ELSE uses that second code path? For example, will AppLocker have the exact same issue? Hmm...


    ------------------------------------
    Reply:

    Hi,

     

    Please ensure the GPO is set to "Automatically deny elevation requests". After applying the GPO, please check the policy on the Windows 7 client to ensure it has been applied.

     

    If this workaround doesn't work, we can do nothing from the forum side but wait for a hotfix or the Windows 8 release. Or you can try AppLocker if you would like. Thanks for your understanding.

     

    Niki

    TechNet Subscriber Support

    If you are a TechNet Subscription user and have any feedback on our support quality, please send your feedback here.


    Niki Han

    TechNet Community Support


    ------------------------------------
    Reply:

    Please ensure the GPO is set to "Automatically deny elevation requests". After applying the GPO, please check the policy on the Windows 7 client to ensure it has been applied.

    Yes, I've done this. As I stated in my other reply (Saturday, January 28, 2012 1:57 AM), this setting only appears to apply to Server 2008 R2. However, I did try setting it, and it made no difference.

     Or you can try AppLocker if you would like to.

    Well, *I* can't try it.  As I said in my other reply (Saturday, January 28, 2012 3:44 AM), AppLocker only works in W7 Enterprise, which I don't have.

    If this workaround doesn't work, we can do nothing from forum side but wait for the hotfix or Windows 8 release.

    I understand. Honestly, I never expected that someone would email an updated DLL in response to a forum post. That said, it seems there are a few things you could do:

    1) When these issues get reported to the product team, do they go into a database, or do you just send an email? If it is in a database that you can access, I would be interested in seeing the text of the problem report. A poor or unclear description could explain the lack of response.

    Also, some bug databases show responses/resolutions/status for reported issues.  If a decision has been reached, perhaps the information is shown there.  (Ok, so it's not too likely they put that info where anyone can see it.  Still, thought I'd ask.)

    2) As I mentioned in one of my other replies, it would be interesting to see if this issue also affects AppLocker.  SR and AL perform similar functions, so it seems possible that they both have the same problem.

    As I've said, I don't have W7 Enterprise.  That means I don't have what I need to try this.  But someone who *does* have W7E could:

    1. Repro the SR problem (to make sure you are testing the right config).
    2. Change the GPO to use AL instead of SR.
    3. See if the problem is still there.
    If AppLocker is broken in the same way, it seems possible that a higher priority would be assigned to fixing this.
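    A rough sketch of that test with the AppLocker PowerShell cmdlets might look like the following; the executable path and user name are hypothetical placeholders, not values from this thread:

    ```powershell
    # Requires an edition that supports AppLocker (e.g. W7 Enterprise / Server 2008 R2).
    Import-Module AppLocker

    # Evaluate the effective AppLocker policy against the test executable
    # for a standard (non-admin) user. C:\Test\app.exe and CONTOSO\user
    # are placeholders for this sketch.
    Get-AppLockerPolicy -Effective |
        Test-AppLockerPolicy -Path C:\Test\app.exe -User CONTOSO\user
    ```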

    ------------------------------------
    Reply:

    I fired up a test machine and can state that this issue is not reproducible with AppLocker.


    Blogging about Windows for IT pros at www.theexperienceblog.com


    ------------------------------------
    Reply:

    Darn.  That was my best hope for getting this fixed.

    You are sure about the repro?  Did you try the SR part first to confirm the config?


    ------------------------------------
    Reply:
    Yes, I have reproduced your bug using SRP and also verified that AppLocker works correctly and does not suffer from this bug.
    Blogging about Windows for IT pros at www.theexperienceblog.com


    ------------------------------------
    Reply:

    Well, that's a good thing.  And at the same time: rats.

    Thanks for taking the time.


    ------------------------------------

    How to prevent a particular database copy from activating

    I know how to prevent all databases from activating in my standby datacenter by running:

    Set-MailboxServer -Identity EX1 -DatabaseCopyAutoActivationPolicy Blocked

    But how do I set it for only a particular database, while allowing the other databases to become active on EX1 in the event that the primary datacenter goes down?


    Anand_N

    Reply:
    You can use DatabaseCopyAutoActivationPolicy to prevent this. http://technet.microsoft.com/en-us/library/aa998651.aspx 
    Sukh

    ------------------------------------
    Reply:
    I used that parameter, but it sets all databases on EX1 to blocked. I only want to block a few databases, not all of them.

    Anand_N

    ------------------------------------
    Reply:
    I don't think that is possible.
    Sukh

    ------------------------------------
    Reply:
    I used that parameter, but it sets all databases on EX1 to blocked. I only want to block a few databases, not all of them.

    Anand_N


    You can block activation using the cmdlet Suspend-MailboxDatabaseCopy with the parameter -ActivationOnly

    Example: Suspend-MailboxDatabaseCopy –Identity Database01\SERVER02 -ActivationOnly


    Suspend-MailboxDatabaseCopy
    http://technet.microsoft.com/en-us/library/dd351074.aspx


    Martina Miskovic

    ------------------------------------
    Reply:
    Will this work if the primary datacenter goes down? This seems more of a controlled method, i.e. for planned maintenance; if there is a failure, I don't believe it will work.
    Sukh

    ------------------------------------
    Reply:
    The above command will only prevent the database copy you run it against from becoming active.
    Wasn't that your question?
    Martina Miskovic

    ------------------------------------
    Reply:
    Yes, that is true, but I don't want to suspend replication to the database. If the primary site goes down, would I be able to activate the suspended DB?

    Anand_N

    ------------------------------------
    Reply:
    Yes, that is true, but I don't want to suspend replication to the database. If the primary site goes down, would I be able to activate the suspended DB?

    Anand_N

    You aren't suspending replication, just the ability for that database copy to activate automatically if a server fails. You would have to manually resume the copy and then activate it in that case.
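    In that scenario, the manual steps might look roughly like the following; the database and server names follow the earlier example and are placeholders:

    ```powershell
    # Clear the activation-only suspension on the copy.
    Resume-MailboxDatabaseCopy -Identity Database01\SERVER02

    # Then manually activate the copy on the standby server.
    Move-ActiveMailboxDatabase Database01 -ActivateOnServer SERVER02
    ```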

    ------------------------------------
    Reply:

    Thanks.

    This article also demonstrates both commands:

    http://www.howexchangeworks.com/2010/06/how-to-block-mailbox-database-copy-from.html


    Anand_N

    ------------------------------------
