Tuesday, February 1, 2022

Another New Build (10159)

Not bad: two builds in two days. Fast Track is now on Build 10159; check Windows Update, it's there. Are we having fun yet? As for "are we there yet?" Nope, 28 more days...

Edge can now show a Home Page icon: go to its Settings > Advanced Settings; the On/Off switch is at the top.

Cheers,
Drew


Drew, MS Partner / MS Beta Tester / Pres., Computer Issues, www.drewsci.com


  • Edited by Drew1903 Wednesday, July 1, 2015 9:22 AM

Security Issue - Publishing multiple apps with same .exe?

When publishing iexplore.exe (passing in the URL as a parameter), which lets us publish simple individual URLs, multiple published sites pointing to iexplore.exe fail to obey both the User Assignment setting and the 'Show the RemoteApp program in RD Web Access' Yes/No option.

When multiple published apps point to the same exe, their aliases are appended with (1), (2), (3), and so on. Not sure if this is relevant.

The symptom of this anomaly is that the app icon fails to be hidden in RD Web Access when the app's User Assignment is removed!

It's certainly apparent when pointing to multiple instances of iexplore.exe or mstsc.exe (RDP), although I don't think it affects every .exe.


Workaround

The workaround was to create individual .bat files that call the affected exes and pass in the associated parameters; examples below:

iExploreBES.bat
start "" "C:\Program Files\Internet Explorer\iexplore.exe" "https://bbadmin.domain.com/webconsole/login"

iExploreEXCH.bat
start "" "C:\Program Files\Internet Explorer\iexplore.exe" "https://<servername>1/ecp"

RDP01.bat
start "" "mstsc.exe" "RDPSession01.rdp"

RDP02.bat
start "" "mstsc.exe" "RDPSession02.rdp"

Has anyone else seen this - am I missing something?

Cheers
Lea




  • Edited by LeaUK Thursday, June 18, 2015 10:34 AM

Reply:
Is it just me?

------------------------------------
Reply:

Hi Lea,

First of all, this is not a security issue, as user assignment and the option to hide in RD Web Access are not security features; they are usability features. The ways to properly secure your published applications are:

  1. Make sure the Remote Desktop Users group is populated with the correct set of end-users on your endpoint machines (RDSH servers or VDI VMs). In Server 2012+ the Server Manager UI should be doing this for you, but it never hurts to double-check.
  2. If you need to secure specific apps within a server, use a technology like AppLocker.

Now, as to why user assignment and show/hide in RDWeb are not working for you: I've never seen that issue before, nor have I heard reports of other people hitting it. Appending the numeric identifier is a deliberate step taken by the code to avoid problems that could be caused by duplicate aliases.

What version of the Server OS are you running? If it is Server 2012+, you might try turning on tracing in RD Web Access. It provides human-readable trace logs, and that is where the user assignment and show/hide in RDWeb settings are applied. Instructions:

  • On the RD Web Access server, open the Web.config file (default location is %windir%\web\rdweb\web.config).
  • Set the Trace level to verbose:    
    1. Search for: <add name="TraceTSWA" value="0" />
    2. Change the value from "0" to "4" to generate verbose logs:
              <add name="TraceTSWA" value="4" />
  • Set the Trace mode to file tracing:    
    1. Find the block that starts with: <!-- Uncomment for file tracing
              <add name="File Log"
      -->
    2. Delete the start and end comment lines (the lines starting with <!-- and -->) to make the code work.
  • Log on to RD Web Access and reproduce the issue.
  • Open the generated trace file (default location is %windir%\web\rdweb\App_Data\rdweb.log) and look for any suspicious errors or warnings.
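If several RD Web Access servers need the same change, the edit can be scripted. A minimal Python sketch (illustration only, operating on a trimmed stand-in for the Web.config; the real file has more surrounding sections):

```python
import xml.etree.ElementTree as ET

# A trimmed stand-in for %windir%\web\rdweb\web.config; only the
# <add name="TraceTSWA" .../> element from the steps above matters here.
config = ('<configuration><appSettings>'
          '<add name="TraceTSWA" value="0" />'
          '</appSettings></configuration>')

root = ET.fromstring(config)
for add in root.iter("add"):
    if add.get("name") == "TraceTSWA":
        add.set("value", "4")  # "4" = verbose trace logs, per the steps above

print(ET.tostring(root, encoding="unicode"))
```

On a real server you would read and write the file (with a backup first) instead of a literal string.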


Travis Howe | RDS Blog: http://blogs.msdn.com/rds/default.aspx


------------------------------------
Reply:

Hi Travis

Thanks for your response.

Whilst I appreciate it's not strictly a security issue, users do rely on 'user assignment' to function and may rely on this feature to mitigate risk. If a user's access is removed, one would expect the RD Web app icon to be hidden as well, although I appreciate there are a multitude of 'advanced' techniques for running hidden applications, such as simply logging on to the desktop. AppLocker and AppSense are definitive security measures and I suspect should be reviewed.

  1. Make sure the Remote Desktop Users group is populated with the correct set of end-users on your endpoint machines (RDSH servers or VDI VMs). In Server 2012+ the Server Manager UI should be doing this for you, but it never hurts to double-check.

All RDSH servers' Remote Desktop Users groups are populated with <domain>\Domain Users. This was added when the Session Collection was created, as <domain>\Domain Users was specified as the User Group.

Is this incorrect?

We then rely on individual application user assignment (via security groups) to hide app icons, which works well, unless the apps point at the same iexplore.exe.

The OS is 2012 R2 all latest patches.

Thanks for the verbose logging info; I shall configure it and update when I have further information. I'm surprised this cannot be reproduced and hasn't been seen in the field.

I sincerely appreciate your assistance and advice.

Lea


 


------------------------------------
Reply:

OK, so after reviewing my logbook, where I detailed this issue when I first discovered it, I decided to recreate it as a sanity check!

Using iexplore.exe as the target, I now can NOT reproduce it; user assignment and the RD Web Access on/off toggle are functioning correctly - hugely frustrating!



  • Edited by LeaUK Tuesday, June 30, 2015 4:34 PM

------------------------------------
Reply:

All RDSH servers' Remote Desktop Users groups are populated with  <domain>\Domain Users. This was added when the Session Collection was created as the User Group <domain>\Domain Users was specified. 

Is this incorrect?

That sounds right to me - if at collection creation you specified that you want all domain users to have access, then that is who will have access. And now that I think about it, my initial statement was incorrect - you have to separately manage the list of users who have access (which you set to <domain>\Domain Users) and the user assignment for RDWeb. In an ideal world the two would go hand-in-hand, but that is not something we currently do unfortunately.

Well, I can certainly sympathize with the frustration of not being able to repro an issue anymore... but on the plus side, I'm glad it's working for you now!

Cheers,


Travis Howe | RDS Blog: http://blogs.msdn.com/rds/default.aspx


------------------------------------
Reply:

Thanks Travis.

So today I tried to reproduce this again using mstsc.exe, as I have noted this application was also affected, and still can't reproduce it!

When I originally discovered this, perhaps I somehow overlooked the permissions on the container, which may explain the issue. However, I remember eliminating DC replication as a cause by using AD accounts directly in the RDS Manager User Assignment, rather than simply adding/removing users from an associated security group.

Having said all this, I still can't understand why I saw (and noted) the 'Show the RemoteApp program in RD Web Access' toggle failing - strange but true!

Again, thanks for your support; 'onwards and upwards' to the roll-out of our production-ready platform.

Cheers

Lea 

 


------------------------------------

How to download all attachments from OWA?

Hello, All!

I have several users who have to use OWA as the only way to connect to our Exchange 2013 server. Very often they receive messages with dozens of attachments from different addresses.

The question is: is it possible to download all attachments at once? Downloading them one by one wastes time and, as a result, money... Anyone?

As far as I can recall, in previous versions of OWA there was a "Download all" link, and in OWA for Exchange 2013 this link has disappeared...

Any suggestions? (ZIP or RAR as a "container" for multiple files or MS Outlook instead of OWA are not a solution)

Thanks in advance.

P.S. Just a minute ago I found the same question here. Wow! It was from December 16th, 2014! NO CHANGES YET? Maybe MS officials will clear up this situation? Shall I resend my messages from OWA to a Gmail mailbox, where the "Download all" feature works pretty well???

  • Edited by ekotik Tuesday, June 30, 2015 2:43 PM

Reply:
This feature is not available as of now. 

Remember to mark as helpful if you find my contribution useful, or as an answer if it does answer your question. That will encourage me - and others - to take time out to help you. Check out my latest blog posts on http://exchangequery.com. Thanks, Sathish (MVP)


------------------------------------

Network Protection Server

A review of my network infrastructure:

1. Cisco 2811 router (DHCP, NAT), connected to the internet

2. Cisco Catalyst 2950 switch, connected to the Cisco 2811 router; the APs connect to it

3. PC (Core i3) running AD, DNS, NPS, and SQL Server for accounting

4. 15 TP-Link access points

5. 10 PCs

My wireless clients can now connect to the internet using AD user authentication through an NPS network and connection policy.

Problem #1:

My PCs can connect to the internet without authentication. How do I make them authenticate like the wireless clients?

Problem #2:

Wireless clients can all connect using one AD user. How do I make each of them use a unique user, and how do I maintain live connection sessions using Windows infrastructure?

Problem #3:

Using IPAM, how do I maintain the Cisco DHCP server's IP leases?

Thanks for your free help.




  • Edited by Dheere Monday, June 29, 2015 10:21 AM
  • Changed type Dheere Monday, June 29, 2015 10:23 AM
  • Changed type Dheere Monday, June 29, 2015 11:10 AM

Reply:
Hi,

>Problem 1
For steps on how to configure wired NAP, you may reference the links below.

Checklist: Configure NAP Enforcement for 802.1X Wired
https://technet.microsoft.com/en-us/library/cc730926(v=ws.10).aspx

Microsoft Network Access Protection (Simple setup)
http://blogs.technet.com/b/scd-odtsp/archive/2013/05/14/microsoft-network-access-protection-simple-setup.aspx

Wired 802.1X Deployment Guide
http://www.cisco.com/c/en/us/td/docs/solutions/Enterprise/Security/TrustSec_1-99/Dot1X_Deployment/Dot1x_Dep_Guide.html

Please Note: Since the web site is not hosted by Microsoft, the link may change without notice. Microsoft does not guarantee the accuracy of this information.

>problem 2
It is recommended to configure an 802.1X-authenticated wireless network, which allows authenticated computers or users (domain-joined computers with domain account credentials, or certificates issued by the AD PKI service) to access the network by enabling the port they connect to on the network devices.

For more information about 802.1X Authenticated Wireless Access, you may reference:
https://technet.microsoft.com/en-us/library/cc771455(WS.10).aspx

>Problem 3
In general, IPAM discovers DHCP servers, and IP address ranges are then automatically entered into the IPAM database. IP address ranges that are not DHCP scopes on managed Microsoft DHCP servers are not automatically discovered. IPAM provides several dialogs that let you enter and edit IP address data manually, and you can also import IP address data directly from a file. Data can also be exported from IPAM to a file.

For more information about how to import, you may reference:
https://technet.microsoft.com/en-us/library/jj878303.aspx#import

Best Regards,
Eve Wang

Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Support, contact tnmff@microsoft.com.


------------------------------------

Pavilion P6770T

Does the HP Pavilion P6770T have the capability to run Windows 10?    I have asked HP the same question.   Here is HP's response:  http://h30434.www3.hp.com/t5/Windows-10-Technical-Preview/Does-Pavilion-P6770T-support-Windows-10/td-p/5096528  .   

The logic of the commenting system did not allow me to use a hotlink. I got the message: "Body text cannot contain images or links until we are able to verify your account."


Reply:

AFAICT it does. These are the requirements for Windows 10:

http://support.hp.com/us-en/product/HP-Pavilion-p6000-Desktop-PC-series/5035348/model/5079779/document/c02734210/

1 GHz or faster processor
1 GB RAM (32-bit) or 2 GB RAM (64-bit)
16 GB available hard disk space (32-bit) or 20 GB (64-bit)
DirectX 9 graphics device with WDDM 1.0 or higher driver
Your processor (CPU) must support the following extensions: SSE2, NX, PAE


Wanikiya and Dyami--Team Zigzag



------------------------------------
Reply:

The logic of commenting system did not allow me to use a hot link.   I got the message:  Body text cannot contain images or links until we are able to verify your account.

Use the link below to post your request to verify your account.

It might speed things up if you post your request there.

https://social.technet.microsoft.com/Forums/en-US/home?forum=reportabug



------------------------------------
Reply:
" Your processor (CPU) must support the following extensions: SSE2, NX, PAE "   For an explanation of the three extensions, read:  What is PAE NX SSE2

------------------------------------
Reply:

So LB,

What is the current situation?

We could try running the Windows 8.1 Upgrade Assistant; if that succeeds, your machine should be able to install Windows 10.

Upgrade Assistant: FAQ

For questions regarding Windows 10, please check:

Frequently Asked Questions: Windows 10

Regards


Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact tnmff@microsoft.com.


------------------------------------
Reply:

Thanks but my PC  is a P6770T.   The document mentioned in your comment states:

This document contains basic features and specification for the following HP CTO model:
  • HP Pavilion p6780t
  • Product number: XX204AV


------------------------------------
Reply:

" What is the current situation? "

The current situation is based on a long, sad story, a real tearjerker.   The story starts with my previous PC, an entry level box that worked just fine under XP.   With the release of Windows 7, I downloaded the Windows 7 Upgrade Advisor which stated that my PC was compatible with Windows 7.   I then purchased Windows 7 from Microsoft.   After the installation of Windows 7, there were problems almost from Day One resulting in system dumps.   After the first round of patches from Microsoft the problems increased in rate making the box nearly unusable.   I did some quick research and found that my board was not supported by Windows 7.   I was not amused by the abject failure of Microsoft's Windows 7 Upgrade Advisor.   The money wasted on upgrading to Windows 7 could have been spent on a new PC.

Now I am confronted by Microsoft's Windows 10 compatibility report from the Get Windows 10  application that was recently installed on my PC.    The application says  "Congratulations, you're good to go!   No issues were found during our scan."  First, to prevent a repeat of the Windows 7 Upgrade Advisor fiasco,  I have asked HP to test my PC model: http://h30434.www3.hp.com/t5/Windows-10-Technical-Preview/Does-Pavilion-P6770T-support-Windows-10/td-p/5096528. Second, should I decide to install Windows 10 on my PC, I will first uninstall/remove the Get Windows 10 application from Windows 7 and prevent it from being downloaded in the future.   I will then copy my hard drive containing Windows 7 to the backup, internal hard drive in my HP desk top PC plus make a second copy on my external hard drive.   Thus I am in a position to take a chance on Windows 10.  Should the Windows 7 upgrade mess repeat with Windows 10, I can reboot from the backup hard drive containing Windows 7 and then erase Windows 10 on the primary boot drive and copy Windows 7 to the primary boot drive.   The copy of Windows 7 on the external hard drive will be long term back-out source should future upgrades to Windows 10 create a repeat of the Windows 7 situation on my previous PC.   So, I think that I am covered.   That is the situation.   

Get Windows 10 App:     http://www.microsoft.com/en-US/windows/windows-10-upgrade

A related issue is described here:   https://social.technet.microsoft.com/Forums/en-US/05fa6e96-c741-461d-a9f5-919693344056/the-check-your-pc-section-of-the-get-windows-10-app-does-not-report-status-of-motherboard-or?forum=WinPreview2014General



  • Edited by Lazy Bones Wednesday, June 24, 2015 7:51 PM

------------------------------------
Reply:

OK LB,

Thanks for the clarification.

So you want to remove the Windows 10 upgrade notifications from your Windows 7 machine?

If you would like to hide the notification, then please click "Customize" in the System Tray and turn off the Get Windows 10 app notifications in the menu that comes up.

If you would like to remove it entirely, uninstall KB 3035583:

https://support.microsoft.com/en-us/kb/3035583?wa=wsignin1.0

After that, we may do a system image backup for Windows 7.

What is a system image?

Restore your computer from a system image backup

Regards


Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact tnmff@microsoft.com.


------------------------------------
Reply:
The only unknown left is the support for the motherboard which is in the Cleveland series of motherboards.   Should I open a new question:  Does the Cleveland series of motherboards support Windows 10?   

------------------------------------
Reply:

Lazy Bones,

This is motherboard-specific, so I think we'd better confirm it with the manufacturer.

There might be some settings that could block the installation, such as multi-core and virtualization technology settings.

Regards


Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact tnmff@microsoft.com.


------------------------------------

Build 10159 in Fast ring.

Yes. Build 10159 in Fast ring.

Go to Windows Update and look for it.

http://blogs.windows.com/bloggingwindows/2015/06/30/whoa-another-pc-build/


  • Edited by david hk129 Wednesday, July 1, 2015 12:00 AM

Reply:
That was fast - I haven't tested Build 10158 yet and we've got Build 10159 already.

------------------------------------

BizTalk 2010 receiving a JSON message

Hello,

I'm working on integrating BizTalk 2010 with AtTask software.

AtTask provides a REST API.

To be able to send a REST request to AtTask, I followed this tutorial:

http://social.technet.microsoft.com/wiki/contents/articles/2474.invoke-restful-web-services-with-biztalk-server-2010.aspx.

I'm using a WCF-Custom send-receive port, with wsHttpBinding and an endpoint Behavior.

So far, I'm able to send a login request and receive a successful response. The problem is processing the response.

I was expecting to receive JSON, but somewhere along the way BizTalk is converting the response into an XML message.

This is the response as AtTask sends it (captured with Fiddler):

{
  "userID": "52dfde40003301510c06558a1f246ef4",
  "sessionID": "fe1a78987d46406c9c1e77372423ebda",
  "versionInformation": {
    "currentAPI": "v4.0",
    "buildNumber": "db9b275ad4c1587951a99d83856631d9590a6781",
    "apiVersions": {
      "v1.0": "/attask/api/v1.0/",
      "v2.0": "/attask/api/v2.0/",
      "v3.0": "/attask/api/v3.0/",
      "v4.0": "/attask/api/v4.0/"
    },
    "lastUpdated": "2014/01/21 13:21:24",
    "release": "R16",
    "version": "4.0"
  },
  "locale": "en_US",
  "timeZone": "US/Pacific",
  "timeZoneName": "Pacific Standard Time",
  "iso3Country": "USA",
  "iso3Language": "eng",
  "currency": {
    "useNegativeSign": false,
    "fractionDigits": 2,
    "symbol": "$",
    "ID": "USD",
    "groupingSeparator": ",",
    "decimalSeparator": "."
  }
}

This is what I get in BizTalk:

<root type="object">
  <data type="object">
    <userID type="string">52dfde40003301510c06558a1f246ef4</userID>
    <sessionID type="string">0294fb6dee9f40d383488ac15ebcddd7</sessionID>
    <versionInformation type="object">
      <currentAPI type="string">v4.0</currentAPI>
      <buildNumber type="string">db9b275ad4c1587951a99d83856631d9590a6781</buildNumber>
      <apiVersions type="object">
        <v1.0 type="string">/attask/api/v1.0/</v1.0>
        <v2.0 type="string">/attask/api/v2.0/</v2.0>
        <v3.0 type="string">/attask/api/v3.0/</v3.0>
        <v4.0 type="string">/attask/api/v4.0/</v4.0>
      </apiVersions>
      <lastUpdated type="string">2014/01/21 13:21:24</lastUpdated>
      <release type="string">R16</release>
      <version type="string">4.0</version>
    </versionInformation>
    <locale type="string">en_US</locale>
    <timeZone type="string">US/Pacific</timeZone>
    <timeZoneName type="string">Pacific Standard Time</timeZoneName>
    <iso3Country type="string">USA</iso3Country>
    <iso3Language type="string">eng</iso3Language>
    <currency type="object">
      <useNegativeSign type="boolean">false</useNegativeSign>
      <fractionDigits type="number">2</fractionDigits>
      <symbol type="string">$</symbol>
      <ID type="string">USD</ID>
      <groupingSeparator type="string">,</groupingSeparator>
      <decimalSeparator type="string">.</decimalSeparator>
    </currency>
  </data>
</root>

If I keep the XML format, I'll have to define an XSD for each possible response from AtTask and have specific logic for each in my orchestration.

I want to process the JSON myself (AtTask provides a library for that).

I tried specifying Content-Type=application/json and Accept=application/json in the custom behavior extension (IClientMessageInspector BeforeSendRequest/AfterReceiveReply), but it doesn't work.

I have debugged the process, and by the time I arrive in AfterReceiveReply, the JSON has already been transformed into XML.

So, my question is: is it possible to tell BizTalk not to transform the message it receives into XML?

Thanks


  • Edited by Jer-ome Tuesday, January 28, 2014 7:35 PM
  • Changed type Angie Xu Wednesday, February 5, 2014 2:56 AM

Reply:

Hello,

You can use a custom pipeline component to convert XML to JSON.

Sample code:

IBaseMessage IComponent.Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    Trace.WriteLine("DotNetTypesToJsonConverter Pipeline - Entered Execute()");
    Trace.WriteLine("DotNetTypesToJsonConverter Pipeline - TypeName is set to: " + TypeName);

    IBaseMessagePart bodyPart = pInMsg.BodyPart;
    if (bodyPart != null)
    {
        Stream originalStream = bodyPart.GetOriginalDataStream();
        if (originalStream != null)
        {
            // Deserialize the XML body into the configured .NET type,
            // then serialize that object to JSON with Json.NET.
            Type myClassType = Type.GetType(TypeName);
            object reqObj = PcHelper.FromXml(originalStream, myClassType);
            string jsonText = JsonConvert.SerializeObject(reqObj, myClassType, Formatting.None, new JsonSerializerSettings());
            Trace.WriteLine("DotNetTypesToJsonConverter output: " + jsonText);

            // Replace the message body with the JSON text.
            byte[] outBytes = Encoding.ASCII.GetBytes(jsonText);
            var memStream = new MemoryStream();
            memStream.Write(outBytes, 0, outBytes.Length);
            memStream.Position = 0;
            bodyPart.Data = memStream;
            pContext.ResourceTracker.AddResource(memStream);
        }
    }

    Trace.WriteLine("DotNetTypesToJsonConverter Pipeline - Exited Execute()");
    return pInMsg;
}

You can refer to the link below:

http://wcfbiztalk.wordpress.com/2013/08/31/json-send-and-receive-pipelines-for-biztalk-server-2/

Thanks

Abhishek


------------------------------------
Reply:

Thanks Abhishek. That's not what I'm trying to do, at least for now ;)

I find it a little pointless to have BizTalk transform JSON to XML, and then transform that XML back to JSON myself. But if BizTalk can't receive JSON, that's probably what I will do.

I need to access some values inside my orchestration.
For example, a scenario could be:

  1. I send the login Query, I receive in the response a sessionID
  2. I send a query with the sessionID, to get the user group ID
  3. I send a query with the sessionID and the groupID, to create a new project in AtTask

So, any ideas whether it's possible to receive the raw JSON (even embedded in an XML tag) in BizTalk?
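For what it's worth, step 1 of the scenario is trivial once the raw JSON is in hand. A minimal Python sketch (outside BizTalk, using a hypothetical trimmed copy of the login response shown earlier):

```python
import json

# Hypothetical, trimmed copy of the AtTask login response.
login_response = '''{
  "userID": "52dfde40003301510c06558a1f246ef4",
  "sessionID": "fe1a78987d46406c9c1e77372423ebda",
  "versionInformation": {"currentAPI": "v4.0"}
}'''

# Step 1: pull the sessionID out of the login reply so it can be
# passed along with the follow-up queries.
session_id = json.loads(login_response)["sessionID"]
print(session_id)  # fe1a78987d46406c9c1e77372423ebda
```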


------------------------------------
Reply:

Hello,

Can you look into the posts below? They might help you.

http://social.msdn.microsoft.com/Forums/en-US/c19d7486-0705-433f-8e1a-a9088d076ed7/restful-service-consume-at-biztalk-2010?forum=biztalkesb

http://holsson.wordpress.com/2009/10/01/implementing-a-custom-biztalk-adapter-as-a-custom-wcf-channel-part-1-send/

Thanks

Abhishek


------------------------------------
Reply:

The thing is, I'm pretty sure BizTalk has no native JSON to Xml capabilities (it would be a pretty big deal), and certainly not BizTalk Server 2010.

Double check your channel configuration, behaviors, encoders, etc.  Maybe one of the samples included a JSON to Xml component such as JSON.Net.


------------------------------------
Reply:

The thing is, I'm pretty sure BizTalk has no native JSON to Xml capabilities (it would be a pretty big deal), and certainly not BizTalk Server 2010.

Double check your channel configuration, behaviors, encoders, etc.  Maybe one of the samples included a JSON to Xml component such as JSON.Net.

Yes, that's what I thought... but my test project is very small, and I'm pretty sure I haven't configured any settings to do that.

In my WCF-Custom transport properties, I just chose the webHttpBinding and kept all the default values.

And I added my custom endpoint behavior. It creates the REST query but does nothing to the response (and, as I said, by the time my behavior executes, the message is already in XML format).

I've read here and there about WCF (which BizTalk uses), and it seems there has been a default .NET XML/JSON serializer/deserializer, at least since .NET Framework 3.5.

I'll continue looking for a solution ...


------------------------------------
Reply:

Hello,

The situation described in this post is quite confusing. All the sources I could find on BizTalk 2010 vs. JSON explicitly say that BizTalk 2010 has no native support for JSON: one has to develop custom components (pipelines, endpoint behaviours, etc.) to enable BizTalk 2010 to handle JSON. However, the OP describes BizTalk 2010 automatically (and mysteriously) receiving XML where JSON is expected.

The issue arises because the .NET Framework represents JSON "as an XML infoset when processed by WCF," which means that "XML APIs are used to access JSON content" (see http://msdn.microsoft.com/en-us/library/bb924435%28v=vs.110%29.aspx). The .NET Framework has behaved this way since version 3.0. Using WCF-Custom to implement an adapter for a REST API is by itself enough for BizTalk 2010 to end up handling JSON this way. If that is not native support, I don't know what is! I doubt that any of the solutions describing how to transform JSON into XML in BizTalk 2010 was ever strictly necessary.
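That mapping can also be undone generically. A minimal Python sketch (illustration only, not BizTalk code) that walks such type-annotated XML back into plain values:

```python
import xml.etree.ElementTree as ET

def infoset_to_value(elem):
    """Map a WCF-style JSON-as-XML element (type="object|string|number|boolean")
    back to the value it encodes."""
    kind = elem.get("type")
    if kind == "object":
        # Children become dictionary entries keyed by tag name.
        return {child.tag: infoset_to_value(child) for child in elem}
    if kind == "number":
        text = elem.text or "0"
        return float(text) if "." in text else int(text)
    if kind == "boolean":
        return elem.text == "true"
    return elem.text or ""  # "string" and anything unrecognized

# A hypothetical trimmed reply in the shape shown earlier in the thread.
reply = ('<root type="object"><data type="object">'
         '<sessionID type="string">abc123</sessionID>'
         '<currency type="object">'
         '<fractionDigits type="number">2</fractionDigits>'
         '<useNegativeSign type="boolean">false</useNegativeSign>'
         '</currency></data></root>')

value = infoset_to_value(ET.fromstring(reply))
print(value["data"]["sessionID"])  # abc123
```

Duplicate tag names within one object would need extra handling; this sketch assumes unique keys, as in the AtTask reply.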

A solution to the OP's question, then, is to use a version of the .NET Framework earlier than 3.0.

I do hope this will save someone time and effort. (Not that many people are trying to use BizTalk 2010 to integrate with RESTful services.)

- Iyad Z


------------------------------------
Reply:

Did you ever find an answer to this other than using .net 3.0 or earlier?  I've run into the exact same situation.

Thanks


------------------------------------
Reply:

Did you ever find an answer to this other than using .net 3.0 or earlier?  I've run into the exact same situation.

Thanks

Hi,

No, sorry.

I was working on a proof of concept and the project didn't go further.


------------------------------------

VB Script, arrays and classes

Hi people

I'm running into a problem involving array references.

In the following code, there is a simple class, and one of its properties (namely "elements") is an array. Next, the class is instantiated and bound to the "var1" object variable. It should follow that "var1.elements" refers to the instance array.

Next, I try to assign a string value to the first 'cell' of the instance array :

var1.elements(0) = "hello"

and an anonymous array (through the Array(...) construct) is assigned to the second cell :

var1.elements(1) = Array("it's a beautiful day")

I can print out the values of var1.elements(0) and var1.elements(1)(0) without problems, but it's a different story when attempting to modify the value of the 'embedded' array:

var1.elements(1)(0) = "some other value"

... just doesn't work. It throws no errors, but the program behaves as if var1.elements(1) returned a copy of the 'embedded' array, and thus does not modify it. I mean that printing out var1.elements(1)(0) still displays "it's a beautiful day".

How come?

This behaviour doesn't occur when using regular arrays instead of object variables:

dim var1(10)

var1(0) = Array("some value")

var1(0)(0) = "some other value"      >>> the 'embedded' array assignment is correctly taken into account.

class Test
    public elements()
    public sub Class_Initialize()
        redim elements(10)
    end sub
end class

REM ----------------------------------------------
dim var1
set var1 = new Test
var1.elements(0) = "hello"
var1.elements(1) = Array("it's a beautiful day")
WScript.echo var1.elements(0), var1.elements(1)(0)
REM >>> will display "hello it's a beautiful day"

var1.elements(0) = "yes it is"
var1.elements(1)(0) = "and how are you?"
WScript.echo var1.elements(0), var1.elements(1)(0)
REM >>> will display "yes it is it's a beautiful day"

Thanks in advance for your input.
  • Edited by Grobu Sunday, June 28, 2015 8:47 PM -
  • Changed type Bill_Stewart Friday, August 14, 2015 7:04 PM

Reply:

Yes, that is the behavior of arrays embedded in classes. Use get/set properties to adjust the class. The assigned array is only a reference and not a copy, so it is not accessible.
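One way to picture the behaviour the OP observed: reading the array through the class hands back a detached copy, so mutating what you read back never touches the stored array, while an explicit setter does. A rough Python analogy (illustration only, not VBScript semantics):

```python
import copy

class Test:
    """Rough analogue of a VBScript class with a public array member:
    reads of `elements` return a throwaway deep copy of the backing store."""
    def __init__(self):
        self._elements = [None] * 10

    @property
    def elements(self):
        return copy.deepcopy(self._elements)  # caller gets a detached copy

    def set_element(self, index, value):
        self._elements[index] = value         # setter mutates the real store

var1 = Test()
var1.set_element(1, ["it's a beautiful day"])
var1.elements[1][0] = "some other value"      # mutates only the copy
print(var1.elements[1][0])                    # still "it's a beautiful day"
```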


\_(ツ)_/


------------------------------------
Reply:

Thanks for your reply but I'm afraid I don't understand.

> THe assigned array is only a reference and not a copy so it is not accessible.

Are you saying that 

var1.elements(0) = Array(0,1)

will assign a reference to the anonymous array to elements(0), and because of that it is not accessible for manipulations outside of the class?

I have conducted the following tests about this class/array problem :

class Test
    public elements()

    public sub Class_Initialize()
        redim elements(0)
    end sub

    sub addArrayFromClass
        elements(0) = Array("M", "N")
    end sub
end class

REM ----------------------------------------------

dim var1, obj
set var1 = new Test

' test with string value - OK
var1.elements(0) = "test"
var1.elements(0) = "other"
WScript.echo var1.elements(0) ' >>> other

' test with object reference - OK
set obj = WScript.CreateObject("Scripting.Dictionary")
obj.item("greetings") = "hello"
set var1.elements(0) = obj
var1.elements(0).item("greetings") = "there"
WScript.echo var1.elements(0).item("greetings") ' >>> there
WScript.echo obj.item("greetings") ' >>> there

' test with anonymous array - FAIL
var1.elements(0) = Array(0, 1)
var1.elements(0)(0) = 2
WScript.echo var1.elements(0)(0) ' >>> 0

' test with copy of other array variable - FAIL
redim myArray(1)
myArray(0) = "A"
myArray(1) = "B"
var1.elements(0) = myArray
var1.elements(0)(0) = "Z"
WScript.echo var1.elements(0)(0) ' >>> A
WScript.echo myArray(0) ' >>> A

' test with anonymous array assigned from inside class - FAIL
var1.AddArrayFromClass
WScript.echo var1.elements(0)(0) ' >>> M
var1.elements(0)(0) = "L"
WScript.echo var1.elements(0)(0) ' >>> M

So it seems that it is possible to manipulate the contents of a class array when the elements are strings, numbers, or objects, but not embedded arrays. This is very annoying and unintuitive. What's the logic behind that?


------------------------------------
Reply:

That is correct.  You cannot do that.  You can supply setter/getter methods and use specific "in-object" methods to manipulate arrays embedded in objects.


\_(ツ)_/


------------------------------------
Reply:

You can replace the reference but not the elements:

class Test
    public elements(2)
end Class

set var1 = new Test
var1.elements(0) = "hello"
var1.elements(1) = Array("it's a beautiful day")

WScript.echo var1.elements(0), var1.elements(1)(0)

var1.elements(0) = "yes it is"
var1.elements(1) = Array("and how are you?")

WScript.echo var1.elements(0), var1.elements(1)(0)


\_(ツ)_/


  • Edited by jrv Monday, June 29, 2015 2:33 AM

------------------------------------
Reply:

Consider the following:

class Test
    public elements(2)
end Class

set var1 = new Test

var1.elements(0) = "hello"
var1.elements(1) = Array("it's a beautiful day")

WScript.echo var1.elements(0), var1.elements(1)(0)

var1.elements(0) = "yes it is"
a2 = Array("and how are you?")
var1.elements(1) = a2

WScript.echo var1.elements(0), var1.elements(1)(0)

a2(0) = "XXXXXXXXXXXXXXXX"
WScript.Echo "A2 === " & a2(0)
var1.elements(1) = a2
WScript.Echo "ELEMENTS ====== " & var1.elements(1)(0)

var1.elements(1)(0) = "YYYYYYYYYY"
WScript.Echo "ELEMENTS 1 ===== " & var1.elements(1)(0)
WScript.Echo "A2 =====" & a2(0)


\_(ツ)_/


------------------------------------
Reply:

Thanks for these examples. Unfortunately a property get/let/set type of solution would be highly impractical in the case of multiple embedded arrays (i.e. a class array containing an array that contains another array). My project involves a class of extensible arrays to be used in another class, and I need that other class to have direct access to the contents of the xArray.

Since I want something purely VBS (no ArrayList), the only workaround I can think of is to reinvent the wheel and completely discard the use of anonymous arrays.

class ArrayObject
    private my_elements()
    private my_count

    public sub Class_Initialize
        redim my_elements(49)
        my_count = 0
    end sub

    public default property get item(index)
        if isObject( my_elements(index) ) then
            set item = my_elements(index)
        else
            item = my_elements(index)
        end if
    end property

    public property let item(index, value)
        if index < 0 or index >= my_count then err.raise 1, "ArrayObject", "subscript out of range"
        my_elements(index) = value
    end property

    public property set item(index, object)
        if index < 0 or index >= my_count then err.raise 1, "ArrayObject", "subscript out of range"
        set my_elements(index) = object
    end property

    public property get count
        count = my_count
    end property

    public sub add(any_type)
        if my_count >= ubound(my_elements) then redim preserve my_elements(my_count + 49)
        if isObject(any_type) then
            set my_elements(my_count) = any_type
        else
            my_elements(my_count) = any_type
        end if
        my_count = my_count + 1
    end sub
end Class

dim var1

set var1 = new ArrayObject

var1.add "it's a boring hot"
var1.add new ArrayObject

var1(1).add "day,"
var1(1).add "and VBScript arrays are outrageously quirky"

WScript.echo var1(0), var1(1)(0), var1(1)(1)

var1(1)(1) = "and next time I'll pick a trustworthy scripting language"

WScript.echo var1(0), var1(1)(0), var1(1)(1)

Next time I'll convince my client to install Python.


------------------------------------
Reply:
Life, at times, can be very disappointing.  I think you should consider a hobby that does not include computers.  I like fishing and horseshoes myself.

\_(ツ)_/


------------------------------------
Reply:
Microsoft, oftentimes, can be very disappointing ...

\_(ツ)_/

So true. Thank goodness there is competition.

------------------------------------
Reply:

Microsoft, oftentimes, can be very disappointing ...


\_(ツ)_/

So true. Thank goodness there is competition.

Sneaky - sneaky.

Where?  In Zanobia?  Even Groucho refused to go to Zanobia.


\_(ツ)_/


------------------------------------

New Build out for Insiders - 10159

Gabe Aul seems hot this week :-D

http://t.co/ohAOkjNVK6


Reply:
Got it about 2 hours ago and converted the ESD to ISO; now time for a clean install. There must have been an issue with 10158. I am sure those with 10158 issues will appreciate the heads-up.

Wanikiya and Dyami--Team Zigzag


------------------------------------

Exchange 2007 SCR Query for mailbox servers

Hello, I am creating an SCR for a CCR environment as below.

We currently have a CCR environment at Site A, but as per the requirement we need to set up only a single node for DR at Site B.

So once I start the installation, do I need to configure clustering? As per my understanding, yes, because I will use RECOVER/CMS. But with only a single node at DR, will clustering work? And will a file share witness for quorum make any difference here, i.e. do I need to create one or not?

My second query is related to ReplayLagTime.

I want the logs, once copied to the DR/target server, to wait 8 hours before being replayed, so I can use the cmdlet below. But should I also set TruncationLagTime along with it or not? The truncation lag period begins after a log has been successfully replayed into the copy of the database, so do I need to set that time as below, or will Exchange sense automatically that the log file has been replayed and truncate it? In short, do I need to specify the time? (The lag values use the days.hours:minutes:seconds format, so 0.8:0:0 is 8 hours.)

Enable-StorageGroupCopy -Id TestDR-SG -StandbyMachine SRV-EX02 -ReplayLagTime 0.8:0:0 -TruncationLagTime 0.11:0:0


Reply:

The simplest way to do what you are trying to do is to upgrade to either Exchange 2010 or Exchange 2013 and configure your DAG so it has three nodes and one is lagged by 8 hours.  It's much simpler to do this in the later versions of Exchange (as well as to fully recover operations during a failure) than it is in Exchange 2007.

If you intend to have the DR system act as your cluster, then yes, you will need to configure clustering - but you won't need to until you have to recover the cluster. You will, however, need to install onto Windows Enterprise so the cluster components are available.

What do you want the truncation lag set for? Your truncation won't occur until there's a successful backup of your database, and once you have a successful backup, you won't need the logs.


Will Martin ...
-join ('77696c6c406d617274696e2d66616d696c6965732e6f7267' -split '(?<=\G.{2})' | ? { $_ } | % { [char][int]"0x$_" })


------------------------------------
Reply:
Thanks, Will, for your reply. We need to go with Exchange 2007 only as of now. I will configure the cluster and see how it goes. Now, about the truncation lag: once a log has been copied and replayed at DR, it will not be truncated until a backup has been completed at the primary site, correct? But can I also set the time if I want to control when log truncation happens? Thanks

------------------------------------
Reply:
That's right - for the most part.  Exchange doesn't truncate log files until they are both ingested into the database and a backup has been completed that includes them.  This includes a full backup (not to be confused with a copy backup, which doesn't truncate old log files) or an incremental backup.  But if you are lagging your log files by 8 hours, at least those 8 hours of log files are always going to be retained on all servers hosting the database in question.

Will Martin ...
-join ('77696c6c406d617274696e2d66616d696c6965732e6f7267' -split '(?<=\G.{2})' | ? { $_ } | % { [char][int]"0x$_" })


------------------------------------
Reply:

Just wanted to ask: if I enable the logging for SCR, will the folders at the DR side be created automatically, or do I need to create the folders (with the same paths) myself?

Thanks


------------------------------------
Reply:
Define "enable logging".  Transaction logs are automatically configured, and if you don't have the required path on your system, Exchange will create them or tell you it can't find the path and make you create it.  Other logging would depend on which logging you are trying to enable.

Will Martin ...
-join ('77696c6c406d617274696e2d66616d696c6965732e6f7267' -split '(?<=\G.{2})' | ? { $_ } | % { [char][int]"0x$_" })


------------------------------------
Reply:

Thanks, Will, for the prompt reply. Just to rephrase the question: once I enable the logging on the existing storage group, will the folder be created automatically on the DR server, or do I need to create it first so the logs are copied to the same path as on the source?

When I run Enable-StorageGroupCopy for SCR, will the folder be created automatically, or do I need to create it first?

I am asking because on most of the drives there are two or more storage groups created, so I hope that should not be a problem.


  • Edited by Jugalkumar Monday, June 22, 2015 12:17 PM

------------------------------------
Reply:
As I said in my response, "Transaction logs are automatically configured, and if you don't have the required path on your system, Exchange will create them or tell you it can't find the path and make you create it".  This is Exchange mailbox database logging, so once you have configured it, the folders will be created. You are always able to create them manually, if you are concerned that things might not be working properly.

Will Martin ...
-join ('77696c6c406d617274696e2d66616d696c6965732e6f7267' -split '(?<=\G.{2})' | ? { $_ } | % { [char][int]"0x$_" })


------------------------------------
Reply:

Sorry for going one level deeper. As I mentioned, on the source Exchange server there is one drive holding a couple of storage groups, and likewise for their logs, respective to their databases, like below. So when I enable log shipping for SG1 and SG2, will it automatically create the folders on the DR Exchange server?

For the databases:

I:\Microsoft Exchange\SG1\SG1.edb

I:\Microsoft Exchange\SG2\SG2.edb

For the logs:

K:\Microsoft Exchange\Logs\SG1

K:\Microsoft Exchange\Logs\SG2

My doubt: on the DR server I have the I: and K: drives configured with the same space, so once I enable log shipping, will the folders be created automatically from the target server's side? I know you mentioned I can create them manually; I am a little afraid because production SGs are also configured on the same drives.


------------------------------------
Reply:

Exchange will allow multiple storage groups on a single drive, but you need to ensure the drive is large enough for what you expect it will be holding.  If you don't have space on the drives for the data you wish to replicate to them, you need to modify your design or add storage to your server.


Will Martin ...
-join ('77696c6c406d617274696e2d66616d696c6965732e6f7267' -split '(?<=\G.{2})' | ? { $_ } | % { [char][int]"0x$_" })


------------------------------------
Reply:

Hello Will,

Just wanted to know: after I run Enable-StorageGroupCopy, do I also need to run Update-StorageGroupCopy to seed the database? Is that required?

Suppose I have a 200 GB database created 10 days back, and I enable log shipping today (after 10 days). How will that 200 GB of old data be replicated to DR? Do I need to manually seed that database?

Do I need to run Update-StorageGroupCopy after enabling log shipping to seed the DB, or will it copy automatically? What will happen to the database that has already been in place for the last 10 days; will it be copied too, or do I need to seed it?

Thanks for your patience and help...


------------------------------------
Reply:

If the database doesn't start seeding automatically, then yes, you will need to seed it.  As for logistics of what happens when you add a lagged database copy, as soon as you give the system that lagged copy, the system will copy the current database, but will not start ingesting log files into it until after the ten-day period has passed.  It will also save all log files (on each server, not just on the lagged server) for the period of the lag (in your example, ten days worth of logs).  So you had better have sufficient drive space on all servers for the expected amount of log files for that lagged period - and you need to include room for growth, just in case.

This is so much easier with Exchange 2010 and Exchange 2013.  You would be much better served by getting a plan together to upgrade your system than trying to bolt on DR pieces to your existing system.


Will Martin ...
-join ('77696c6c406d617274696e2d66616d696c6965732e6f7267' -split '(?<=\G.{2})' | ? { $_ } | % { [char][int]"0x$_" })


------------------------------------
Reply:

Hello Will, I enabled log shipping without setting any ReplayLagTime for one of the test databases, and I observed that logs are getting replicated and the database has been created. I checked the status as well: it shows healthy, the copy queue length is 0, and the replay queue length is somewhere around 150.

But the CatalogData folder has not yet been created on the SCR target; should it exist now, or will it be created at the time of activation?

So does that mean logs are getting into the database and I do not need to seed the database again by running the cmdlets below?

Suspend-StorageGroupCopy

Update-StorageGroupCopy

Resume-StorageGroupCopy

Second: I received a request asking whether there is a way to test SCR with one mailbox database.


  • Edited by Jugalkumar Friday, June 26, 2015 5:35 AM

------------------------------------
Reply:

Yes, you can test your SCR database - by enabling it and invalidating your existing mailbox databases on the clustered systems (which would mean you would need to fully reseed them if you wish to bring them back online).  There are other less disruptive ways, but they require that you either: 1) have a fully operational test environment you can load the database into for your test; or 2) you are willing to fully break off this SCR server, as well as any domain controllers in its Windows site, from your organization and that you will rebuild them all from scratch after you complete your test.  This is the primary reason I initially stated that you would be better served by upgrading your infrastructure to either Exchange 2010 or Exchange 2013 - both of these allow full testing of a DR situation without reseeding or breaking your existing infrastructure.


Will Martin ...
-join ('77696c6c406d617274696e2d66616d696c6965732e6f7267' -split '(?<=\G.{2})' | ? { $_ } | % { [char][int]"0x$_" })


------------------------------------
Reply:

Hi, I enabled log shipping with a ReplayLagTime of 0 for one of the test databases, and I observed that logs are getting replicated and the database has been created. I checked the status as well: it shows healthy, the copy queue length is 0, and the replay queue length is somewhere around 50.

But the CatalogData folder has not yet been created on the SCR target; should it exist now, or will it be created at the time of activation?

So does that mean logs are getting into the database and I do not need to seed the database again by running the cmdlets below, or do I still need to run them?

Suspend-StorageGroupCopy

Update-StorageGroupCopy

Resume-StorageGroupCopy

Second, as I have around 30+ databases of about 200 GB each, is there any best way to replicate them over the WAN link?


------------------------------------
Reply:
OK, no, as I said before, if the database is seeding automatically, you don't need to seed it.  And once you configure the extra copy, you needn't do anything to "assist" with replication over the WAN link.  Exchange will handle it as best as it is able.

Will Martin ...
-join ('77696c6c406d617274696e2d66616d696c6965732e6f7267' -split '(?<=\G.{2})' | ? { $_ } | % { [char][int]"0x$_" })


------------------------------------
Reply:

Thanks, Will, for giving your valuable time.

I enabled log shipping for one of the old databases (approx. 100 GB). It enabled the copy but gave a warning that seeding is required first, so I had to use the Suspend- and Update-StorageGroupCopy cmdlets.

Second, and importantly, the CatalogData folder has still not been created on the SCR target after enabling log shipping. Will the CatalogData folder be created at the time of activation, or do I need to restart the Microsoft Exchange Search service?


------------------------------------
Reply:
Hello Will, any help on the CatalogData folder? It is still not created, as I mentioned above. Does that make sense or not? Thanks

------------------------------------
Reply:

I don't currently have an Exchange 2007 database to test with or investigate on, so I'm not going to be able to give you an answer, valid or otherwise.  Keep in mind that this is an offline copy of the database, and as such, you won't need an index of it until it needs to be used live.  This is counter to how Exchange 2010 databases work, since they can be brought to live status and back to passive copies within a few minutes, so they require the index at all times.

I'd highly recommend that you read the online documentation on SCR - it may have the information you need.  Here's a link:  https://technet.microsoft.com/en-us/library/bb676502%28v=exchg.80%29.aspx


Will Martin ...
-join ('77696c6c406d617274696e2d66616d696c6965732e6f7267' -split '(?<=\G.{2})' | ? { $_ } | % { [char][int]"0x$_" })


------------------------------------

Ways to deploy SharePoint 2010 custom tools

Hi,

We have a SharePoint farm with 2 SharePoint applications.

Our application solution package's deployment setting "Reset Web Server Mode on Upgrade" is set to Recycle.

The questions I have are:

1. Will this setting still work when we add new timer jobs, event receivers, or web parts to the package in future?

2. When exactly should we do an IISRESET on the SharePoint servers? Is this related to any custom developed tools?

Thanks, appreciate your input


  • Changed type USMM Tuesday, June 30, 2015 5:17 PM
  • Edited by USMM Thursday, July 2, 2015 8:22 PM

Reply:

Hi,

According to your description, my understanding is that you want to deploy some SharePoint solutions without downtime.

Please check the blog below:

http://blog.ithinksharepoint.com/2012/07/16/deploying-sharepoint-wsp-solutions-without-downtime/

Or you can try a sandbox solution in SharePoint 2010.

Best Regards,

Dennis


TechNet Community Support
Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact tnmff@microsoft.com.


------------------------------------
Reply:
But even that talks about minimal downtime, one server at a time.

------------------------------------

User account expiration (not password) script, to retrieve users whose password is going to expire in 14 days and send email to the respective user's manager attribute.

User account expiration (not password) script, to retrieve users whose password is going to expire in 14 days and send email to the respective user's manager attribute.

Reply:

Check the following scripts that will email your users when their password is due to expire:

https://gallery.technet.microsoft.com/f7f5f7ed-14ee-4d0e-81c2-7d95ce7e08f5

https://gallery.technet.microsoft.com/Password-Expiration-35615c06

https://gallery.technet.microsoft.com/Account-Expiry-Email-Alert-968c487e


Cheers,

Andrew

MCSE, MCSA, VCP, CCNA, SNIA

Microsoft Infrastructure Consultant

Blog: Network Angel LinkedIn:

Note: Please remember to mark as "propose as answer" to help other members. Posts are provided "AS IS" without warranty of any kind, either expressed or implied, including but not limited to the implied warranties of merchantability and/or fitness for a particular purpose.


------------------------------------
Reply:

The question is confusing to me. Do you want the email sent when the user password is about to expire, or when the account is about to expire (as shown on the "Account" tab of ADUC)? These are very different. If the email is to be sent to the user's manager, I suspect you mean the account expiration date, not when the password expires, since the manager can do nothing about the password.

Such a script would query for all users where the accountExpires attribute is 14 days or less in the future. For all such users the script would retrieve the DN value in the manager attribute of the user (if there is a value), then retrieve the attributes of that manager and send a message to the manager's mail attribute (or perhaps the primary address in the proxyAddresses attribute). This is a bit more complex than most people assume. Also, you must decide what to do if either the manager attribute of the user or the mail attribute of the manager (or the proxyAddresses attribute) is missing.
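The flow described here can be sketched as follows. This is a Python sketch with hypothetical in-memory stand-ins for the directory lookups (in practice these would be LDAP queries); only the attribute names accountExpires, manager, and mail are taken from AD.

```python
from datetime import datetime, timedelta

# Hypothetical stand-ins for directory query results.
users = [
    {"dn": "CN=Ann",  "accountExpires": datetime.now() + timedelta(days=10),
     "manager": "CN=Boss"},
    {"dn": "CN=Bob",  "accountExpires": datetime.now() + timedelta(days=60),
     "manager": "CN=Boss"},
    {"dn": "CN=Carl", "accountExpires": datetime.now() + timedelta(days=3),
     "manager": None},                      # missing manager attribute
]
managers = {"CN=Boss": {"mail": "boss@example.com"}}


def managers_to_notify(users, managers, window_days=14):
    """Return (user dn, manager mail) pairs for accounts expiring soon."""
    now = datetime.now()
    cutoff = now + timedelta(days=window_days)
    notices = []
    for user in users:
        if not (now <= user["accountExpires"] <= cutoff):
            continue                        # not expiring within the window
        manager_dn = user.get("manager")
        if manager_dn is None:
            continue                        # decide separately what to do here
        mail = managers.get(manager_dn, {}).get("mail")
        if mail:                            # manager may lack a mail attribute
            notices.append((user["dn"], mail))
    return notices


print(managers_to_notify(users, managers))  # [('CN=Ann', 'boss@example.com')]
```

Each resulting pair would then be handed to whatever mail-sending mechanism the environment provides.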


Richard Mueller - MVP Directory Services


------------------------------------

Windows 8.1 Client Support on SBS 2011 Essentials

Does anyone know if Windows 8.1 client PCs are supported, or will be (with a forthcoming update rollup), on an SBS 2011 Essentials network?

By that I mean full support: using the Server Dashboard to manage them and setting up backup and redirection policies....

Thanks

  • Changed type Justin Gu Monday, November 25, 2013 7:24 AM

Reply:

Hi,

As I know, Windows 8.1 client is supported on SBS 2011 Essentials network.

For more details, please refer to the following article.

Supported operating systems for client computers

http://technet.microsoft.com/en-us/library/supported-operating-systems-1.aspx

Hope this helps.

Best regards,

Justin Gu

------------------------------------
Reply:

In the link you posted there is NO indication that Win8.1 Pro clients are supported. Where exactly did you see that?

I am looking for "solid evidence" that Win8.1 Pro is supported under Dashboard management of Small Business Server 2011 Essentials before making any commitment to this.

Thanks in advance to anyone who has that information....


------------------------------------
Reply:

Hi,

Sorry for my mistakes.

As I know, it seems there are no Microsoft articles that indicate Win8.1 Pro is supported under Dashboard management of Small Business Server 2011 Essentials. You can contact Microsoft Customer Service; I believe you will get an accurate response there.

 

Microsoft Customer Service

http://support.microsoft.com/gp/contact_microsoft_customer_serv?&fr=1&wa=wsignin1.0

 

Hope this helps.

Best regards,

Justin Gu

------------------------------------
Reply:

Windows 8.1 Pro does not connect to SBS 2011 as a domain controller or workgroup;

you are forced to use the Home Group connector,

which you need to change from read-only access (the default) to read and write

for each shared area used by Windows 8.1 devices; this takes about a week.

Windows 8.1 can still connect to SBS 2011 as a domain server or workgroup.

Windows 8 is aimed at consumers and a tablet-mode GUI.


------------------------------------
Reply:
I'm not following your comments?  8.1 pro can domain join to a SBS/or Essentials domain?

Unfortunately TechNet isn't coming back, sorry folks :-(


------------------------------------
Reply:

Hi Wiseman

Sorry to let you know, but Connector wizard will not work with Windows 8.1 Pro.  It goes through all the steps to require user credentials for the domain, then gets an unexpected error.  I tried this a number of times after rebooting, removing the connector software and reloading.  I also tried with the KB2790621 program instead of web connect command.

You can join the computer to a domain the old way, but this does not give you the Dashboard or Launchpad configuration - not a big issue for me, as I did not want them for these computers.

Michael


------------------------------------
Reply:
What happens if you join the domain manually, then run the connector?  Or do the regedit not to join the domain after you have joined it anyway and run the connector?

Grey


------------------------------------
Reply:

Dear Michael,

Thanks for confirming my fears. Win8 is the last version of client OS supported under Windows SBS 2011 Essentials in terms of Connector/Backup/Dashboard functionality.

Nevertheless, it is still the version of Essentials that has the widest client OS support (from WinXP to Win8).

Thanks to everyone trying to answer this question for me.


------------------------------------
Reply:
Hold the phone on this verdict, I think it's wrong

Unfortunately TechNet subscriptions aren't coming back, sorry folks :-(


------------------------------------
Reply:
Confirmed it is supported.  Email me at susan-at-msmvps.com (change the at to @) and we will set up a support case

Unfortunately TechNet subscriptions aren't coming back, sorry folks :-(


------------------------------------
Reply:

Email is in your inbox, Susan....

Thanks


------------------------------------
Reply:
You sure you sent it?  Try susan-at-sbslinks.com (change the -at- to @) as I haven't seen an email?

Unfortunately TechNet subscriptions aren't coming back, sorry folks :-(


------------------------------------
Reply:

Email is in your inbox for a second time, my dear Susan....

Thanks


------------------------------------
Reply:
To close the loop on this: 8.1 is fully supported on SBS 2011 Essentials and Standard.

Unfortunately TechNet subscriptions aren't coming back, sorry folks :-(


------------------------------------
Reply:

Hello Susan,

How did you find out Win 8.1 is fully supported on SBS 2011 Essentials?

Regards,

Yves Leduc - MCSE


Yves Leduc - MCSE, SMB Specialist, MS Cloud Partner


------------------------------------

visual studio online for update check using powershell

Guys, I am using Visual Studio Online as a version control solution for my PowerShell projects.

I want to implement this scenario: when I start one of my PowerShell scripts, it has to

1 - go to a link online and check whether the version in the destination file (stored in a variable) is the same as its source

2 - if the destination version is older than the source, download the files from a location (I don't know how to provide the files yet)

3 - after downloading the files, close the old scripts, replace all the scripts, and run the new ones again.

What could you suggest to make this scenario work?

I am open to any alternatives.

thanks.


  • Edited by Vagho Tuesday, June 30, 2015 1:32 PM

Reply:
I recommend that you solve the file problem and post your script with a specific question. We cannot provide consulting services or free scripting.

\_(ツ)_/


------------------------------------
Reply:

I have opened an open discussion, not a question. Please don't be so rude; if you don't want to provide any help, then don't comment or reply.

thanks,


------------------------------------
Reply:

What is the discussion? Your intro is very vague.

What online files? Online where? What is it about a link that tells you the version?

You have to provide accurate and useful information. Just calling this a discussion does not change the nature of your request.

If you want to access a web site, use Invoke-WebRequest. How you would use it is impossible to say from the limited information you have given.

To compare two version numbers, use "-gt" or "-lt" in a logic statement. How do you close scripts? That makes even less sense without more information.

There is really nothing to discuss. You just need to solve these issues and write your script.
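The check-then-download flow being discussed can be sketched in outline (shown here in Python; the URLs, file names, and the idea of a separate VERSION file are assumptions for illustration, not real Visual Studio Online endpoints):

```python
import urllib.request
from pathlib import Path

# Hypothetical locations -- substitute the raw-file URLs of your own repository.
VERSION_URL = "https://example.com/repo/VERSION"
SCRIPT_URL = "https://example.com/repo/script.ps1"
LOCAL_DIR = Path(".")


def is_newer(remote, local):
    """Compare dotted version strings numerically, so '1.10' beats '1.9'."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(remote) > as_tuple(local)


def update_if_newer():
    """Fetch the remote version; download new files and report True when
    the caller should relaunch itself with the replaced scripts."""
    with urllib.request.urlopen(VERSION_URL) as resp:
        remote = resp.read().decode().strip()
    local_file = LOCAL_DIR / "VERSION"
    local = local_file.read_text().strip() if local_file.exists() else "0"
    if is_newer(remote, local):
        for url, name in ((SCRIPT_URL, "script.ps1"), (VERSION_URL, "VERSION")):
            urllib.request.urlretrieve(url, LOCAL_DIR / name)
        return True
    return False
```

The same shape translates directly to PowerShell with Invoke-WebRequest; the "close and relaunch" step is left to the caller, since a running script cannot safely replace itself mid-execution.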


\_(ツ)_/


------------------------------------
Reply:
OK, thanks for that helpful information.

------------------------------------
Reply:
You would do better if you learned a bit of PowerShell and then asked a specific question.  It sounds like you are trying to ask how you can retrieve a script from a version control system but then you ask about web links.  Rethink what you are stating and what you are trying to accomplish and formulate a clear question.  The answer is likely easier than all of the discussion.

\_(ツ)_/


------------------------------------
Reply:
I am learning PowerShell along the way, and coding too.

I have my scripts stored under version control in my Visual Studio Online account.

I wanted to use that same storage as an online file location, so that my scripts deployed at different locations connect to my repository (which is now my Visual Studio Online storage), check the content of a file for a version difference, and, if there is a new version, download the newer files.

------------------------------------
Reply:

Not a very secure way to distribute scripts. You could publish them to OneDrive and share them; then you can read the file date remotely, and the scripts would be automatically upgraded whenever you publish a new version.


\_(ツ)_/


------------------------------------
Reply:

Here is a link to a file list that can be used as a source: http://1drv.ms/1eXYc6N. Because it is OneDrive, we can use an API to retrieve the info, and it can be shared with no authentication required.


\_(ツ)_/


  • Edited by jrv Tuesday, June 30, 2015 2:40 PM

------------------------------------
Reply:

Thanks a lot, I will create a OneDrive account and see what I can do with it.

thanks again.


------------------------------------

Cloud ERP Successful Implementation Study

I am doing a study on the successful implementation of cloud-based services. Please fill in the questionnaire at the link below:

http://goo.gl/forms/LbyxMPlgpr

This is a survey done for PhD-related work. It won't take more than 10 minutes of your time. This survey is undertaken only for academic purposes.

------------------------------------

[Forum FAQ] How do I send multiple rows returned by Execute SQL Task as Email content in SQL Server Integration Services?

Question:
There is a scenario where users want to send multiple rows returned by an Execute SQL Task as email content. In an Execute SQL Task, the Full result set option is used when the query returns multiple rows; it must be mapped to a variable of the Object data type, and the returned result is a rowset object, so we cannot directly send the result variable as email content. Is there a way to extract the table row values stored in the Object variable and send them as email content?

Answer:
To achieve this requirement, we can use a Foreach Loop container to extract the table row values stored in the Object variable into package variables, then use a Script Task to append the data stored in those package variables to a message variable, and then set that variable as the MessageSource in the Send Mail Task.

  1. Add four variables in the package as below:

  2. Double-click the Execute SQL Task to open the Execute SQL Task Editor, then change the ResultSet property to "Full result set". Assume the SQL statement looks like below:
    SELECT   Category, CntRecords
    FROM         [table_name]

  3. In the Result Set pane, add a result like below (please note that we must use 0 as the result set name when the result set type is Full result set):

  4. Drag a Foreach Loop Container connects to the Execute SQL Task. 
  5. Double-click the Foreach Loop Container to open the Foreach Loop Editor, in the Collection tab, change the Enumerator to Foreach ADO Enumerator, then select User:result as ADO object source variable.

  6. Click the Variable Mappings pane, add two Variables as below:

  7. Drag a Script Task within the Foreach Loop Container.
    The following C# code can be used in a Script Task in SSIS 2008 and above:
    public void Main()
    {
        Variables varCollection = null;
        string message = string.Empty;
        Dts.VariableDispenser.LockForWrite("User::Message");
        Dts.VariableDispenser.LockForWrite("User::Category");
        Dts.VariableDispenser.LockForWrite("User::CntRecords");
        Dts.VariableDispenser.GetVariables(ref varCollection);

        // Format the query result with tab delimiters
        message = string.Format("{0}\t{1}\n",
                                varCollection["User::Category"].Value,
                                varCollection["User::CntRecords"].Value);

        varCollection["User::Message"].Value = varCollection["User::Message"].Value + message;
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    The following VB code can be used in the Script Task in SSIS 2005 and later; please note that in SSIS 2005 we should set the PrecompileScriptIntoBinaryCode property to False and the Run64BitRuntime property to False:
    Public Sub Main()
            '
            ' Add your code here
            '
            Dim varCollection As Variables = Nothing
            Dim message As String = String.Empty
            Dts.VariableDispenser.LockForWrite("User::Message")
            Dts.VariableDispenser.LockForWrite("User::Category")
            Dts.VariableDispenser.LockForWrite("User::CntRecords")
            Dts.VariableDispenser.GetVariables(varCollection)
            'Format the query result with tab delimiters
            message = String.Format("{0}" & vbTab & "{1}" & vbLf, varCollection("User::Category").Value, varCollection("User::CntRecords").Value)
            varCollection("User::Message").Value = DirectCast(varCollection("User::Message").Value,String) + message
            Dts.TaskResult = ScriptResults.Success
    End Sub
  8. Drag a Send Mail Task onto the Control Flow pane and connect it to the Foreach Loop Container.
  9. Double-click the Send Mail Task to specify the appropriate settings; then, in the Expressions tab, use the Message variable as the MessageSource property, as below:

  10. The final design surface looks like the following:
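For readers who want to check the string-building logic outside of SSIS, here is a minimal Python sketch of what the Foreach Loop and Script Task do together: iterate the rows of the result set and append each one to a tab-delimited message. The rows below are made-up illustration data, not anything from the package.

```python
# Sketch only (not SSIS code): simulate the Foreach Loop + Script Task logic.
# 'rows' stands in for the Full result set stored in the User::result Object
# variable; category/cnt_records mirror the mapped package variables.
rows = [("Bikes", 120), ("Helmets", 45), ("Gloves", 300)]  # hypothetical data

message = ""  # corresponds to the User::Message package variable
for category, cnt_records in rows:  # what the Foreach ADO Enumerator does
    # Script Task body: append one tab-delimited line per row
    message += "{0}\t{1}\n".format(category, cnt_records)

print(message)
```

The final `message` string is exactly what the Send Mail Task would receive as its MessageSource.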


References:
Result Sets in the Execute SQL Task

Applies to:
Integration Services 2005
Integration Services 2008
Integration Services 2008 R2
Integration Services 2012
Integration Services 2014


Please click to vote if the post helps you. This can be beneficial to other community members reading the thread.


Reply:
Thank you for the post, but the images starting from bullet #3 did not make it in. That seems to be due to their size.

Arthur

MyBlog


Twitter


------------------------------------

Retrieving latest file and pc name

I am working to get a script to run and pull the computer name and the latest file in a directory. The end goal would be to run this report on all 5k+ computers on the domain.

Here is what I have so far:

gci c:\users\*\appdata\roaming\microsoft\office\recent | sort lastwritetime | select -last 1 | export-csv C:\office.csv

This runs and exports correctly, giving me the timestamp of the last file in that directory, but I have not been able to append the name of the computer it runs on to the CSV file. Any help would be appreciated; I had not used PowerShell before today.


Reply:

I'm not exactly sure what you're really trying to do, but one of these two should help you:

Get-ChildItem c:\users\*\appdata\roaming\microsoft\office\recent |
    Sort LastWriteTime |
    Select -Last 1 |
    Export-Csv .\"$($env:COMPUTERNAME)-Office.csv"

Get-ChildItem c:\users\*\appdata\roaming\microsoft\office\recent |
    Sort LastWriteTime |
    Select Name,FullName,LastWriteTime,@{N='ComputerName';E={$env:COMPUTERNAME}} -Last 1 |
    Export-Csv .\Office.csv


Don't retire TechNet! - (Don't give up yet - 13,225+ strong and growing)


------------------------------------

Sorry, something went wrong. Please try again. Power BI refresh.

During a scheduled refresh of a Power Pivot workbook, Power BI posts the following error:

"We tried to run the data refresh several times, but refresh failed repeatedly. We've turned OFF the refresh schedule.

FAILURE INFORMATION
Sorry, something went wrong. Please try again.
Correlation ID: f27fa61e-cacb-4b15-8bd6-215f644df5e8 "

Power BI is set up with the Data Management Gateway to an on-premises SQL Server 2008 R2. The workbook had been refreshing fine until now, and nothing has been changed.

Any suggestions? Thanks in advance.


Reply:

Any suggestions for Jonas?

Thanks!


Ed Price, Azure & Power BI Customer Program Manager (Blog, Small Basic, Wiki Ninjas, Wiki)

Answer an interesting question? Create a wiki article about it!


------------------------------------
Reply:

Jonas, is this still an issue?

Thank you!


Ed Price, Azure & Power BI Customer Program Manager (Blog, Small Basic, Wiki Ninjas, Wiki)

Answer an interesting question? Create a wiki article about it!


------------------------------------
Reply:

Hi Ed, I have a similar problem to the one Jonas describes: I need to schedule a data refresh against an on-premises database.

There is only one table involved, and no design of any kind, just the data source. The Excel sheet connects perfectly to the data source via Power Query; however, the scheduled refresh doesn't work.

It seems the data source is never even tried, and the error occurs before it connects. I'm running the latest versions of everything, and there are no graphs, pivots, etc. in the Excel files.

Error messages

ERROR: Sorry, something went wrong. Please try again. Correlation ID: 048237f8-e24c-4e83-9060-4e5e4d056a0d

Power Query - SERVERNAME;MicrosoftDynamicsAX                                                             Not tried


------------------------------------
Reply:
Any response here? I am having the same issue.

------------------------------------
Reply:

I have the same problem - scheduled data refresh gives error:

Sorry, something went wrong. Please try again. Correlation ID: b1cdedd3-c63d-4724-ba2c-c50d84a888a2

It used to refresh normally until a few weeks ago, when it stopped refreshing with this error.

My Excel report has a Power Query that takes data from a web page.

Any help possible?


------------------------------------
Reply:

Hello,

Same problem: for a few days now, the data refresh has failed with the error message "Not Tried".

What does it mean? How can we resolve this?



------------------------------------
Reply:

Same problem here - a report that has refreshed for weeks fails since June 6. We didn't publish a new version of the report that day. The data can be refreshed from within Excel without problems.

Even after installing the latest version of the DMG yesterday, I keep getting the same error. The error is thrown about 30 minutes after the scheduled time for refresh.

Please help!


------------------------------------
Reply:

I have a possible workaround: delete the workbook from the Power BI site and upload it again. So, don't update the file; delete it and upload a new one. After this, scheduled refresh seems to work again, although it took a few tries.

The drawback of this is that you lose the refresh history, and all users who favorited the report need to do this again for the report to show in their portal.

It's still a mystery what's going on here, though...


------------------------------------
Reply:

Just as an update for all of you: my report refresh stopped working again during the weekend, on what looks like a time-out error (though the error message is all gibberish).

This time, however, installing the latest update of the DMG client appears to solve the issue. Yes, there was another update released on June 23!


------------------------------------

Installing SQL Server in Windows Server 2012 hardened

I am a systems guy. I have had this issue for a few days already, and I want to make it work.

I set up a Windows Server 2012 R2 machine and hardened the OS with the CIS template.

I installed Microsoft SQL Server 2012 on this hardened server. However, the following features failed:

1) Database Engine Services Failed
2) Data Quality Services failed
3) Full-Text and Semantic Extractions for Search failed
4) SQL Server Replication failed
5) Reporting Services - Native failed

I believe I am lacking some permissions or GPO rights for the setup account or service accounts. Can any experts tip me off? I have the logs and can upload them if needed.

Reply:

Well, to help you further I need to see the summary.txt file generated after the failed installation. Please use the link below to locate it and post the output here:

https://msdn.microsoft.com/en-us/library/ms143702(v=sql.120).aspx


Please mark this reply as answer if it solved your issue or vote as helpful if it helped so that other forum members can benefit from it

My Technet Wiki Article

MVP


------------------------------------

[Forum FAQ]Install MSI file using command remotely and silently

Scenario
For a small organization that cannot afford SCCM but still needs to deploy software or third-party software manually after a Windows 7 deployment/image, we can use the PsExec and msiexec commands to silently install .msi files on domain-joined or workgroup computers.

Additionally, the .msi file can be stored in a shared folder. These commands work in both workgroup and Microsoft domain environments.

Solution

We have two domain-joined Windows 7 machines. (I'd suggest using another client machine as the remote client, since we need to place extra files under the System32 folder.)

The computer that needs the software deployed is named v-wdi076848vm (logged in with a domain user account), and we need to install third-party software on it from another domain client (logged in with a domain administrator account).

Steps:

  1. Download PsTools from Windows Sysinternals:

       https://technet.microsoft.com/en-us/sysinternals/bb896649.aspx

     The PsExec tool is included in the PsTools suite, which is downloadable as a package.

  2. Extract the pstools.zip file and copy its files to the folder C:\Windows\System32.
  3. Run the command psexec \\<computer name> -u <username> -p <password> cmd to launch an interactive command prompt on the remote computer v-wdi076848vm. The first time you run this command, a license agreement window may appear; click Agree, then wait for the remote command prompt to open.
  4. The path of the shared 7zip.msi file is \\V-WDI076748VM\share\7zip.msi
  5. Run the command msiexec /i <path of package> /qn to install the .msi package.
  6. The process runs silently and shows up in Task Manager or via the tasklist command; after a while, the software (7-Zip in this example) will be installed. (Sometimes the installation is very fast, depending on the software you choose to install.)
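Steps 3 to 5 can also be collapsed into a single one-shot command rather than an interactive prompt. The following Python sketch only assembles that command line for inspection; it does not run anything, and the account name, password, and paths are the example values from this post (the helper function is hypothetical, not part of PsTools):

```python
# Sketch: build the PsExec + msiexec command line from the steps above.
# Nothing is executed here; we only assemble the argument list so it can be
# reviewed before running it on a real Windows machine.
def build_remote_install_command(computer, username, password, msi_path):
    """Return the psexec argument list that runs msiexec silently on a remote PC."""
    return [
        "psexec", r"\\" + computer,   # target computer (UNC-style name)
        "-u", username,               # account to run as on the remote machine
        "-p", password,
        "msiexec", "/i", msi_path,    # install the given .msi package
        "/qn",                        # quiet, no UI
    ]

cmd = build_remote_install_command(
    "v-wdi076848vm", r"DOMAIN\admin", "P@ssw0rd",
    r"\\V-WDI076748VM\share\7zip.msi",
)
print(" ".join(cmd))
```

On a real deployment you would pass this list to the shell (or `subprocess.run` from an admin workstation) instead of printing it.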

Please click to vote if the post helps you. This can be beneficial to other community members reading the thread.
