sub report formatting
How can I position my subreport adjacent to the main report so it functions as if there were no subreport on the end? Currently I have to match up the main report headings with the subreport headings. Is there a more precise way?
Reply:
Hi,
Make sure the column sequence and column widths are the same in both reports.
Hope this helps!
Sanjeewan
------------------------------------
Technical difference between Adapter and Pipelines
Reply:
On a very basic level, the adapters provide the transport capability (TCP, HTTP, FILE, MSMQ...), the pipelines provide normalization (Decoding, Disassembly, Assembly, Encoding,...). You can download the following posters to get an idea of where everything fits in BizTalk.
BizTalk Posters Link
http://msdn.microsoft.com/en-us/library/ff742262.aspx
David Downing... If this answers your question, please Mark as the Answer. If this post is helpful, please vote as helpful.
------------------------------------
In a SharePoint 2010 event calendar, which column can be used as an edit area that allows the user to embed links, images, video, and text?
Reply:
Two possible options:
1. You can use a meeting workspace with the calendar event.
2. You can add a custom column to the calendar to store the information.
Thanks,
Sahil
------------------------------------
How to get all task list items from all sites in a web application
Hi,
In my scenario, I need to display all task items from all sites and subsites in a web application. For example, my site has 5 subsites, and each subsite has a task list. I need to display, in site 3, all tasks from all the sites.
I wrote a custom web part using SPWebCollection, but it asks for authentication for the Assigned To user as well.
Please help me.
Reply:
Hi there
So to confirm this: you are trying to query multiple site collections, and so SPSiteDataQuery and other cross-site query mechanisms are not available?
In this case you should be running this code as a user with permission to all site collections, in other words, someone with full control to the entire web application. If the user context in SharePoint is not that of a user with this level of permissions then you may want to look at using this http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spsecurity.runwithelevatedprivileges.aspx
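If the sites are in fact all within one site collection, SPSiteDataQuery can aggregate task items across webs. A minimal sketch of the CAML involved, assuming the standard Tasks list template (107) and the built-in AssignedTo field (field names and values here are illustrative, not taken from the original post):

```xml
<!-- query.Lists: restrict to Tasks lists (server template 107) -->
<Lists ServerTemplate="107" />

<!-- query.Webs: recurse through every web in the site collection -->
<Webs Scope="Recursive" />

<!-- query.Query: items assigned to the current user -->
<Where>
  <Eq>
    <FieldRef Name="AssignedTo" />
    <Value Type="Integer"><UserID /></Value>
  </Eq>
</Where>

<!-- query.ViewFields -->
<FieldRef Name="Title" /><FieldRef Name="DueDate" />
```

Each fragment is assigned to the corresponding SPSiteDataQuery property before calling SPWeb.GetSiteData.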
Thanks
Stuart
Read my wiki at www.intheknow.it for more code and tips for developing with SharePoint 2007 & 2010
Twitter: @starznet
Technical Architect at Starznet Ltd. WSS/MOSS development and customisation with a primary focus on CMS.
------------------------------------
Outlook 2010 not syncing with Exchange 2010; iOS devices fine.
Where can I find logs to troubleshoot this issue?
This is happening on 2 of my client PCs.
Thanks for any help.
PS: it's not happening on my iOS devices.
Reply:
Synchronizing Outlook is a completely different story from syncing mobile devices. Assuming your Outlook client is running in cached mode, switch to the folder list view and see if there's a "Sync Issues" folder with a possible clue about the problem.
Also, did you check out this article?
http://support.microsoft.com/kb/842284
Michel de Rooij,
MCITP Ent.Msg 2007+2010| MCTS W2008, Ex2007+2010 Conf, OCS2007 Cfg | MCSE+Msg2k3 | MCSE+Inet2k3 | Prince2 Fnd | ITIL
Check my blog at http://eightwone.com/ or follow me on twitter.com/mderooij
------------------------------------
Reply:
Hi,
I had a look at the Sync Issues folder, and there was one instance from this morning at 9:35 when Outlook failed to sync deletions, but that would not explain the problems before today and for the rest of today. My Outlook shows as connected with "All folders are up to date" next to it, though I just noticed it says "updating address book" and looks like it's hanging on that. Could that be causing the problem? I assume so, as wouldn't Outlook have to complete this task before starting normal activity?
Thanks for your time.
regards
Gordon
------------------------------------
libiodbc.so & libodbc.so error: cannot open shared object file.
#include <cstdio>    // for printf
#include <SQLAPI.h>  // main SQLAPI++ header
#include <iostream>

int main(int argc, char* argv[])
{
    SAConnection con; // create connection object
    try
    {
        // connect to database
        // in this example it is Oracle,
        // but can also be Sybase, Informix, DB2,
        // SQLServer, InterBase, SQLBase and ODBC
        con.Connect(
            "test",    // database name
            "tester",  // user name
            "tester",  // password
            SA_ODBC_Client);
        printf("We are connected!\n");

        // Disconnect is optional
        // autodisconnect will occur in destructor if needed
        con.Disconnect();
        printf("We are disconnected!\n");
    }
    catch (SAException &x)
    {
        // SAConnection::Rollback() can also throw an exception
        // (if a network error, for example), so be ready
        try
        {
            // on error, roll back changes
            con.Rollback();
        }
        catch (SAException &) {}
        // print error message
        printf("%s\n", (const char*)x.ErrText());
    }
    return 0;
}

Well, I found out what the problem was: it seems I was passing the Oracle connection parameter in my source program.
The correct one should be SA_ODBC_Client, which is what I need for the SQL Server connection. Of course, another story started to play out as soon as this bug was identified.
bash-3.1# g++ dbConnection.obj -o dbConnection -L/usr/lib/SQLAPI/lib -s -lsqlapi -lc -lm -lcrypt -ldl
bash-3.1# export LD_LIBRARY_PATH=/usr/lib/SQLAPI/lib
bash-3.1# ./dbConnection
libiodbc.so: cannot open shared object file: No such file or directory
libiodbc.so.3: cannot open shared object file: No such file or directory
libiodbc.so.2: cannot open shared object file: No such file or directory
libodbc.so: cannot open shared object file: No such file or directory
libodbc.so.1: cannot open shared object file: No such file or directory
DBMS API Library loading fails
This library is a part of DBMS client installation, not SQLAPI++
Make sure DBMS client is installed and
this required library is available for dynamic loading
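Loader errors like the ones above mean the dynamic linker cannot locate the ODBC client libraries at run time. One way to see exactly which shared objects are unresolved is ldd; a minimal sketch (using /bin/sh as a self-contained stand-in for ./dbConnection, and an illustrative library path):

```shell
# List the shared objects a binary depends on; unresolved ones show "not found"
ldd /bin/sh

# An empty grep result means every dependency was resolved
ldd /bin/sh | grep "not found" || echo "all shared libraries resolved"

# If a library lives in a non-standard directory, point the loader at it,
# e.g. (path is illustrative, adjust to your unixODBC install):
#   export LD_LIBRARY_PATH=/usr/local/unixODBC/lib:$LD_LIBRARY_PATH
```

Running ldd against the failing dbConnection binary would show which of libiodbc.so / libodbc.so the loader is searching for and failing to find.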
I think the SQL Server driver is not included with SQLAPI++, so I need to install the unixODBC driver for SQL Server. Do you agree? Moreover, if that is the case, I may not need SQLAPI++ anymore, since I guess the ODBC driver manager supports the SQL APIs directly.
Looking forward to hearing from you.
Saman
- Moved by SSISJoostMVP Wednesday, June 19, 2013 6:26 PM Not SSIS related
help with seeing web page
- Changed type Niki Han Wednesday, February 22, 2012 7:11 AM
Reply:
Hi,
Does this happen on every web page you visit? If not, on which site or page?
Which Windows and IE versions are you using?
Regards.
Rob^_^
------------------------------------
Reply:
What exactly do you see? What is displayed on the part of the website that you cannot see?
Please paste a screen dump.
------------------------------------
Shared Calendar in mixed mode
In mixed mode Ex2k10/Ex2k3, if a user on Ex2k3 shares his calendar with a user on Ex2k10 and we close port 135 between both servers (we are using Outlook Anywhere), then the calendar does not show details; it shows just free or busy.
If we open RPC port 135, the calendar is shown correctly.
My question is: why is RPC port 135 needed?
I tested sharing a calendar between 2 users on Exchange 2k10 (in the same site as Exchange 2003) and there is no issue.
So:
1- EX2k3 to Ex2k3: No Issue with port RPC 135 closed
2- Ex2k10 to Ex2k10: No Issue with port RPC 135 closed
3- Ex2k3 to Ex2k10: Calendar shown with No details with port RPC 135 closed
4- Ex2k3 to Ex2k10: Calendar shown with details with port RPC 135 opened.
5- Ex2k10 to Ex2k3: Calendar shown with No details with port RPC 135 closed
6- Ex2k10 to Ex2k3: Calendar shown with details with port RPC 135 opened
Can anyone explain why RPC port 135 is needed in mixed mode in this case, please?
Reply:
- Merged by wendy_liu Friday, March 2, 2012 3:13 AM duplicate
------------------------------------
Reply:
Why are you closing port 135 between the servers? When the shared calendar needs to be accessed for the details, it uses RPC to connect to it.
You can see free/busy because Exchange 2010 uses HTTP to connect to the 2003 Public folder store virtual directory to access that and on the 2003 side, it looks at the free/busy folder in the pf store.
------------------------------------
Reply:
The company policy requires closing all RPC traffic between both sites.
Could you please give me more details, or any MS article, about this statement:
"When the shared calendar needs to be accessed for the details, it uses RPC to connect to it."
It is clear there that it is not supported to install a firewall between Exchange servers: http://technet.microsoft.com/en-us/library/bb331973.aspx
On the other hand, Microsoft says that Outlook Anywhere does not need any RPC connections to be made there:
- Edited by HMondher Tuesday, January 31, 2012 2:51 PM
------------------------------------
Reply:
http://technet.microsoft.com/en-us/library/bb232134.aspx
Understanding the Availability Service
Check the flow chart for "Detailed" information.
Note also that blocking Exchange required ports between servers in the same forest is not supported.
------------------------------------
Reply:
Hi A_D
Thanks for the reply. This article does not explain how it works for Outlook Anywhere and which ports are required.
Excuse me for the confusion:
We did not block any RPC communication between servers, just between servers and clients using OA.
From our tests we found that:
A- If UserA is on EX2k3A and shares his calendar with UserB residing on the EX2k3B server: no issue; we can see details even if port 135 is blocked between EX2k3B and UserA.
B- If UserA is on EX2k10A and shares his calendar with UserB residing on the EX2k10B server: no issue; we can see details even if port 135 is blocked between EX2k10B and UserA.
C- If UserA is on EX2k3A and shares his calendar with UserB residing on the EX2k10B server: we cannot view details.
D- If UserA is on EX2k10A and shares his calendar with UserB residing on the EX2k3B server: we cannot view details.
The conclusion is:
If the user who shared the calendar is not on the same version of Exchange as the other user, and port 135 is blocked, we cannot view details.
My question is: what is the difference between the two scenarios? Is there any article that explains this?
Thanks in advance :)
Mondher
- Edited by HMondher Thursday, February 2, 2012 11:36 AM
------------------------------------
Reply:
Hi ,
I recommend you review the Exchange 2003 and Exchange 2010 network port references regarding port 135; then you will understand.
Ports Used in Exchange Server 2003:
http://technet.microsoft.com/en-us/library/bb124075(v=exchg.65).aspx
Exchange Network Port Reference:
http://technet.microsoft.com/en-us/library/bb331973.aspx
Wendy Liu - MSFT
------------------------------------
Reply:
Hi ,
The similar issue for your reference.
Wendy Liu - MSFT
------------------------------------
Reply:
Hi,
Does anyone have an answer to this question, please?
Some people said that RPC 135 is required for calendar details; others said no. I'm really confused!
Kind regards
------------------------------------
SCCM architecture
Where can I find sample case studies or diagrams for SCCM?
Abheek Dutta IT Analyst
- Changed type Sabrina Shen Tuesday, February 21, 2012 9:12 AM
Reply:
Torsten Meringer | http://www.mssccmfaq.de
------------------------------------
Reply:
Hi Torsten,
I know. That's why I mentioned sample case studies or diagrams! :-)
Abheek Dutta IT Analyst
------------------------------------
Reply:
That's why I mentioned any sample case studies or diagrams !! :-)
Try this http://anoopmannur.wordpress.com/2010/12/31/sample-of-sccm-configmgr-highlevel-architecturedesign/
Anoop C Nair - @anoopmannur
MY BLOG: http://anoopmannur.wordpress.com
User Group: ConfigMgr Professionals
This posting is provided AS-IS with no warranties/guarantees and confers no rights.
------------------------------------
Discuss Private Cloud Infrastructure as a Service Monitoring and Manageable Applications
Discuss the Private Cloud Infrastructure as a Service Monitoring and Manageable Applications article here.
Thanks!
Tom
MS ISDUA/UAG DA Anywhere Access Team Get yourself some Test Lab Guides! http://blogs.technet.com/b/tomshinder/archive/2010/07/30/test-lab-guides-lead-the-way-to-solution-mastery.aspx
- Changed type Kristian Nese [MSFT]Microsoft employee Tuesday, February 7, 2012 4:39 PM Changing it to "General Discussion"
Problem selecting queues in userrole
Hello all,
I've got the following problem: since CU3 was installed, our custom-made incident form erases all the fields when a service engineer clicks "Apply".
The information is saved, but the form has been emptied, which makes an immediate follow-up tricky. I managed to trace the problem back to the user role rights. If I edit the user role to include all the queues, this problem doesn't occur.
However, the strange thing is that I need to check the option "All work items can be accessed". If I use "Provide access to only the selected queues" in combination with "select all", it DOESN'T work, which in my opinion should be exactly the same. So apparently there is some sort of "internal" queue which is selected when I use the "All work items can be accessed" option.
The problem is that I cannot use this option, since I have implemented several dedicated service desk departments in our SCSM setup, and they are not allowed to see each other's incidents. Does anyone have any insight into this? Is it possible to edit the user role in another way?
Regards
Martijn
- Edited by Martijn van Zeeland Tuesday, February 7, 2012 12:55 PM
Reply:
Hi,
Queues and Groups are updated on a set interval, so once the incident is saved, it can actually take a couple of minutes until it's actually accessible to people in the Queue.
Regards
//Anders
Anders Asp | Lumagate | www.lumagate.com | Sweden | My blog: www.scsm.se
------------------------------------
Reply:
Hi,
Queues and Groups are updated on a set interval, so once the incident is saved, it can actually take a couple of minutes until it's actually accessible to people in the Queue.
Regards
//Anders
Anders Asp | Lumagate | www.lumagate.com | Sweden | My blog: www.scsm.se
Hi Anders,
Thanks for the reply; however, that's not the issue. If you fill in an incident form and click Apply instead of OK, the entire form is erased of its content. If you change the user role to include all queues, this doesn't happen...
------------------------------------
Outlook 2010 Auto Reply based on time of day
Reply:
You may try the settings below for all the users and schedule them every day at the particular time you wish:
http://www.telnetport25.com/2012/01/exchange-2010-out-of-office-fun-with-set-mailboxautoreplyconfiguration/
Kottees : My Blog : Please mark it as an answer if it really helps you.
------------------------------------
Reply:
Hi DrEvl,
Try this process to turn Out-of-Office replies on and off in Outlook 2007:
1. On an Outlook 2007 client computer, on the Tools menu, click Out of Office Assistant.
2. In Out of Office, perform the appropriate task:
- To turn on out-of-office replies, click Send Out of Office auto-replies, and then customize your auto-reply messages.
- To turn off out-of-office replies, click Do not send Out of Office auto-replies.
Thanks... (Remember to vote as Helpful.)
------------------------------------
Installation requirements to run an SSIS package on a server?
Hi,
Can anyone tell me about the installation requirements to run an EXE that executes an SSIS package on a server?
SUNIL PARISI
Reply:
Hi,
To execute an SSIS package you need to install Integration Service on the same server.
Shailesh, Please mark the post as answered if it answers your question.
------------------------------------
Reply:
Phani Note: Please vote/mark the post as answered if it answers your question/helps to solve your problem.
------------------------------------
Reply:
Hi,
To execute SSIS Package you need to Install Integration Service on same server.
Shailesh, Please mark the post as answered if it answers your question.
That's not true. The SSIS service is only used to monitor packages and to manage package storage.
To run SSIS packages, you only need the DTEXEC executable.
http://support.microsoft.com/kb/942176
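For example, once the core components are present, a package stored on disk can be launched from the command line (the package and configuration file paths below are illustrative):

```shell
rem Run a package stored on the file system
dtexec /F "C:\packages\MyPackage.dtsx"

rem Run with a configuration file, reporting errors and warnings only
dtexec /F "C:\packages\MyPackage.dtsx" /CONF "C:\packages\MyPackage.dtsConfig" /REP EW
```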
MCTS, MCITP - Please mark posts as answered where appropriate.
Answer #1: Have you tried turning it off and on again?
Answer #2: It depends...
------------------------------------
Reply:
Hi,
To execute SSIS Package you need to Install Integration Service on same server.
Shailesh, Please mark the post as answered if it answers your question.
That's not true. The SSIS service is only used to monitor packages and to manage package storage.
To run SSIS packages, you only need the DTEXEC executable.
http://support.microsoft.com/kb/942176
MCTS, MCITP - Please mark posts as answered where appropriate.
Answer #1: Have you tried turning it off and on again?
Answer #2: It depends...
I wanted to say Integration Services (SSIS), not the Integration Services service,
since installing it will install the Integration Services core components and dtexec, which are required to execute the package.
Shailesh, Please mark the post as answered if it answers your question.
------------------------------------
Reply:
I wanted to say Integration Services (SSIS), not the Integration Services service.
That is indeed something very different :)
since installing it will install the Integration Services core components and dtexec, which are required to execute the package.
Shailesh, please mark the post as answered if it answers your question.
MCTS, MCITP - Please mark posts as answered where appropriate.
Answer #1: Have you tried turning it off and on again?
Answer #2: It depends...
------------------------------------
Reply:
Hello Sunil,
After developing the package, do the following steps for package installation:
1) Make a Deployment Manifest file: right-click the project and create it; it will be saved in the Bin\Deployment folder.
2) Now double-click the manifest file. You will see the installation wizard for the package.
3) Follow the steps as per your requirements (deploy the package, configure it with your parameters, etc.).
You need Integration Services installed on the server where you are deploying, the Execute Package Utility (dtexecui), and you have to start SQL Server Agent.
Kind Regards
Dani
------------------------------------
Reply:
"...and you have to start SQL Server Agent."
That's only necessary if you want to schedule the SSIS package through SQL Agent.
MCTS, MCITP - Please mark posts as answered where appropriate.
Answer #1: Have you tried turning it off and on again?
Answer #2: It depends...
------------------------------------
Shared storage mandatory ?
Hi,
Is there a way to use failover clustering with two nodes but without shared storage?
Or is it mandatory to attach an external disk as shared storage connected to the two servers?
Thank you for your answers
- Changed type Vincent Hu Wednesday, February 8, 2012 8:05 AM discussion is better
Reply:
I have looked at ioDrives in combination with Steeleye datakeeper for our SQL server. Steeleye promises that it does not need shared resources. We decided to go for a SAN. You can look it up and download evaluation software.
Fred
------------------------------------
Reply:
Hi,
You need to have the same data available to both of the nodes. The only supported way I know of that does not require shared storage between the two nodes is a multi-site cluster; however, this will usually utilise an enterprise SAN with SAN replication between the two storage arrays to provide the same data to both nodes.
You usually use iSCSI, SAN or external DAS arrays for this.
Sean Massey | Consultant, iUNITE
Feel free to contact me through My Blog, Twitter or Hire Me.
Please click the Mark as Answer button if a post solves your problem!
------------------------------------
Reply:
Hi,
Thanks for your reply.
From what I have understood, I will need a SAN/iSCSI or DAS to have a common disk for my two nodes...
Is it possible, whatever software I run on one node, to have it available on the other node in case the first node fails? (Assuming I put the software data on the external disk, of course.)
And can you suggest a disk model that will not be too expensive? My two servers are Dell and will be in a rack, so I need the disk to be in the rack too.
Thanks again.
------------------------------------
Reply:
For an application to work successfully with a cluster it normally needs to be written to be cluster aware, although you can probably make certain applications work within a cluster it may be quite difficult. Is the application you are trying to cluster proprietary?
Recommending specific hardware is too difficult because there are too many unknowns. I would suggest that you contact your Dell account manager; Dell has a large number of suitable storage arrays available. In order to choose an appropriate one you really need to know things like your expected IOPS, RAID requirements, storage redundancy requirements, and, if considering iSCSI, your network infrastructure, etc.
Sean Massey | Consultant, iUNITE
Feel free to contact me through My Blog, Twitter or Hire Me.
Please click the Mark as Answer button if a post solves your problem!
------------------------------------
Reply:
No, the application is not designed to work with a cluster. But maybe I can install the software on each server and use only one instance at a time. I need the data of this software to be common, of course.
Concerning the choice of storage, I have contacted my Dell account manager for more technical information.
Fred B. mentioned "Steeleye Datakeeper". Is it really a good solution to replace shared storage?
Thank you very much for your help.
------------------------------------
Reply:
Hi Marco,
We use a two-node failover cluster with a 4-node iSCSI SAN for Hyper-V virtualisation. For a specific situation (SQL Server replication with high IOPS) I have looked at the Steeleye Datakeeper solution, which promises shared-nothing clustering based on host replication, integrating with Microsoft failover clustering.
As we will be adding more cluster nodes and ioDrives are still expensive, we chose to stick with the SAN. The principle of Steeleye can also be used without ioDrives. I am still watching the developments of ioDrives and Netlist Hyper-cloud RAM, as I believe they will replace current technology.
This link is from the iodrive perspective:
http://www.fusionio.com/blog/do-you-have-to-sacrifice-high-availability-for-high-performance/
From Steeleye:
"SteelEye DataKeeper Cluster Edition extends the robust DataKeeper replication engine to allow the use of replicated volumes within Windows Server Failover clustering in lieu of a shared disk. Once DataKeeper Cluster Edition is installed, a new cluster called a DataKeeper Volume replaces the shared disk resource required in traditional clusters, allowing users to build clusters without shared storage. Microsoft refers to this type of cluster as a multi-site cluster and relies on 3rd party replication engines such as SteelEye DataKeeper Cluster Edition to enable multi-site configurations."
http://us.sios.com/products/steeleye-datakeeper-windows/
Steeleye Quickstart:
http://docs.us.sios.com/DK4W/DK4WCEQuickStartWHTarget/
Have fun.
- Edited by Fred B. _ Saturday, February 4, 2012 9:20 PM
------------------------------------
Reply:
Hi,
Is there a way to use failover clustering with two nodes but without a shared storage ?
Or is it mandatory to put an external disk as shared storage connected to the 2 servers ?
Thank you for your answers
Absolutely not. There are a number of companies and products besides the already-referenced SteelEye (AFAIK it has the issue of putting one VM on one LUN, which most people don't do) solving this task, either by mirroring whole LUNs (StarWind Native SAN for Hyper-V and DataCore SANsymphony-V) or only VHD files between multiple hosts (VM6 and Virsto). StarWind and DataCore can work without Hyper-V; not sure about VM6 and Virsto. I've never seen SteelEye in action.
-nismo
------------------------------------
Reply:
Hi,
Is there a way to use failover clustering with two nodes but without a shared storage ?
Or is it mandatory to put an external disk as shared storage connected to the 2 servers ?
Thank you for your answers
You've asked on a Windows forum, so I assume you run Windows, don't you? :) The whole concept of using DAS instead of SAN/NAS is not new. Red Hat's hypervisor KVM can feed mirrored storage from both nodes, Xen (don't confuse it with XenServer) can do it out of the box as well, and DRBD has done this for years. So you may solve your issue with paid software under Windows, or you may rethink what you're doing :) Here are some links in case you'd like to read up on the topic:
http://www.linbit.com/en/products-services/drbd-enterprise-cluster/drbd-storage-enterprise-cluster/
http://www.linux-kvm.org/page/Migration
http://blog.allanglesit.com/2011/08/linux-kvm-management-live-migration-without-shared-storage/
Hope this helped :)
-nismo
------------------------------------
Reply:
I have read all the links you have all given to me... Thank you for that.
To answer one question : Yes, I will use Windows Server 2008 R2 Enterprise Edition.
It is really hard to decide what architecture must be built for my needs... I am a bit lost, in fact :-(
I have two servers under Windows Server 2008 R2 (already bought) that must be set in failover clustering.
This cluster will manage a network of 15 workstations. For now, I have only the two servers (no shared storage, no replication software).
So, what is the best solution to "activate" the cluster ?
I also have to install Tripwire Log Center to collect all the logs from each workstation and each server.
In fact, I don't understand where I can install Tripwire (on one of the nodes? on both?) and what will happen to Tripwire if one node goes down.
I know this must be basic for clustering experts, but I am a bit lost...
Thanks for any help.
------------------------------------
Reply:
I am not sure for which application you need high availability. A cluster server setup needs shared storage in order to provide high availability. Cluster nodes monitor heartbeats and, in case of a node failure, will start failover of clustered systems/applications. On top of the cluster you can run, for example, virtual servers, giving you the ability to provision them, load balance them, and make them HA. Installing Tripwire on the local disk of a cluster node won't make it HA; Tripwire in that case would fail with the cluster node. If you installed it on a virtual server on the shared storage resource, that VM would fail over and Tripwire with it, IF its database is cluster-aware and fault tolerant.
With 15 workstations, a server cluster might be the deep end. It gives you two active cluster nodes, versus server mirroring, which has an active and a passive node for mission-critical applications. I am not sure what is mission critical in your case, compared to disk RAID, redundant power, a UPS, spare parts, and a good backup procedure.
For a server cluster you need shared storage, period. Steeleye is a possibility for shared-nothing. If you have an (older) third server, you could use it for a software virtual SAN solution like Starwind or HP VSA. The high end is a hardware SAN.
You can also check out HP storage mirroring software, but that gives you one active hardware host and a second passive one. On the active host you can run a number of Hyper-V virtual servers, depending on your licence.
- Edited by Fred B. _ Sunday, February 5, 2012 1:45 PM
------------------------------------
Reply:
Thank you very much for the time spent answering me. It's a bit clearer now.
Concerning Tripwire, do you think I can install a licence on each node (so, totally independent instances of the software) that will both collect all logs from all the workstations?
That way, if one node is down, I have everything on the other node too.
For me, the failover clustering is really for Active Directory, so that each workstation can log on even if one server is down.
The other software I will install does not need HA, but must be installed on the two servers.
Also, you talked about Hyper-V. I don't see how Hyper-V would be used in my case?
Thank you again.
------------------------------------
Reply:
Marco, you don't need failover clustering for AD; you can join a second Domain Controller to an existing domain for that.
Tripwire can be made HA with failover, as its database can be MS SQL, and the data collection, log management, and database servers can run on separate virtualized systems. MySQL can also be made HA: http://www.clusterdb.com/mysql/mysql-with-windows-server-2008-r2-failover-clustering/
What more do you need? Have you looked at MS Small Business Server? It is a product I would advise for small/medium-sized companies, pre-configured out of the box with wizards to run for configuration. I would advise secure AD user and password policies, a good virus scanner solution, and a good firewall with AD integration.
Read up on virtualization. If you bought W2008 R2, you have the possibility to run Hyper-V virtual servers on your W2008 R2 host system. The guest OS in a virtual server (VM) can be your OS of choice and be given your role of choice. Virtualization gives you the opportunity to utilize your hardware more efficiently and effectively, separating server roles onto their own VMs and giving you more bang for your hardware buck.
For an explanation, have a look at: http://www.youtube.com/watch?v=5gYvSdZMWO4
- Edited by Fred B. _ Sunday, February 5, 2012 4:56 PM
------------------------------------
Reply:
I have read all the links you have all given to me... Thank you for that.
To answer one question : Yes, I will use Windows Server 2008 R2 Enterprise Edition.
It is really hard to decide what architecture must be build for my needs... I am a bit lost in fact :-(
I have two servers under Windows Server 2008 R2 (already bought) that must be set in failover clustering.
This cluster will manage a network of 15 workstations. For now, I have only the two servers (no shared storage, no replication software).
So, what is the best solution to "activate" the cluster ?
And I have also to install Tripwire Log Center to collect all the logs from each workstation and each server.
In fact, I don't understand where I can install Tripwire (on one of the node ?, on both ?) and what will happen for Tripwire if one node goes down ?
I know this must be basic for clustering expert but I am a bit lost...
Thanks for any help.
Give StarWind a quick try, as it's pretty easy to install, just to find out whether the whole concept works for you or not. Sorry, I have no idea what Tripwire is or how it's related to clustering :)
-nismo
------------------------------------
Reply:
In fact, I realize that failover clustering might be too big for my needs.
What I need, if possible, is to have two servers with one a replica of the other. If the first one stops, I start the second one and retrieve all my data.
------------------------------------
Reply:
There are many ways to skin a cat (c) ...
You may find some sort of replication software for Windows (Double Take, or whatever they call themselves now, had one for years) that suits your needs. But I would suggest choosing a common route other people can help you with. In this particular case, that means running a hypervisor, putting your servers into VMs spread across different hardware, using some shared storage for the data they produce, and running HA or failover. IMHO, of course.
-nismo
------------------------------------
Reply:
I don't really understand how Hyper-V would be used inside my architecture, in fact :-(
You mean the two servers will be physically linked to a shared storage with failover clustering activated,
and on each server, I run a VM for all the other software?
------------------------------------
Reply:
Yes, exactly! If your app is not cluster-aware, you need to configure it for HA. That way, if one node dies, you switch to the second one with nearly no downtime. Something like what you wanted, right?
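For reference, the setup described above can be sketched with the Failover Clustering PowerShell module on Windows Server 2008 R2 (a hedged sketch, not a definitive procedure: the node names, cluster name, IP address, and VM name are placeholders, and shared storage must already be configured):

```shell
# Load the Failover Clustering cmdlets (run elevated on one of the nodes).
Import-Module FailoverClusters

# Validate the prospective nodes first; review the report before proceeding.
Test-Cluster -Node "NODE1","NODE2"

# Create the two-node cluster (name and static IP are placeholders).
New-Cluster -Name "HVCLUSTER" -Node "NODE1","NODE2" -StaticAddress "192.168.1.50"

# With the VM's files on cluster shared storage, make the VM highly
# available so it can be brought up on the surviving node if its host dies.
Add-ClusterVirtualMachineRole -VMName "TripwireVM"
```

The app (Tripwire, in this case) then lives inside the VM and never needs to be cluster-aware itself; the cluster moves the whole VM between nodes.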
-nismo
------------------------------------
Reply:
Yes, thank you very much !
I will try to go on with all the information everyone gave to me.
------------------------------------
WD smartware and compacting virtual disk.
I seem to have found a glitch.
While compacting an XP VHD, it seems WD Smartware tries to follow along, which ends with a code 67 and the compact failing.
I stopped WD from auto-backing up, but it appears to have made no difference. I have had no previous problems, and other than code 67 in WVPC, no other errors are appearing.
I am compacting again with WD backup disabled, but the WD drive is very active, and it has nothing to do with WVPC or XP Mode, so I wonder why compacting or WD is behaving this way?
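As an aside, the same compaction can be attempted outside Windows Virtual PC with a diskpart script, which avoids any interaction with software watching the virtual machine (a hedged sketch; the VHD path is hypothetical, and the VHD must not be attached or in use by any VM):

```shell
rem Run from an elevated prompt as: diskpart /s compact-vhd.txt
rem Contents of compact-vhd.txt (the path below is a placeholder):
select vdisk file="C:\VHDs\XPMode.vhd"
attach vdisk readonly
compact vdisk
detach vdisk
```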
Reply:
It came up with code 67 again and compacting failed. The WD drive is not as active but still active, and my XP drive is still active too.
Time to power down, I guess.
------------------------------------
Reply:
I didn't power down, and all has returned to normal, but I haven't tried compacting again; XP Mode is running normally.
So I will try to compact again.
------------------------------------
configure RTAudio with constant bit rate
Reply:
Is there an option to configure Lync to use RTAudio with a constant bit rate rather than a variable bit rate? I saw in the RTAudio overview document that this should be possible.
The doc says: "The RT Audio encoder is capable of encoding single-channel (mono), 16 bits per sample audio signals. The encoder can be configured to run in constant bit rate mode or variable bit rate mode."
------------------------------------
Internet Explorer
Reply:
Maybe you can explain what's going wrong in more detail?
My blogs: Henk's blog and Virtuall | Follow Me on: Twitter | View My Profile on: LinkedIn
------------------------------------
Reply:
Hi,
If possible, please use PSR to capture the issue.
How do I use Problem Steps Recorder?
http://windows.microsoft.com/en-US/windows7/How-do-I-use-Problem-Steps-Recorder
Niki Han
TechNet Community Support
------------------------------------
Using MSProject with SharePoint 2010
We want to save MS Project (2010) files on SharePoint 2010 and link them to form hierarchies of a programme (MS Project is not the server edition). Depending on a particular user's designation, we would like to assign specific permissions (CRUD) to them. So an administrator may have full rights; the project managers will have full rights over their own projects, but not be able to see the projects of other PMs. Department managers will have read access only to those projects within their department; division managers will have read rights over all projects in their departments; and executives will have read rights over all projects. Some variations of this may be required.
Can SharePoint be set up to allow such configurations (including cases where new departments are created or amalgamated)? I have been told that every time there is a change, SharePoint will need to be reinstalled with the new configuration. I cannot believe that this is the case.
Any assistance will be appreciated.
Thanks,
Allen Lang
Reply:
You are talking about MS Project Plan (.mpp) files, correct?
Yes, SharePoint can do what you are asking; it's simply a matter of assigning the appropriate permissions to each file. I recommend you assign the permissions to SharePoint groups and then add your users into those groups. Depending on the size of your organization and your administrative requirements, you may want to create Active Directory security groups with the users as members and add those AD groups into the SharePoint groups.
This could get complicated if you have many project plans in the same document library.
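For illustration, the per-file permission assignment described above can be scripted from the SharePoint 2010 Management Shell (a hedged sketch; the site URL, file path, and group name are placeholders you would replace with your own):

```shell
# Get the web and the list item behind one .mpp file (URL/path are placeholders).
$web  = Get-SPWeb "http://intranet/sites/pmo"
$item = $web.GetFile("Shared Documents/ProjectA.mpp").Item

# Stop inheriting permissions from the library; $true copies the current
# assignments onto the item so you can then prune or extend them.
$item.BreakRoleInheritance($true)

# Grant an existing SharePoint group read access to this one file.
$group      = $web.SiteGroups["Department Managers"]
$assignment = New-Object Microsoft.SharePoint.SPRoleAssignment($group)
$assignment.RoleDefinitionBindings.Add($web.RoleDefinitions["Read"])
$item.RoleAssignments.Add($assignment)

$web.Dispose()
```

When a department is created or amalgamated, you only add or retarget groups and rerun assignments like this; no reinstall of SharePoint is involved.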
------------------------------------
Reply:
Allen +27 (0) 21 532 7040
------------------------------------
access to deploymentshare
Why is it possible to access DeploymentShare$ from a client with a local administrator login after setup?
There are only domain users in the server's local Users group, and no Everyone group.
Chris
Reply:
MDT maps a network drive to the deployment share during deployment. Also, if your local admin password is the same as your domain account's, you will get access.
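To see who can actually reach the share, it can help to compare the share-level and NTFS permissions side by side (a hedged sketch; the share name matches the question, but the folder path is a placeholder):

```shell
rem Show the share-level permissions of the deployment share.
net share DeploymentShare$

rem Show the NTFS permissions on the underlying folder (path is a placeholder).
icacls D:\DeploymentShare
```

Access is granted only where both lists allow it, so a client account that gets in must satisfy both, or be authenticating as a matching account as described above.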
/ Johan
Regards / Johan Arwidmark Twitter: @jarwidmark Blog: http://www.deploymentresearch.com FB: www.facebook.com/deploymentresearch
------------------------------------
Reply:
Chris
------------------------------------
Duplicate Sent MS Outlook Emails associated with Gmail POP3 & IMAP account
When sending Gmail POP3/IMAP emails from my laptop via MS Outlook, two messages get sent; however, when sending the message from the Gmail web interface, only one message gets sent. IMAP and POP3 are both enabled; could this be the reason for the duplicate sent emails? I am not receiving duplicate emails in my Inbox.
How could I resolve this issue?
Reply:
As I recall, you can set up a Gmail POP3/SMTP account and a Gmail IMAP/SMTP account separately and individually (create the POP3 account's data file first), and you can only send mail from one type of account at a time.
A bit more information and the exact steps you used to configure the Gmail account would be helpful. You may capture a screenshot of Outlook's Account Settings dialog.
Thanks.
Tony Chen
TechNet Community Support
------------------------------------