articles/active-directory-domain-services/active-directory-ds-admin-guide-configure-secure-ldap.md (+14 -12)
@@ -13,7 +13,7 @@ ms.workload: identity
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: article
-ms.date: 09/21/2016
+ms.date: 01/20/2016
ms.author: maheshu

---
@@ -28,43 +28,45 @@ To perform the tasks listed in this article, you need:
3. **Azure AD Domain Services** must be enabled for the Azure AD directory. If you haven't done so, follow all the tasks outlined in the [Getting Started guide](active-directory-ds-getting-started.md).
4. A **certificate to be used to enable secure LDAP**.

-    * **Recommended** - Obtain a certificate from your enterprise CA or public certification authority. This configuration option is more secure.
+    * **Recommended** - Obtain a certificate from a trusted public certification authority. This configuration option is more secure.
    * Alternately, you may also choose to [create a self-signed certificate](#task-1---obtain-a-certificate-for-secure-ldap) as shown later in this article.

<br>

### Requirements for the secure LDAP certificate
Acquire a valid certificate per the following guidelines before you enable secure LDAP. You encounter failures if you try to enable secure LDAP for your managed domain with an invalid or incorrect certificate.

-1. **Trusted issuer** - The certificate must be issued by an authority trusted by computers that need to connect to the domain using secure LDAP. This authority may be your organization's enterprise certification authority or a public certification authority trusted by these computers.
+1. **Trusted issuer** - The certificate must be issued by an authority trusted by computers that need to connect to the domain using secure LDAP. This authority may be a public certification authority trusted by these computers.
2. **Lifetime** - The certificate must be valid for at least the next 3-6 months. Secure LDAP access to your managed domain is disrupted when the certificate expires.
3. **Subject name** - The subject name on the certificate must be a wildcard for your managed domain. For instance, if your domain is named 'contoso100.com', the certificate's subject name must be '*.contoso100.com'. Set the DNS name (subject alternate name) to this wildcard name.
4. **Key usage** - The certificate must be configured for the following uses: digital signatures and key encipherment.
5. **Certificate purpose** - The certificate must be valid for SSL server authentication.

+> [!NOTE]
+> **Enterprise Certification Authorities:** Azure AD Domain Services does not currently support using secure LDAP certificates issued by your organization's enterprise certification authority. This restriction is because the service does not trust your enterprise CA as a root certification authority. We expect to add support for enterprise CAs in the future. If you absolutely must use certificates issued by your enterprise CA, [contact us](active-directory-ds-contact-us.md) for assistance.
+>
+>

<br>
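Requirement 3 above (a wildcard subject name covering a single label of the managed domain) can be illustrated with a short Python sketch. The helper and the 'contoso100.com' domain are illustrative only; real TLS stacks apply the stricter matching rules of RFC 6125 rather than this simplified check.

```python
import fnmatch

def matches_wildcard(hostname: str, pattern: str) -> bool:
    """Simplified wildcard-subject match: '*' covers exactly one DNS label."""
    host_parts = hostname.split('.')
    pattern_parts = pattern.split('.')
    if len(host_parts) != len(pattern_parts):
        return False
    return all(fnmatch.fnmatch(h, p) for h, p in zip(host_parts, pattern_parts))

# '*.contoso100.com' matches direct children of the domain only
print(matches_wildcard('ldaps.contoso100.com', '*.contoso100.com'))  # True
print(matches_wildcard('a.b.contoso100.com', '*.contoso100.com'))    # False
```

This is why the subject name must be the wildcard '*.contoso100.com' rather than the bare domain name: clients connect to a host under the managed domain, not to the apex.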
## Task 1 - Obtain a certificate for secure LDAP
The first task involves obtaining a certificate used for secure LDAP access to the managed domain. You have two options:

-* Obtain a certificate from a certification authority. The authority may be your organization's enterprise CA or a public certification authority.
+* Obtain a certificate from a certification authority. The authority may be a public certification authority.
* Create a self-signed certificate.

### Option A (Recommended) - Obtain a secure LDAP certificate from a certification authority
-If your organization deploys an enterprise public key infrastructure (PKI), you need to obtain a certificate from the enterprise certification authority (CA) for your organization. If your organization obtains its certificates from a public certification authority, you need to obtain the secure LDAP certificate from that public certification authority.
+If your organization obtains its certificates from a public certification authority, you need to obtain the secure LDAP certificate from that public certification authority.

When requesting a certificate, ensure that you follow the requirements outlined in [Requirements for the secure LDAP certificate](#requirements-for-the-secure-ldap-certificate).

> [!NOTE]
-> Client computers that need to connect to the managed domain using secure LDAP must trust the issuer of the LDAPS certificate.
+> Client computers that need to connect to the managed domain using secure LDAP must trust the issuer of the secure LDAP certificate.
>
>

### Option B - Create a self-signed certificate for secure LDAP
-You may choose to create a self-signed certificate for secure LDAP, if:
-
-* certificates in your organization are not issued by an enterprise certification authority or
-* you do not expect to use a certificate from a public certification authority.
+If you do not expect to use a certificate from a public certification authority, you may choose to create a self-signed certificate for secure LDAP.

**Create a self-signed certificate using PowerShell**

@@ -81,7 +83,7 @@
The newly created self-signed certificate is placed in the local machine's certificate store.
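The PowerShell sample above (elided in this diff) creates the certificate on Windows. As a rough, non-authoritative alternative on systems with OpenSSL 1.1.1 or later, a self-signed certificate meeting the requirements above might be generated like this; the 'contoso100.com' domain and the file names are placeholders:

```shell
# Create a self-signed wildcard certificate for the managed domain
# ('contoso100.com' and the file names are placeholders)
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout ldaps.key -out ldaps.crt -days 365 \
  -subj "/CN=*.contoso100.com" \
  -addext "subjectAltName=DNS:*.contoso100.com" \
  -addext "keyUsage=digitalSignature,keyEncipherment" \
  -addext "extendedKeyUsage=serverAuth"

# Inspect the result: subject name and validity period
openssl x509 -in ldaps.crt -noout -subject -dates
```

Either way, Azure AD Domain Services consumes the certificate as a .PFX bundle, so the key and certificate still need to be exported as described in Task 2.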
82
84
83
85
## Task 2 - Export the secure LDAP certificate to a .PFX file
84
-
Before you start this task, ensure that you have obtained the secure LDAP certificate from your enterprise certification authority or a public certification authority or have created a self-signed certificate.
86
+
Before you start this task, ensure that you have obtained the secure LDAP certificate from a public certification authority or have created a self-signed certificate.
85
87
86
88
Perform the following steps, to export the LDAPS certificate to a .PFX file.
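The article's step-by-step export uses the Windows certificate snap-in. As a hedged sketch of the same outcome with OpenSSL, assuming a PEM key/certificate pair such as the self-signed one from Task 1 (file names and the password are placeholders):

```shell
# Illustration only: create a throwaway self-signed pair to export
openssl req -x509 -newkey rsa:2048 -nodes -keyout ldaps.key -out ldaps.crt \
  -days 365 -subj "/CN=*.contoso100.com"

# Bundle the certificate and private key into a password-protected .PFX (PKCS#12)
openssl pkcs12 -export -out ldaps.pfx -inkey ldaps.key -in ldaps.crt \
  -passout pass:ChangeMe123
```

The resulting ldaps.pfx, together with its password, is what you supply when enabling secure LDAP on the managed domain.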
@@ -170,7 +172,7 @@ To enable secure LDAP, perform the following configuration steps:
-> It takes about 10 to 15 minutes to enable secure LDAP for your managed domain. If the provided secure LDAP certificate does not match the required criteria, secure LDAP is not enabled for your directory and you see a failure. For example, the domain name is incorrect, the certificate is expired or expires soon etc.
+> It takes about 10 to 15 minutes to enable secure LDAP for your managed domain. If the provided secure LDAP certificate does not match the required criteria, secure LDAP is not enabled for your directory and you see a failure. For example, the domain name is incorrect, or the certificate has already expired or expires soon.
>
>
9. When secure LDAP is successfully enabled for your managed domain, the **Pending...** message should disappear. You should see the thumbprint of the certificate displayed.
@@ -54,6 +53,7 @@
* Click the application and choose **Manifest** to open the inline manifest editor.
* Locate the `oauth2AllowImplicitFlow` property. Set its value to `true`.
* Click **Save** to save the manifest.
+8. Grant permissions across your tenant for your application. Go to Settings -> Properties -> Required Permissions, and click the **Grant Permissions** button in the top bar. Click **Yes** to confirm.
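The manifest edit in the steps above amounts to flipping a single property. A fragment of the application manifest might look like the following (all other manifest properties omitted for brevity):

```json
{
  "oauth2AllowImplicitFlow": true
}
```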
## 2. Install ADAL & Configure the SPA
Now that you have an application in Azure AD, you can install adal.js and write your identity-related code.

articles/data-factory/data-factory-compute-linked-services.md (+2 -2)
@@ -86,7 +86,7 @@
| clusterSize |Number of worker/data nodes in the cluster. The HDInsight cluster is created with 2 head nodes along with the number of worker nodes you specify for this property. The nodes are of size Standard_D3, which has 4 cores, so a 4-worker-node cluster takes 24 cores (4*4 for worker nodes + 2*4 for head nodes). See [Create Linux-based Hadoop clusters in HDInsight](../hdinsight/hdinsight-hadoop-provision-linux-clusters.md) for details about the Standard_D3 tier. |Yes |
| timetolive |The allowed idle time for the on-demand HDInsight cluster. Specifies how long the on-demand HDInsight cluster stays alive after completion of an activity run if there are no other active jobs in the cluster.<br/><br/>For example, if an activity run takes 6 minutes and timetolive is set to 5 minutes, the cluster stays alive for 5 minutes after the 6 minutes of processing the activity run. If another activity run is executed within that 5-minute window, it is processed by the same cluster.<br/><br/>Creating an on-demand HDInsight cluster is an expensive operation (it can take a while), so use this setting as needed to improve performance of a data factory by reusing an on-demand HDInsight cluster.<br/><br/>If you set the timetolive value to 0, the cluster is deleted as soon as the activity run is processed. On the other hand, if you set a high value, the cluster may stay idle unnecessarily, resulting in high costs. Therefore, it is important that you set the appropriate value based on your needs.<br/><br/>Multiple pipelines can share the same instance of the on-demand HDInsight cluster if the timetolive property value is appropriately set. |Yes |
| version |Version of the HDInsight cluster. The default value is 3.1 for a Windows cluster and 3.2 for a Linux cluster. |No |
-| linkedServiceName |Azure Storage linked service to be used by the on-demand cluster for storing and processing data. |Yes |
+| linkedServiceName |Azure Storage linked service to be used by the on-demand cluster for storing and processing data. <p>Currently, you cannot create an on-demand HDInsight cluster that uses an Azure Data Lake Store as the storage. If you want to store the result data from HDInsight processing in an Azure Data Lake Store, use a Copy Activity to copy the data from the Azure Blob Storage to the Azure Data Lake Store.</p> |Yes |
| additionalLinkedServiceNames |Specifies additional storage accounts for the HDInsight linked service so that the Data Factory service can register them on your behalf. |No |
| osType |Type of operating system. Allowed values are: Windows (default) and Linux. |No |
| hcatalogLinkedServiceName |The name of the Azure SQL linked service that points to the HCatalog database. The on-demand HDInsight cluster is created by using the Azure SQL database as the metastore. |No |
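As a sketch of how the properties in the preceding table fit together (not an authoritative sample; the linked service names and values are placeholders), an on-demand HDInsight linked service definition might look like this:

```json
{
  "name": "HDInsightOnDemandLinkedService",
  "properties": {
    "type": "HDInsightOnDemand",
    "typeProperties": {
      "version": "3.2",
      "osType": "linux",
      "clusterSize": 4,
      "timetolive": "00:05:00",
      "linkedServiceName": "AzureStorageLinkedService"
    }
  }
}
```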
@@ -208,7 +208,7 @@
| clusterUri |The URI of the HDInsight cluster. |Yes |
| username |Specify the name of the user to be used to connect to an existing HDInsight cluster. |Yes |
| password |Specify the password for the user account. |Yes |
-| linkedServiceName |Name of the linked service for the blob storage used by this HDInsight cluster. |Yes |
+| linkedServiceName |Name of the linked service for the Azure Blob Storage / Azure Data Lake Store used by this HDInsight cluster. |Yes |
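A bring-your-own-cluster definition using the properties above might look like the following sketch; the cluster URI, credentials, and linked service name are placeholders, not values from the article:

```json
{
  "name": "HDInsightLinkedService",
  "properties": {
    "type": "HDInsight",
    "typeProperties": {
      "clusterUri": "https://MyCluster.azurehdinsight.net/",
      "username": "admin",
      "password": "<password>",
      "linkedServiceName": "MyStorageLinkedService"
    }
  }
}
```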
## Azure Batch Linked Service
You can create an Azure Batch linked service to register a Batch pool of virtual machines (VMs) to a data factory. You can run .NET custom activities using either Azure Batch or Azure HDInsight.

articles/storage/storage-disaster-recovery-guidance.md (+7 -2)
@@ -13,10 +13,11 @@ ms.workload: storage
ms.tgt_pltfrm: na
ms.devlang: dotnet
ms.topic: article
-ms.date: 12/08/2016
+ms.date: 1/19/2017
ms.author: robinsh

---
+
# What to do if an Azure Storage outage occurs
At Microsoft, we work hard to make sure our services are always available. Sometimes, forces beyond our control impact us in ways that cause unplanned service outages in one or more regions. To help you handle these rare occurrences, we provide the following high-level guidance for Azure Storage services.
@@ -51,7 +52,9 @@ A couple of points regarding the storage geo-failover experience:
* Storage geo-failover will only be triggered by the Azure Storage team – there is no customer action required.
* Your existing storage service endpoints for blobs, tables, queues, and files will remain the same after the failover; the DNS entry will need to be updated to switch from the primary region to the secondary region.
* Before and during the geo-failover, you won't have write access to your storage account due to the impact of the disaster, but you can still read from the secondary if your storage account has been configured as RA-GRS.
-* When the geo-failover has been completed and the DNS changes propagated, your read and write access to your storage account will be resumed. You can query ["Last Geo Failover Time" of your storage account](https://msdn.microsoft.com/library/azure/ee460802.aspx) to get more details.
+* When the geo-failover has been completed and the DNS changes propagated, read and write access to your storage account will be resumed; requests now go to what used to be your secondary endpoint.
+* Note that you will have write access if you have GRS or RA-GRS configured for the storage account.
+* You can query the ["Last Geo Failover Time" of your storage account](https://msdn.microsoft.com/library/azure/ee460802.aspx) to get more details.
* After the failover, your storage account will be fully functioning, but in a "degraded" status, as it is actually hosted in a standalone region with no geo-replication possible. To mitigate this risk, we will restore the original primary region and then do a geo-failback to restore the original state. If the original primary region is unrecoverable, we will allocate another secondary region.

For more details on the infrastructure of Azure Storage geo-replication, please refer to the article on the Storage team blog about [Redundancy Options and RA-GRS](https://blogs.msdn.microsoft.com/windowsazurestorage/2013/12/11/windows-azure-storage-redundancy-options-and-read-access-geo-redundant-storage/).
@@ -63,3 +66,5 @@
* Tables – use [AzCopy](storage-use-azcopy.md) to export the table data into another storage account in another region.
* Files – use [AzCopy](storage-use-azcopy.md) or [Azure PowerShell](storage-powershell-guide-full.md) to copy your files to another storage account in another region.

+For information about creating applications that take full advantage of the RA-GRS feature, please check out [Designing Highly Available Applications using RA-GRS Storage](storage-designing-ha-apps-with-ragrs.md).