Month: November 2024
Problem changing site URLs in SharePoint admin center
Since a short time ago, I cannot change the URL for some sites. I know that sites that are hub sites, or connected to a hub, cannot be changed unless they are released from the hub. It only seems to affect sites linked to a Microsoft 365 group.
I get the mostly standard message when trying to change the URL, but with this additional text:
“Some features may not work after you change this location address. Do you want to continue? The site has a Business Connectivity Services (BCS) connection. The connection may need to be re-established after you change the location address.”
Anyone having the same problem or know how I can fix this?
/Hans
MDE Onboarding issue – 2012 R2 – CheckPPL Error
Hello,
I am facing an onboarding issue with one specific 2012 R2 server. I’ve already run MDEAnalyzer and there’s an error message: “112004 CheckPPL Please note the sensor on this device is not PPL protected: SERVICE sense PROTECTION LEVEL: NONE”. Event Viewer logs for the Sense service contain multiple warning events with ID 65: “Failed to load Microsoft Security Events Component Minifilter driver. Failure code: 0x80070005”. The server’s status in MDE is “can be onboarded”.
Any suggestions will be highly appreciated!
Raspberry Pi & Intune
Hi All,
A bit of a random question which I believe the answer to be no.
Can Intune manage Raspberry Pi devices?
Thought I would double check on here.
Have a good one.
Thanks
Site content stuck in classic
I accidentally clicked “Return to classic SharePoint” and cannot get it to go back. The site contents are arranged as thumbnails now instead of a list; it looks awful and is barely usable. There are no settings to restore this, and the option to “Return to modern experience” is not there.
Please can someone assist.
Quick question
How can I lock data to copy from same row but different columns?
removable media
Is it possible to block specific file types on removable media using Microsoft Defender for Endpoint?
Different (conditional) background colors in a list – Microsoft List vs. SharePoint group
Hi,
I have created a Microsoft List that uses conditional row formatting to mark entries that have been changed within the last 31 days with a light blue background color.
This is the JSON for it:
{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/v2/row-formatting.schema.json",
  "additionalRowClass": {
    "operator": ":",
    "operands": [
      {
        "operator": "<",
        "operands": [
          "[$Age]",
          31
        ]
      },
      "sp-css-backgroundColor-BgLightBlue sp-field-fontSizeSmall sp-css-color-BlackText",
      ""
    ]
  }
}
This works fine, as long as I view the list from within Microsoft Lists.
I have also added the List to our SharePoint group site, but the background color there does not look the same. Instead, it is much darker, and seemingly the “palette” of usable colors is much smaller.
I tested some other nuances by changing the ‘sp-css-backgroundColor’ to BgGold, BgLightBlue20 and BgLightGrey but always ended up with a much darker background color when displaying the List from the SharePoint group.
Does anyone know if there is some parameter in the SharePoint group settings that I can change to make the group display the colors correctly, or any other way to solve this?
Thanks.
Best regards
Patrick
List View from within Microsoft Lists:
List View when embedded in SharePoint group:
Modernising Registrar Technology: Implementing EPP with Kotlin, Spring & Azure Container Apps
Introduction
In the domain management industry, technological advancement has often been a slow and cautious process, lagging behind the rapid innovations seen in other tech sectors. This measured pace is understandable given the critical role domain infrastructure plays in the global internet ecosystem. However, as we stand on the cusp of a new era in web technology, it is becoming increasingly clear that modernization should be a priority. This blog post embarks on a journey to demystify one of the most critical yet often misunderstood components of the industry: the Extensible Provisioning Protocol (EPP).
Throughout this blog, we will dive deep into the intricacies of EPP, exploring its structure, commands and how it fits into the broader domain management system. We will walk through the process of building a robust EPP client using Kotlin and Spring Boot. Then, we will take our solutions to the next level by containerizing with Docker and deploying it to Azure Container Apps, showcasing how modern cloud technologies can improve the reliability and scalability of your domain management system. We will also set up a continuous integration and deployment (CI/CD) pipeline, ensuring that your EPP implementation remains up-to-date and easily maintainable.
By the end of this blog, you will be able to provision domains programmatically via an endpoint, and have the code foundation ready to create dozens of other domain management commands (e.g. updating nameservers, updating contact info, renewing and transferring domains).
Who it is for
What you will need: EPP credentials
Understanding EPP
EPP is short for Extensible Provisioning Protocol. It is a protocol designed to streamline and standardise communication between domain name registries and registrars. Developed to replace older, less efficient protocols, EPP has become the industry standard for domain registration and management operations.
Stateful connections: EPP maintains persistent connections between registrars and registries, reducing overhead and improving performance.
Extensibility: As the name suggests, EPP is designed to be extensible. Registries can add custom extensions to support unique features or requirements.
Standardization: EPP provides a uniform interface across different registries, simplifying integration for registrars and reducing development costs.
Kotlin
Spring
Azure Container Apps (‘ACA’)
The architecture
Registrant (end user) requests to purchase a domain
Website backend sends instruction to EPP API (what we are making in this blog)
EPP API sends command to the EPP server provided by the registry
Response provided by registry and received by registrant (end user) on website
Setting up the development environment
Prerequisites
For this blog, I will be using the following technologies:
Visual Studio Code (VS Code) as the IDE (integrated development environment). I will be installing some extensions and changing some settings to make it work for our technology. Download at Download Visual Studio Code – Mac, Linux, Windows
Docker CLI for containerization and local testing. Download at Get Started | Docker
Azure CLI for deployment to Azure Container Registry & Azure Container Apps (you can use the portal if more comfortable). Download at How to install the Azure CLI | Microsoft Learn
Git for version control and pushing to GitHub to setup CI/CD pipeline. Download at Git – Downloads (git-scm.com)
VS Code Extensions
Kotlin
Spring Initializr Java Support
Implementing EPP with Kotlin & Spring
Creating the project
First up, let us create a blank Spring project. We will do this with the Spring Initializr plugin we just installed:
Press CTRL + SHIFT + P to open the command palette
Select Spring Initializr: Create a Gradle project…
Select version (I recommend 3.3.4)
Select Kotlin as project language
Type Group Id (I am using com.stephen)
Type Artifact ID (I am using eppapi)
Select jar as packaging type
Select any Java version (The version choice is yours)
Add Spring Web as a dependency
Choose a folder
Open project
Your project should look like this:
We are using the Gradle build tool for this project. Gradle is a powerful, flexible build automation tool that supports multi-language development and offers convenient integration with both Kotlin & Spring. Gradle will handle our dependency management, allowing us to focus on our EPP implementation rather than build configuration intricacies.
Adding the EPP dependency
It handles the low-level details of EPP communication, allowing us to focus on business logic.
It is a Java-based implementation, which integrates seamlessly with our Kotlin and Spring setup.
It supports all basic EPP commands out of the box, such as domain checks, registrations and transfers.
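The blog builds on the EPP-RTK for Java toolkit (the com.tucows.oxrs.epprtk and org.openrtk classes imported later). As far as I know it is not published to Maven Central, so it is usually consumed as local JAR files. Here is a minimal sketch of what the dependencies block in build.gradle might look like, with the libs/ path as a placeholder assumption:
dependencies {
    // The Spring Web starter added by the Initializr stays as generated.

    // EPP-RTK: the path is a placeholder; point it at wherever you keep the toolkit JAR(s).
    implementation files('libs/epp-rtk-java.jar')
}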
Modifying the build settings
java {
    toolchain {
        languageVersion = JavaLanguageVersion.of(21)
    }
    sourceCompatibility = JavaVersion.VERSION_21
    targetCompatibility = JavaVersion.VERSION_21
}

kotlin {
    jvmToolchain(21)
}

tasks.withType(org.jetbrains.kotlin.gradle.tasks.KotlinCompile) {
    kotlinOptions {
        jvmTarget = "21"
        freeCompilerArgs = ["-Xjsr305=strict"]
    }
}

tasks.named('test') {
    enabled = false
}
The structure
Rename the main class to EPPAPI.kt (Spring auto generation did not do it justice).
Split the code into two folders: epp and api, with our main class remaining at the root.
Create a class inside the epp folder named EPP.kt – this is where we will connect to and manage the EPPClient soon.
Create a class inside the api folder named API.kt – this is where we will configure and run the Spring API.
api
└── API.kt
epp
└── EPP.kt
HOST=X
PORT=700
USERNAME=X
PASSWORD=X
The code
import java.net.Socket
import java.security.KeyStore
import java.security.cert.X509Certificate
import javax.net.ssl.KeyManagerFactory
import javax.net.ssl.SSLContext
import javax.net.ssl.TrustManager
import javax.net.ssl.X509TrustManager
class EPP private constructor(
    host: String,
    port: Int,
    username: String,
    password: String,
) : EPPClient(host, port, username, password) {

    companion object {
        private val HOST = System.getenv("HOST")
        private val PORT = System.getenv("PORT").toInt()
        private val USERNAME = System.getenv("USERNAME")
        private val PASSWORD = System.getenv("PASSWORD")

        lateinit var client: EPP

        fun create(): EPP {
            println("Creating client with HOST: $HOST, PORT: $PORT, USERNAME: $USERNAME")
            return EPP(HOST, PORT, USERNAME, PASSWORD).apply {
                try {
                    println("Creating SSL socket…")
                    val socket = createSSLSocket()
                    println("SSL socket created. Setting socket to EPP server…")
                    setSocketToEPPServer(socket)
                    println("Socket set. Getting greeting…")
                    val greeting = greeting
                    println("Greeting received: $greeting")
                    println("Connecting…")
                    connect()
                    println("Connected. Logging in…")
                    login(PASSWORD)
                    println("Login successful.")
                    client = this
                } catch (e: Exception) {
                    println("Error during client creation: ${e.message}")
                    e.printStackTrace()
                    throw e
                }
            }
        }

        private fun createSSLSocket(): Socket {
            val sslContext = setupSSLContext()
            return sslContext.socketFactory.createSocket(HOST, PORT) as Socket
        }

        private fun setupSSLContext(): SSLContext {
            val trustAllCerts = arrayOf<TrustManager>(object : X509TrustManager {
                override fun getAcceptedIssuers(): Array<X509Certificate>? = null
                override fun checkClientTrusted(certs: Array<X509Certificate>, authType: String) {}
                override fun checkServerTrusted(certs: Array<X509Certificate>, authType: String) {}
            })
            val keyStore = KeyStore.getInstance(KeyStore.getDefaultType()).apply {
                load(null, null)
            }
            val kmf = KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm()).apply {
                init(keyStore, "".toCharArray())
            }
            return SSLContext.getInstance("TLS").apply {
                init(kmf.keyManagers, trustAllCerts, java.security.SecureRandom())
            }
        }
    }
}

fun main() {
    EPP.create()
}
Domains: These are the web addresses that users type into their browsers. In EPP, a domain object represents the registration of a domain name.
Contacts: These are individuals or entities associated with a domain. There are typically four types of contact: Registrant, Admin, Tech & Billing. ICANN (Internet Corporation for Assigned Names and Numbers) mandates that every provisioned domain must have a valid contact attached to it.
Hosts: Also known as nameservers, these are the servers that translate domain names into IP addresses. In EPP, host objects can either be internal (subordinate to a domain in the registry) or external.
api
└── API.kt
epp
├── contact
├── domain
│ └── CheckDomain.kt
├── host
└── EPP.kt
import epp.EPP
import com.tucows.oxrs.epprtk.rtk.xml.EPPDomainCheck
import org.openrtk.idl.epprtk.domain.epp_DomainCheckReq
import org.openrtk.idl.epprtk.domain.epp_DomainCheckRsp
import org.openrtk.idl.epprtk.epp_Command
fun EPP.Companion.checkDomain(
domainName: String,
): Boolean {
val check = EPPDomainCheck().apply {
setRequestData(
epp_DomainCheckReq(
epp_Command(),
arrayOf(domainName)
)
)
}
val response = client.processAction(check) as EPPDomainCheck
val domainCheck = response.responseData as epp_DomainCheckRsp
return domainCheck.results[0].avail
}
We create an EPPDomainCheck object, which represents an EPP domain check command.
We set the request data using epp_DomainCheckReq. This takes an epp_Command (a generic EPP command) and an array of domain names to check. In this case, we are only checking one domain.
We process the action using our EPP client’s processAction function, which sends the request to the EPP server.
We cast the response to EPPDomainCheck and extract the responseData.
Finally, we return whether the domain is available or not from the first (and only) result by checking the avail value.
fun main() {
    EPP.create()
    println(EPP.checkDomain("example.gg"))
}
import com.tucows.oxrs.epprtk.rtk.xml.EPPContactCreate
import epp.EPP
import org.openrtk.idl.epprtk.contact.*
import org.openrtk.idl.epprtk.epp_AuthInfo
import org.openrtk.idl.epprtk.epp_AuthInfoType
import org.openrtk.idl.epprtk.epp_Command
fun EPP.Companion.createContact(
contactId: String,
name: String,
organization: String? = null,
street: String,
street2: String? = null,
street3: String? = null,
city: String,
state: String? = null,
zip: String? = null,
country: String,
phone: String,
fax: String? = null,
email: String
): Boolean {
val create = EPPContactCreate().apply {
setRequestData(
epp_ContactCreateReq(
epp_Command(),
contactId,
arrayOf(
epp_ContactNameAddress(
epp_ContactPostalInfoType.INT,
name,
organization,
epp_ContactAddress(street, street2, street3, city, state, zip, country)
)
),
phone.let { epp_ContactPhone(null, it) },
fax?.let { epp_ContactPhone(null, it) },
email,
epp_AuthInfo(epp_AuthInfoType.PW, null, "pass")
)
)
}
val response = client.processAction(create) as EPPContactCreate
val contactCreate = response.responseData as epp_ContactCreateRsp
return contactCreate.rsp.results[0].m_code.toInt() == 1000
}
import com.tucows.oxrs.epprtk.rtk.xml.EPPHostCreate
import epp.EPP
import org.openrtk.idl.epprtk.epp_Command
import org.openrtk.idl.epprtk.host.epp_HostAddress
import org.openrtk.idl.epprtk.host.epp_HostAddressType
import org.openrtk.idl.epprtk.host.epp_HostCreateReq
import org.openrtk.idl.epprtk.host.epp_HostCreateRsp
fun EPP.Companion.createHost(
hostName: String,
ipAddresses: Array<String>?
): Boolean {
val create = EPPHostCreate().apply {
setRequestData(
epp_HostCreateReq(
epp_Command(),
hostName,
ipAddresses?.map { epp_HostAddress(epp_HostAddressType.IPV4, it) }?.toTypedArray()
)
)
}
val response = client.processAction(create) as EPPHostCreate
val hostCreate = response.responseData as epp_HostCreateRsp
return hostCreate.rsp.results[0].code.toInt() == 1000
}
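The function above is not exercised in the walkthrough below, but as a quick sketch of how it could be called once EPP.create() has run (the nameserver and IP address are placeholder values, and epp.host.createHost needs to be imported wherever you call it):
val hostCreated = EPP.createHost("ns1.example.gg", arrayOf("192.0.2.1"))
println(if (hostCreated) "Host created" else "Host creation failed")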
import com.tucows.oxrs.epprtk.rtk.xml.EPPDomainCreate
import epp.EPP
import org.openrtk.idl.epprtk.domain.*
import org.openrtk.idl.epprtk.epp_AuthInfo
import org.openrtk.idl.epprtk.epp_AuthInfoType
import org.openrtk.idl.epprtk.epp_Command
fun EPP.Companion.createDomain(
domainName: String,
registrantId: String,
adminContactId: String,
techContactId: String,
billingContactId: String,
nameservers: Array<String>,
password: String,
period: Short = 1
): Boolean {
val create = EPPDomainCreate().apply {
setRequestData(
epp_DomainCreateReq(
epp_Command(),
domainName,
epp_DomainPeriod(epp_DomainPeriodUnitType.YEAR, period),
nameservers,
registrantId,
arrayOf(
epp_DomainContact(epp_DomainContactType.ADMIN, adminContactId),
epp_DomainContact(epp_DomainContactType.TECH, techContactId),
epp_DomainContact(epp_DomainContactType.BILLING, billingContactId)
),
epp_AuthInfo(epp_AuthInfoType.PW, null, password)
)
)
}
val response = client.processAction(create) as EPPDomainCreate
val domainCreate = response.responseData as epp_DomainCreateRsp
return domainCreate.rsp.results[0].code.toInt() == 1000
}
import epp.EPP
import epp.contact.createContact
import epp.domain.createDomain

fun main() {
    EPP.create()

    val contactResponse = EPP.createContact(
        contactId = "12345",
        name = "Stephen",
        organization = "Test",
        street = "Test Street",
        street2 = "Test Street 2",
        street3 = "Test Street 3",
        city = "Test City",
        state = "Test State",
        zip = "Test Zip",
        country = "GB",
        phone = "1234567890",
        fax = "1234567890",
        email = "test@gg.com"
    )

    if (contactResponse) {
        println("Contact created")
    } else {
        println("Contact creation failed")
        return
    }

    val domainResponse = EPP.createDomain(
        domainName = "randomavailabletestdomain.gg",
        registrantId = "123",
        adminContactId = "123",
        techContactId = "123",
        billingContactId = "123",
        nameservers = arrayOf("ernest.ns.cloudflare.com", "adaline.ns.cloudflare.com"),
        password = "XYZXYZ",
        period = 1
    )

    if (domainResponse) {
        println("Domain created")
    } else {
        println("Domain creation failed")
    }
}
Domain created
org.openrtk.idl.epprtk.domain.epp_DomainCreateRsp: { m_rsp [org.openrtk.idl.epprtk.epp_Response: { m_results [[org.openrtk.idl.epprtk.epp_Result: { m_code [1000] m_values [null] m_ext_values [null] m_msg [Command completed successfully] m_lang [] }]] m_message_queue [org.openrtk.idl.epprtk.epp_MessageQueue: { m_count [4] m_queue_date [null] m_msg [null] m_id [916211] }] m_extension_strings [null] m_trans_id [org.openrtk.idl.epprtk.epp_TransID: { m_client_trid [null] m_server_trid [1728110331467] }] }] m_name [randomavailabletestdomain2.gg] m_creation_date [2024-10-05T06:38:51.464Z] m_expiration_date [2025-10-05T06:38:51.493Z] }
Both of those objects were created using our extension functions on top of the EPP-RTK which is in contact with my target EPP server. If your registry has a user interface, you should see that these objects have now been created and are usable going forward. For example, one contact can be used for multiple domains. For my case study, you can see that both objects were successfully created on the Channel Isles side through our EPP communication:
Domain check
Domain info
Domain create
Domain update
Domain delete
Domain transfer
Contact check
Contact info
Contact create
Contact update
Contact delete
Contact transfer
Host check
Host info
Host create
Host update
Host delete
api
├── controller
│ └── ContactController.kt
│ └── DomainController.kt
│ └── HostController.kt
└── API.kt
epp
├── contact
├── domain
│ └── CheckDomain.kt
├── host
└── EPP.kt
The job of controllers in Spring is to handle incoming HTTP requests, process them and return appropriate responses. In the context of our EPP API, controllers will act as the bridge between the client interface and our EPP functionality. Therefore, it makes logical sense to split up the three major sections into multiple classes so that the code does not become unmaintainable.
import epp.EPP
import epp.domain.checkDomain
import org.springframework.http.ResponseEntity
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RequestParam
import org.springframework.web.bind.annotation.RestController

@RestController
class DomainController {

    @GetMapping("/domain-check")
    fun helloWorld(@RequestParam name: String): ResponseEntity<Map<String, Any>> {
        val check = EPP.checkDomain(name)
        return ResponseEntity.ok(
            mapOf(
                "available" to check
            )
        )
    }
}
@GetMapping("/domain-check"): This annotation maps HTTP GET requests to the /domain-check route. When a GET request is made to this URL, Spring will call this function to handle it.
fun helloWorld(@RequestParam name: String): This is the function that will handle the request. The @RequestParam annotation tells Spring to extract the name parameter from the query string of the URL. For example, a request to /domain-check?name=example.gg would set name to example.gg. This allows us to then process the EPP command with the requested domain name.
ResponseEntity<Map<String, Any>>: This is the return type of the function. ResponseEntity allows us to have full control over the HTTP response, including status code, headers and body.
val check = EPP.checkDomain(name): This line calls our EPP function to check if the domain is available (remember, it returns true if available and false if not).
return ResponseEntity.ok(mapOf(“available” to check)): This creates a response with HTTP status 200 (OK) and a body containing the JSON object with a single key available whose value is the result of the domain check.
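The other controllers in the structure above follow the same pattern. As a sketch that is not part of the original post, a ContactController could expose the createContact extension from earlier; the route, payload field names and response shape are my assumptions, and JSON deserialization of the data class assumes jackson-module-kotlin is on the classpath (Spring Initializr adds it for Kotlin projects):
import epp.EPP
import epp.contact.createContact
import org.springframework.http.ResponseEntity
import org.springframework.web.bind.annotation.PostMapping
import org.springframework.web.bind.annotation.RequestBody
import org.springframework.web.bind.annotation.RestController

// Hypothetical request payload; only the mandatory createContact parameters are included.
data class CreateContactRequest(
    val contactId: String,
    val name: String,
    val street: String,
    val city: String,
    val country: String,
    val phone: String,
    val email: String
)

@RestController
class ContactController {

    @PostMapping("/contact-create")
    fun create(@RequestBody request: CreateContactRequest): ResponseEntity<Map<String, Any>> {
        // Delegate to the EPP.Companion.createContact extension defined earlier.
        val created = EPP.createContact(
            contactId = request.contactId,
            name = request.name,
            street = request.street,
            city = request.city,
            country = request.country,
            phone = request.phone,
            email = request.email
        )
        return ResponseEntity.ok(mapOf("created" to created))
    }
}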
import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.runApplication

@SpringBootApplication
class API {
    companion object {
        fun start() {
            runApplication<API>()
        }
    }
}
import api.API
import epp.EPP

fun main() {
    EPP.create()
    API.start()
}
Creating SSL socket…
SSL socket created. Setting socket to EPP server…
Socket set. Getting greeting…
Greeting received: org.openrtk.idl.epprtk.epp_Greeting: { m_server_id [OTE] m_server_date [2024-10-06T05:47:08.628Z] m_svc_menu [org.openrtk.idl.epprtk.epp_ServiceMenu: { m_versions [[1.0]] m_langs [[en]] m_services [[urn:ietf:params:xml:ns:contact-1.0, urn:ietf:params:xml:ns:domain-1.0, urn:ietf:params:xml:ns:host-1.0]] m_extensions [[urn:ietf:params:xml:ns:rgp-1.0, urn:ietf:params:xml:ns:auxcontact-0.1, urn:ietf:params:xml:ns:secDNS-1.1, urn:ietf:params:xml:ns:epp:fee-1.0]] }] m_dcp [org.openrtk.idl.epprtk.epp_DataCollectionPolicy: { m_access [all] m_statements [[org.openrtk.idl.epprtk.epp_dcpStatement: { m_purposes [[admin, prov]] m_recipients [[org.openrtk.idl.epprtk.epp_dcpRecipient: { m_type [ours] m_rec_desc [null] }, org.openrtk.idl.epprtk.epp_dcpRecipient: { m_type [public] m_rec_desc [null] }]] m_retention [stated] }]] m_expiry [null] }] }
Connecting…
Connected. Logging in…
Login successful.
  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
:: Spring Boot :: (v2.7.18)
2024-10-06 06:47:09.531 INFO 43872 --- [ main] com.stephen.eppapi.EPPAPIKt : Starting EPPAPIKt using Java 1.8.0_382 on STEPHEN with PID 43872 (D:\IntelliJ Projects\epp-api\build\classes\kotlin\main started by [Redacted] in D:\IntelliJ Projects\epp-api)
2024-10-06 06:47:09.534 INFO 43872 --- [ main] com.stephen.eppapi.EPPAPIKt : No active profile set, falling back to 1 default profile: "default"
2024-10-06 06:47:10.403 INFO 43872 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8080 (http)
2024-10-06 06:47:10.414 INFO 43872 --- [ main] o.apache.catalina.core.StandardService : Starting service [Tomcat]
2024-10-06 06:47:10.414 INFO 43872 --- [ main] org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.83]
2024-10-06 06:47:10.511 INFO 43872 --- [ main] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext
2024-10-06 06:47:10.511 INFO 43872 --- [ main] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 928 ms
2024-10-06 06:47:11.220 INFO 43872 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8080 (http) with context path ''
2024-10-06 06:47:11.229 INFO 43872 --- [ main] com.stephen.eppapi.EPPAPIKt : Started EPPAPIKt in 2.087 seconds (JVM running for 3.574)
/domain-check?name=test.gg – {"available":false}
/domain-check?name=thisshouldprobablybeavailable.gg – {"available":true}
Deploying to Azure Container Apps
Now that we have our EPP API functioning locally, it is time to think about productionizing our application. Our goal is to run the API as an Azure Container App (ACA), which is a fully managed environment perfect for easy deployment and scaling of our Spring application. However, before deploying to ACA, we will need to containerise our application. This is where Azure Container Registry (ACR) comes into play. ACR will serve as the private Docker registry to store and manage our container images. It provides a centralised repository for our Docker images and integrates seamlessly with ACA, streamlining our CI/CD pipeline.
FROM openjdk:21-jdk-alpine
# Set the working directory in the container
WORKDIR /app
# Copy the JAR file into the container
COPY build/libs/*.jar app.jar
# Expose the port your application runs on
EXPOSE 8080
# Command to run the application
CMD ["java", "-jar", "app.jar"]
./gradlew build – build our application and package into a JAR file found under /build/libs/X.jar.
docker build -t epp-api . – tells Docker to create an image named epp-api based on the instructions in our Dockerfile.
docker run -p 8080:8080 --env-file .env epp-api – start a container from the image, mapping port 8080 of the container to port 8080 on the host machine. We use this port because this is the default port on which Spring exposes endpoints. The -p flag ensures that the application can be accessed through localhost:8080 on your machine. We also specify the .env file we created earlier so that Docker is aware of our EPP login details.
az login – if not already authenticated, be sure to log in through the CLI.
az group create --name registrar --location uksouth – create a resource group if you have not already. I have named mine registrar and chosen the location as uksouth because that is closest to me.
az acr create --resource-group registrar --name registrarcontainers --sku Basic – create an Azure Container Registry resource within our registrar resource group, with the name of registrarcontainers (note that this has to be globally unique) and SKU Basic.
az acr login --name registrarcontainers – log in to the Azure Container Registry.
docker tag epp-api registrarcontainers.azurecr.io/epp-api:v1 – tag the local Docker image with the ACR login server name.
docker push registrarcontainers.azurecr.io/epp-api:v1 – push the image to the container registry!
2111bc7193f6: Pushed
1b04c1ea1955: Pushed
ceaf9e1ebef5: Pushed
9b9b7f3d56a0: Pushed
f1b5933fe4b5: Pushed
v1: digest: sha256:07eba5b555f78502121691b10cd09365be927eff7b2e9db1eb75c072d4bd75d6 size: 1365
az containerapp env create --resource-group registrar --name containers --location uksouth – create the Container App environment within our resource group with name containers and location uksouth.
az acr update -n registrarcontainers --admin-enabled true – ensure ACR allows admin access.
az containerapp create
--name epp-api
--resource-group registrar
--environment containers
--image registrarcontainers.azurecr.io/epp-api:v1
--target-port 8080
--ingress external
--registry-server registrarcontainers.azurecr.io
--env-vars "HOST=your_host" "PORT=your_port" "USERNAME=your_username" "PASSWORD=your_password"
– creates a new Container App named epp-api within our resource group and the containers environment. It uses the Docker image stored in the ACR. The application inside the container is configured to listen on port 8080, which is where our Spring endpoints will be accessible. The --ingress external flag makes it accessible from the internet. You must also set your environment variables or the app will crash.
Setting up GitHub CI/CD
git init – Initialise a new Git repository in your current directory. This creates a hidden .git directory that stores the repository’s metadata.
git add . – Stages all of the files in the current directory and its subdirectories for commit. This means that these files will be included in the next commit.
git commit -m "Initial commit" – Creates a new commit with the staged files and a common initial commit message.
git remote add origin <URL> – Adds a remote repository named origin to your local repository, connecting it to our remote Git repository hosted on GitHub.
git push origin master – Uploads the local repository’s content to the remote repository named origin, specifically to the master branch.
Head to your Container App
On the sidebar, hit Settings
Hit Deployment
You should find yourself in the Continuous deployment section. There are two headings; let us start with GitHub settings:
Authenticate into GitHub and provide permissions to repository (if published to a GH organization, give permissions also)
Select organization, or your GitHub name if published on personal account
Select the repository you just created (for me, epp-api)
Select the main branch (likely either master or main)
Then, under Registry settings:
Ensure Azure Container Registry is selected for Repository source
Select the Container Registry you created earlier (for me, registrarcontainers)
Select the image you created earlier (for me, epp-api)
It should look something like this:
      run: chmod +x gradlew
    - name: Set up JDK 21
      uses: actions/setup-java@v2
      with:
        java-version: '21'
        distribution: 'adopt'
    - name: Build with Gradle
      run: ./gradlew build
Grant execute permission to gradlew – gradlew is a wrapper script that helps manage Gradle installations. This step grants execute permission to the gradlew file which allows this build process to execute Gradle commands, needed for the next steps.
Set up JDK – This sets up the JDK as the Java environment for the build process. Make sure this matches the Java version you have chosen to use for this tutorial.
Build with Gradle – This executes the Gradle build process which will compile our Java code and package it into a JAR file which will then be used by the last job to push to the Container Registry.
The final workflow file should look like this:
name: Trigger auto deployment
# When this action will be executed
on:
  # Automatically trigger it when detected changes in repo
  push:
    branches:
      [ master ]
    paths:
      - '**'
      - '.github/workflows/AutoDeployTrigger-aec369b2-f21b-47f6-8915-0d087617a092.yml'

  # Allow manual trigger
  workflow_dispatch:

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    permissions:
      id-token: write # This is required for requesting the OIDC JWT Token
      contents: read # Required when GH token is used to authenticate with private repo

    steps:
      - name: Checkout to the branch
        uses: actions/checkout@v2

      - name: Grant execute permission for gradlew
        run: chmod +x gradlew

      - name: Set up JDK 21
        uses: actions/setup-java@v2
        with:
          java-version: '21'
          distribution: 'adopt'

      - name: Build with Gradle
        run: ./gradlew build

      - name: Azure Login
        uses: azure/login@v1
        with:
          client-id: ${{ secrets.AZURE_CLIENT_ID }}
          tenant-id: ${{ secrets.AZURE_TENANT_ID }}
          subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}

      - name: Build and push container image to registry
        uses: azure/container-apps-deploy-action@v2
        with:
          appSourcePath: ${{ github.workspace }}
          _dockerfilePathKey_: _dockerfilePath_
          registryUrl: fdcontainers.azurecr.io
          registryUsername: ${{ secrets.REGISTRY_USERNAME }}
          registryPassword: ${{ secrets.REGISTRY_PASSWORD }}
          containerAppName: epp-api
          resourceGroup: registrar
          imageToBuild: registrarcontainers.azurecr.io/fdspring:${{ github.sha }}
          _buildArgumentsKey_: |
            _buildArgumentsValues_
Conclusion
That is it! You have successfully built a robust EPP API using Kotlin and Spring Boot and now containerised it with Docker and deployed it to Azure Container Apps. This journey took us from understanding the intricacies of EPP and domain registration, through implementing core EPP operations, to creating a user-friendly RESTful API. We then containerised our application, ensuring consistency across different environments. Finally, we leveraged Azure’s powerful cloud services – Azure Container Registry for storing our Docker image, and Azure Container Apps for deploying and running our application in a scalable, managed environment. The result is a fully functional, cloud-hosted API that can handle domain checks, registrations and other EPP operations. This accomplishment not only showcases the technical implementation but also opens up possibilities for creating sophisticated domain management tools and services, such as by starting a public registrar or managing a domain portfolio internally.
I hope this blog was useful, and I am happy to answer any questions in the replies. Well done on bringing this complex system to life!
Remove deleted files.
Hi,
How do you remove files/documents from the “Latest” list that no longer exist in my OneDrive or on my computer?
Thank you.
Teams Mobile – “Custom Notifications” NOT WORKING
When selecting “Notifications > Notify me for > Customized”, I am just being sent to the Android app notifications for this app, which does not have anything to do with custom notifications, only Permissions, where all the necessary allowances are given.
It has been like this for months…
This used to work on my old S21 ..
TLDR ; If I select customized > I am being sent to Android app notification permissions?!
Samsung Galaxy S24U, Mobile app is up to date.
Excel worksheet calculation error
A1=30.8, A2=A1; B2=33.3, B3=B2+0.01; F2=ROUND(B2*(A2-0.8),0)
B87=B86+0.01=34.15, A87=A86=30.8, the formula in F87 is =ROUND(B87*(A87-0.8),0), and the worksheet calculates F87=1024
K1=30.8, K2=K1; L2=33.3, L3=L2+0.01; P2=ROUND(L2*(K2-0.8),0)
B87=B86+0.01=34.15, A87=A86=30.8, the formula in P37 is =ROUND(L37*(K37-0.8),0), and the worksheet calculates P37=1025
According to the formulas above, ROUND(B87*(A87-0.8),0) is (34.15*(30.8-0.8))=1024.5, which should round to 1025. The result in F87 is wrong and P37 is correct. How can this error (F87) be fixed?
Thanks
How to convert PDF to MOBI on my Windows 11 computer (Without loss)?
I often have eBooks and documents in PDF that I want to read on my Kindle, so I’m looking for assistance with converting PDF files to MOBI format on my Windows 11 computer. And I’m unsure of the best method to make this conversion. While I’ve encountered a few software options and online tools, I’m not certain which ones are reliable and easy to use, especially since I want to preserve the layout and formatting of the original document.
I would greatly appreciate your advice. Thank you.
Help making something like a decision tree with dropdowns
Hello,
I would like help creating a thing, please. I’m pretty confident that Excel will be able to handle it, but I’m not sure how to go about it.
I work in a medical practice, and for this one task we have a very complicated assigning structure which I’m wanting to simplify. The task is requested by a Doctor, and is then assigned to a Person to report on, based on who the requesting Doctor is and what day of the week it is. It could look like any of the following scenarios:
Dr A requests the test on Wednesday, and so it is assigned to Person Z
Dr A requests the test on Tuesday, and so it is assigned to Person Y
Dr B requests the test and it is assigned to Person Y
Dr C requests the test and it is assigned to Person X, except if it’s a Tuesday then it’s Person Y
Dr D requests the test and it is assigned to Person W, but only once a day, and after that it goes to person Y on a Wednesday, and Person X every other day
This task involves a list of maybe 50 Doctors, and almost as many Persons. I’m hoping to eventually have an Excel form where I can select a day of the week and a Doctor, and it will tell me who the Person is.
I hope that that makes sense, and I’m hoping someone might be able to help point me in the right direction please? Thanks
How to copy text from a PDF file opened through teams
Teams includes a file browser. I can open PDF files through it. But I can’t manage to copy text. I can even select text, but CTRL+C does not work.
How does it work?
And if it is impossible…why on earth does teams not provide the most basic functionality?
15% discount for eligible nonprofit customers on Microsoft 365 Copilot effective November 1, 2024
Microsoft 365 Copilot is now available to eligible nonprofits for $25.50 (USD) per user per month when purchased for an annual subscription and billing cycle.
To purchase Microsoft 365 Copilot, customers must have or purchase a separate license for a qualifying Microsoft 365 plan.
Eligible nonprofit customers can switch to the nonprofit-priced Microsoft 365 Copilot subscription at renewal. If they wish to add new licenses before renewal, they can start a new subscription 11/1 onwards.
Discount details
Geography: Available in all markets where Microsoft 365 Copilot is sold on CSP (Worldwide).
Customer eligibility: All eligible nonprofit customers (new and existing Microsoft 365 Copilot customers).
There is no minimum purchase requirement, and no limit on the number of licenses you can buy.
Eligible nonprofit customers must purchase a new subscription; licenses added to existing subscriptions won’t receive the nonprofit discount.
The discount is available on New Commerce Experience (NCE) only.
Additional resources:
Leverage nonprofit Copilot GTM resources for your customer conversations: Modern Work | Drive Business Transformation with Copilot
Partners, invite your nonprofit customers to the Copilot Quickstart Trainings for nonprofits.
Enhance Cost Optimization in Azure Cosmos DB Without Compromising Service
Optimizing Azure Cosmos DB involves using strategies and best practices to reduce the overall spend on the service while maintaining or improving performance and availability.
In this blog, I’ll explore actionable strategies and tools for reducing your Azure Cosmos DB costs without sacrificing the speed, scalability, or reliability of your database. Whether you’re managing large-scale applications or developing on a budget, these insights will help you make the most out of Azure Cosmos DB.
The main sub-topics we shall dive into include:
Pricing model
Free development
Plan for Optimization
1. Pricing model
Azure Cosmos DB bills for three types of usage: compute, storage, and bandwidth. What does this mean? Let us break it down.
Compute
Azure Cosmos DB bills compute based on Request Units (RU) measured per second (RU/s). Request Units are the currency for throughput: they represent the cost of operations on your database, whether reads, writes, or queries. Each operation consumes a certain number of RUs based on its complexity and size.
Storage
Azure Cosmos DB bills for consumed storage, rounded up to the next gigabyte (GB) per container, collection, table, or graph per region.
Bandwidth
Data transfer between Azure Cosmos DB and other Azure services or the internet incurs additional costs. The exact pricing depends on the amount of data transferred.
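As a rough illustration of how the three meters add up, here is a small back-of-envelope sketch in Kotlin. The rates passed into it are deliberately made-up placeholders, not official prices; always check the Azure Cosmos DB pricing page for current figures.
// Back-of-envelope monthly estimate for a provisioned-throughput account.
// NOTE: the rate arguments are illustrative placeholders, not official Azure prices.
fun estimateMonthlyCost(
    provisionedRuPerSec: Int,        // throughput you provision (RU/s)
    storageGb: Double,               // data plus index storage consumed
    egressGb: Double,                // data transferred out (bandwidth)
    ruRatePer100RuHour: Double,      // $ per 100 RU/s per hour (placeholder)
    storageRatePerGbMonth: Double,   // $ per GB per month (placeholder)
    egressRatePerGb: Double          // $ per GB transferred (placeholder)
): Double {
    val hoursPerMonth = 730.0
    val compute = provisionedRuPerSec / 100.0 * ruRatePer100RuHour * hoursPerMonth
    val storage = storageGb * storageRatePerGbMonth
    val bandwidth = egressGb * egressRatePerGb
    return compute + storage + bandwidth
}

fun main() {
    // Example: 1,000 RU/s, 50 GB stored, 10 GB egress, with made-up unit rates.
    println(estimateMonthlyCost(1000, 50.0, 10.0, 0.008, 0.25, 0.09))
}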
Types of Azure Cosmos DB accounts
You are charged depending on the type of account you have. An Azure Cosmos DB account can be either a provisioned throughput account or a serverless account.
Provisioned Throughput.
Think of provisioned throughput as the “speed limit” for your Azure Cosmos DB database and containers. It determines how many requests (or operations) your database or container can handle per second. Just like a highway with different speed limits, you allocate a specific amount of throughput to your database or container.
Based on your workloads, you can scale either upwards or downwards. However, there is a minimum throughput requirement to guarantee SLAs. You will be charged for the provisioned throughput even when you do not run any workloads, because the model dedicates resources to your container or database.
What is the difference between an Azure Cosmos DB database and an Azure Cosmos DB container?
An Azure Cosmos DB database is a storage area where you can organize a set of containers. You can create multiple containers to hold different types of data.
Azure Cosmos DB containers are individual storage compartments within a database. They help organize data effectively. An advantage is that you can set a different throughput for each container based on its workload.
Example/ Illustration
Imagine you are developing an e-commerce application. Your database is called OnlineStore. It contains the following containers:
Products – to store product details
Customers – to store customer information
Orders – to store orders placed
Database-level and container-level throughput in Azure Cosmos DB
When you set throughput at the database level, it applies to all containers within that database. You can choose between two options for provisioned throughput.
Standard (Manual) Throughput: This is a fixed speed limit for a specific container.
Autoscale Throughput: The speed limit adjusts automatically based on demand.
The throughput is evenly distributed across all the partitions. If a container gets more requests than its allocation, it might slow down.
Throughput on a container.
Throughput provisioned for a container is evenly distributed among its physical partitions, assuming a good partition key that distributes the logical partitions evenly across them. At times, a logical partition may consume more than the throughput allocated to its underlying physical partition. If that happens, operations will be rate limited, and you may need to reprovision the throughput for the entire container. Learn more about Partitioning and horizontal scaling in Azure Cosmos DB
The image shows how a physical partition hosts one or more logical partitions of a container:
Throughput on a database.
As stated earlier, when you provision throughput on a database, it is shared across all the containers, unless you specify dedicated throughput for particular containers.
It is recommended that when you configure throughput for your database, you share it across all the containers rather than dedicating it to any particular container.
The image below demonstrates how a physical partition can host one or more logical partitions that belong to different containers within a database:
Learn more about provisioned throughput in Azure Cosmos DB
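To make the two scopes concrete, here is a minimal sketch using the Azure Cosmos DB Java SDK (azure-cosmos, v4) from Kotlin, reusing the OnlineStore example above. The endpoint, key, container name and partition key path are placeholders:
import com.azure.cosmos.CosmosClientBuilder
import com.azure.cosmos.models.CosmosContainerProperties
import com.azure.cosmos.models.ThroughputProperties

fun main() {
    // Placeholder endpoint and key; use your own account values.
    val client = CosmosClientBuilder()
        .endpoint("https://<your-account>.documents.azure.com:443/")
        .key("<your-key>")
        .buildClient()

    // Database-level (shared) throughput: containers in OnlineStore draw from this 400 RU/s pool
    // unless they are given dedicated throughput of their own.
    client.createDatabaseIfNotExists("OnlineStore", ThroughputProperties.createManualThroughput(400))
    val database = client.getDatabase("OnlineStore")

    // Container-level autoscale throughput: Orders scales independently, up to 1,000 RU/s.
    database.createContainerIfNotExists(
        CosmosContainerProperties("Orders", "/customerId"),
        ThroughputProperties.createAutoscaledThroughput(1000)
    )

    client.close()
}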
Serverless Account
The serverless account type in Azure Cosmos DB is designed for scenarios where you want to pay only for the resources you use. With the serverless option, you’re charged based on the request units (RUs) consumed by your database operations and the storage used by your data.
Use cases for a serverless account
You are developing or testing a service
When traffic is unpredictable
When integrating with serverless compute services, like Azure Functions.
Learn more about Azure Cosmos DB serverless account type
2. Free development
The second way to optimize your spending on Azure Cosmos DB is free development. There are two ways to develop for free: you can use the free tier or use the emulator.
Free tier
The Azure Cosmos DB free tier makes it easy to get started, develop, test your applications, or even run small production workloads for free.
When you enable the free tier on an Azure Cosmos DB account, you’ll receive the following benefits:
First 1000 RU/s (Request Units per second): You get the first 1000 RU/s for free.
25 GB of Storage: You also receive 25 GB of storage at no cost.
Beyond these limits, any additional throughput or storage consumed is billed at regular prices.
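As a rough worked example: if a free tier account provisions 1,400 RU/s and stores 30 GB, only the 400 RU/s and 5 GB above the free allowance are billed at the regular rates.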
The free tier is available for all API accounts with provisioned throughput, autoscale throughput, single, or multiple write regions.
With the free tier, you do not accrue any cost; however, you do not have the option for high availability.
Note that free tier is not available for serverless accounts.
You can have up to one free tier Azure Cosmos DB account per Azure subscription. If you don’t see the option to apply the free tier discount, another account in the same subscription has already been enabled with the free tier.
Learn more about Azure Cosmos DB lifetime free tier
Use the emulator
The Azure Cosmos DB emulator is a powerful tool that provides a local environment for emulating the Azure Cosmos DB service. It is mostly used for development and testing. With the emulator, you can develop without the need for an Azure subscription.
The emulator’s Data Explorer pane is only supported in the API for NoSQL and API for MongoDB. Understand the differences between the emulator and cloud service
3. Plan for Optimization
Estimate Costs Before Creating Resources:
Use the Azure Cosmos DB capacity calculator to estimate your workload cost before creating any resources. This tool allows you to input details such as the number of regions, data stored, and anticipated operations volume.
It provides an estimate of storage costs and provisioned throughput based on your workload parameters.
Here is an image to show how to use the capacity calculator. Visit the Documentation on Estimating RU/s using the Azure Cosmos DB capacity planner to learn how to use it
Understand the Full Billing Model
Azure Cosmos DB runs on Azure infrastructure, and costs accrue when you deploy new resources. Be aware that the costs for Azure Cosmos DB are only a portion of your monthly Azure bill, which includes all Azure services and resources used in your subscription, including third-party services.
Understand the full billing model
Monitor Costs
As you use resources with Azure Cosmos DB, you incur costs. Regularly monitor your usage and spending to stay within budget.
You can view this on Azure portal.
Step 1: Log in to the Azure portal and go to your resource.
Step 2: Go to the Overview section.
Step 3: At the bottom, click on Monitoring to show the chart. You can view the usage at intervals of 1 hour, 24 hours, 7 days, and 30 days, and see the estimates on the chart.
Create Budgets
Set budgets in Azure Cost Management to manage costs effectively. Create alerts that will notify you of spending anomalies and overspending risks.
Here are images to show you how you can achieve that.
Step 1: In your resource, expand Monitoring, select Alerts, and click Add rule.
Step 2: Select a signal; I will select CPU Credits Consumed.
Step 3: Set the threshold that will trigger the actions.
Step 4: On the Actions tab, select the actions. I will go with ‘Use quick actions’, a feature which is in preview. It will send me an email to alert me. Give the action group a name and a display name.
Step 5: On the Details tab, add a name for the alert, then review and create the alert.
Conclusion
We have covered several ways you can optimize your resources on Azure Cosmos DB. By following these practices, you can optimize costs while your services keep running as expected. To learn more, you can visit the links shared below.
Read More
Total Cost of Ownership (TCO) with Azure Cosmos DB
Best practices for scaling provisioned throughput (RU/s)
Provisioned throughput in Azure Cosmos DB
Learn more on how to plan and manage costs for Azure Cosmos DB
Optimize request cost in Azure Cosmos DB
Optimize storage cost in Azure Cosmos DB
Optimize multi-region cost in Azure Cosmos DB
Optimize development and testing cost in Azure Cosmos DB
How to disable annoying popup in Outlook
Over the past few days, Outlook desktop (new) version 1.2024.1023.300 has been showing an annoying popup (as shown in the image) every time I open Outlook. It’s becoming quite frustrating.
I’ve tried clicking both buttons and even submitted a report, but the popup still appears every time I open Outlook.