Month: October 2024
Connect-MgGraph -UseDeviceCode does not prompt MFA
I am investigating different Microsoft Entra ID sign-in mechanisms to confirm that Microsoft Graph API access enforces MFA. While the Connect-MgGraph cmdlet prompts for MFA on its own and alongside many other flags such as -TenantId, Connect-MgGraph -UseDeviceCode does not prompt for MFA.
The obvious question is: "Are you sure MFA has been configured on your Azure tenant?" Good question. The answer is that only the use of -UseDeviceCode fails to prompt for MFA, so something other than the MFA setup on our tenant must be wrong.
Has anyone else witnessed this?
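For reference, the commands I am comparing look like this (tenant ID and scopes are placeholders):

# Interactive browser sign-in – this path prompts for MFA as expected
Connect-MgGraph -TenantId "<your-tenant-id>" -Scopes "User.Read"

# Device code sign-in – the path that reportedly skips the MFA prompt
Connect-MgGraph -UseDeviceCode -TenantId "<your-tenant-id>" -Scopes "User.Read"

# Inspect the resulting session after each sign-in
Get-MgContext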
Cannot start SQL Server instance: did not respond in a timely fashion
Suddenly my SQL Server instance stopped. When I try to start it in SQL Server Configuration Manager I get the following output:
Meanwhile, although I can't remember exactly, I may have tried installing SQL Server Express 2022; the installed version is SQL Server 2019. Could that have broken the server?
Furthermore, I have already tried all the fixes proposed on the internet.
Where can I find the logs?
Thanks in advance!
Accidentially changed my admin user to a guest user … and cannot access azure anymore
Hi,
I did something stupid and now cannot access my Azure anymore:
1. I wanted to integrate Microsoft Teams into my Google account.
2. To do this, I had to create an account via Azure – or so I was told on different platforms.
3. But since it's the same email address (***@gmail.com), I couldn't invite myself as an internal user (I already was one!), so… I turned myself into a guest user, intending to invite myself again.
Obviously, this did not work – and now I cannot access Azure anymore. Can someone help me restore my account?
Here are the error messages I got:
{
  "sessionId": "448959718eec490b89d366bb329cecec",
  "errors": [
    {
      "errorMessage": "interaction_required: AADSTS16000: User account '{EUII Hidden}' from identity provider 'live.com' does not exist in tenant 'Microsoft Services' and cannot access the application 'b677c290-cf4b-4a8e-a60e-91ba650a4abe'(AzurePortal Console App) in that tenant. The account needs to be added as an external user in the tenant first. Sign out and sign in again with a different Azure Active Directory user account. Trace ID: c5672f20-06a6-421d-91d3-33ec930b3e00 Correlation ID: 304caf2a-0555-4935-bdb2-7724571ffebe Timestamp: 2024-10-15 10:35:32Z",
      "clientId": "b677c290-cf4b-4a8e-a60e-91ba650a4abe",
      "scopes": [
        "https://management.core.windows.net//.default"
      ]
    },
    {
      "errorMessage": "interaction_required: AADSTS16000: User account '{EUII Hidden}' from identity provider 'live.com' does not exist in tenant 'Microsoft Services' and cannot access the application 'b677c290-cf4b-4a8e-a60e-91ba650a4abe'(AzurePortal Console App) in that tenant. The account needs to be added as an external user in the tenant first. Sign out and sign in again with a different Azure Active Directory user account. Trace ID: a889d113-15ce-4c6d-9b36-f62228521f00 Correlation ID: 0625b216-085c-40df-9547-e233841dac35 Timestamp: 2024-10-15 10:35:32Z",
      "clientId": "b677c290-cf4b-4a8e-a60e-91ba650a4abe",
      "scopes": [
        "https://management.core.windows.net//.default"
      ]
    }
  ]
}
I lost my Admin privileges in Microsoft 365
I work at a corporate company, and we have long been paying for services like Azure, Power BI, etc. Until today I was logging in to the Microsoft 365 admin portal with my admin account, but today that email has lost its admin privileges.
To recover the account, I tried calling support directly, which meant going through an automated voice assistant. Even after finally getting connected, the only way they could help was if I told them the current admin account's email address – which is exactly why we called them in the first place: we have a security breach and don't know who did it. I had all my previous admin accounts with credentials, all the payment details, and so on, but I spent about 20 minutes with someone who just kept repeating "tell me the current admin email so we can help you further." If I knew that, why would I even call? I have all the details of my previous setup, but how can I know what email the attacker switched to in just one day?
What’s new in FinOps toolkit 0.6 – September 2024
Whether you consider yourself a FinOps practitioner, someone who’s enthusiastic about driving cloud efficiency and maximizing the value you get from the cloud or were just asked to look at ways to reduce cost, the FinOps toolkit has something for you. This month, we’re excited to share a new library for FinOps best practices, new Power BI reports for governance and workload optimization, promoted tags in Power BI, more datasets in FinOps hubs, a consolidated tool for all FinOps workbooks, more Azure Optimization Engine improvements, an updated services mapping file that includes FOCUS 1.1 ServiceSubcategory, and other small improvements and bug fixes.
New to FinOps toolkit?
In case you haven’t heard, the FinOps toolkit is an open-source collection of tools and resources that help you learn, adopt, and implement FinOps in the Microsoft Cloud. The foundation of the toolkit is the Implementing FinOps guide that helps you get started with FinOps whether you’re using native tools in the Azure portal, looking for ways to automate and extend those tools, or if you’re looking to build your own FinOps tools and reports. To learn more about the toolkit, how to provide feedback, or how to contribute, see the FinOps toolkit site.
Introducing the FinOps best practice library
FinOps is an extremely broad space: you may be looking for more insight into your usage, how that usage is priced, how to identify anomalies based on unique pricing models, how to allocate and build a chargeback model for shared costs, how to forecast and budget for them, and so on. And this isn't something you do once, either. This is something you need to do for each and every service. And as new services and pricing models are introduced, this challenge continues to grow year after year. Learning about each of these areas for every service your organization uses requires a staggering effort. Building out a collection of lessons learned and proven practices and formalizing those into flexible tools and resources was one of the foundational goals of the FinOps toolkit. And with that, we are happy to introduce the FinOps best practices library.
As a starting point, we’ve pulled in some of the key queries from the Cost optimization workbook. Going forward, we will continue to build out the library to include more than just queries, but also cover tips and tricks for how to understand, optimize, and quantify the value of each of the services you use. Of course, as I mentioned, this is a very formidable task. With that, we are looking for feedback on what you would like to see next. And for those who’ve amassed their own collection of proven practices, we encourage you to share them with others via this central resource.
To learn more about the FinOps best practices library, see Unlocking Azure savings: Introducing the FinOps best practices library. And if you have any requests to add to the library or want to submit your own tips and tricks, create an issue or, better yet, submit a pull request! Learn more about the many ways to contribute and jump right in!
New Power BI reports for governance and workload optimization
In today's fast-paced environment, engineering, business, and finance teams must work together to accelerate product development and maximize business value through better financial control and predictability. But this can only happen when FinOps data is easily accessible to all stakeholders. And while engineers have many tools in the Azure portal, business and finance teams historically haven't had access to details about what's deployed in the cloud or the optimization opportunities that might exist. This is where FinOps toolkit Power BI reports come in. As of September, the FinOps toolkit includes new governance and workload optimization reports to offer even more clarity.
The Governance report summarizes your Microsoft Cloud governance posture and offers standard metrics aligned with the Cloud Adoption Framework to facilitate identifying issues, applying recommendations, and resolving compliance gaps. The report includes many views including a summary of subscriptions and resources, policy compliance, virtual machines, managed disks, SQL databases, and network security groups.
The Workload optimization report provides insights into resource utilization and efficiency opportunities based on historical usage patterns. Specifically, you can get a summary of Azure Advisor cost recommendations or review any managed disks that are not currently being used and may no longer be needed. We recommend reviewing unattached disks to determine if they are still needed and deleting any that aren't to avoid unnecessary storage costs.
Both reports are just the beginning of what’s possible. They leverage Azure Resource Graph and will require the person or service principal used to refresh reports to have at least read access to the subscriptions you want to report on. We’ll continue to expand both reports to cover more scenarios and bring additional clarity based on your feedback. We encourage you to build on these reports and let us know what you’d like to see next in upcoming releases.
Reporting on tags in Power BI
One of the most important steps to understanding your cloud costs is knowing who's responsible. While identifying costs based on subscriptions and resource groups provides a simple mechanism for tracking accountability, it often isn't enough to provide a holistic view for leaders across the organization. This is why many organizations use tags to amend cloud cost and usage data with metadata that maps costs back to responsible projects and teams, identifies engineering owners, defines the purpose, identifies the environment, and more. Now, the latest version of the FinOps toolkit reports includes an option to extract specific tags to support you in building your own custom reports.
To update the list of promoted tags, go to Transform data > Storage > CostDetails, select Advanced Editor to view the underlying query, update the list of PromotedTags as desired, and select Done, then Close & Apply. The list of tags will be extracted into “tag_*” columns in the CostDetails table. Once data is refreshed, you can customize existing visuals to include your tags or build out new pages and reports to suit your needs.
Of course, there’s a lot to do when it comes to tagging, metadata, and the larger allocation space. Let us know what you’d like to see next. We’d like to add a more comprehensive allocation engine into FinOps hubs in the future, so understanding your needs will help inform that design. Please join us in FinOps toolkit discussions to share your perspective on this or any other capability.
Ingest all Cost Management datasets in FinOps hubs
In August, we added the ability to point Power BI reports to raw exports without FinOps hubs to support all exportable datasets from Cost Management. In September, we completed the other half of that by adding native support for all Cost Management datasets, data formats, and compression options in FinOps hubs. This provides a simpler, more performant option for ingesting and working with data at scale in storage.
Cost Management supports the following exportable datasets:
Cost and usage
Price sheet
Reservation details
Reservation recommendations
Reservation transactions
Note that price and reservation exports are currently only available for Enterprise Agreement billing accounts and Microsoft Customer Agreement billing profiles.
If you currently have CSV exports or are still using the Cost Management connector for reservation recommendations, we highly recommend updating to parquet exports with snappy compression, when available, and switching to reservation recommendations coming from exports rather than the connector. As a reminder, the Cost Management connector is no longer being maintained, so this gives you a clean path forward.
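As a rough sketch, you can also create the new exports with the FinOps toolkit PowerShell module; the parameter names below are illustrative assumptions, so confirm them against the New-FinOpsCostExport reference before running anything:

# Install the FinOps toolkit module from the PowerShell Gallery
Install-Module -Name FinOpsToolkit -Scope CurrentUser

# Illustrative only: create a FOCUS cost export against a billing scope
# (parameter names are assumptions; check the module docs)
New-FinOpsCostExport -Name "focus-costs" `
    -Scope "/providers/Microsoft.Billing/billingAccounts/<billing-account-id>" `
    -Dataset "FocusCost" `
    -StorageAccountId "<storage-account-resource-id>" `
    -Execute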
Performance improvements in FinOps hubs and Power BI
Given the breadth and depth of data needed to manage and optimize cost, usage, and carbon over time, performance and scalability are two critical aspects of any FinOps practice. This is one of the core design principles for FinOps hubs. And as we continue to lay the foundational elements to enable our vision of FinOps, we continue to look back at ways to optimize what we have so far. In September, we introduced a few changes to streamline performance and improve scalability.
When FinOps hubs was first released, we converted Cost Management CSV exports to parquet to improve data refresh speeds and scale to larger datasets for reporting on raw cost data in Power BI. Now that Cost Management has native support for Gzip CSV and snappy parquet exports, FinOps hubs has been updated to support ingestion of these formats and compression options. If you're using CSV exports today, we highly recommend switching to snappy parquet exports, as this provides improved performance and works better with Power BI incremental refresh than the current parquet conversion in Azure Data Factory. Once you've updated to FinOps hubs 0.6, simply delete the old exports and create new ones with snappy parquet.
Looking beyond initial ingestion, we’ve also been evaluating ways to streamline data loading in Power BI and other tools, like Microsoft Fabric or databases. With the inclusion of additional datasets, we realized it was time to change how data is stored. For details about which versions of reports work with which versions of FinOps hubs, see the compatibility guide. When you identify the right target release, use the upgrade guide to help.
Beyond these changes in FinOps hubs, the CostDetails and Prices queries were also optimized to reduce load time. These changes will impact anyone using FinOps toolkit Power BI reports, whether using them against raw storage or FinOps hubs.
Stay tuned for more performance and scalability improvements. We’re eager to enable large scale data analytics on top of all datasets to unlock new scenarios and capabilities.
Get the latest FinOps workbooks in one convenient package
Every month we look for ways to improve the FinOps workbooks to make it easier for you to optimize and govern your cloud environment. In September, we streamlined the deployment experience to make it easier for you to get the latest workbooks into your environment with a single FinOps workbooks template.
When you deploy the FinOps workbooks template, you’ll see a new option to select which workbooks you want. It’s that simple. As we look to include additional workbooks, you can simply redeploy the template to get the latest and greatest improvements to existing workbooks as well as any new workbooks.
With that in mind, let us know what you'd like to see – whether that's more capabilities in the optimization or governance workbooks, or coverage of a new FinOps capability or Microsoft Cloud service. Whatever you need, we're here to help. We evaluate changes to our workbooks every month, so let us know what you'd like to see next!
What’s new in Azure Optimization Engine
Last month, I talked about how important security is to us at Microsoft. In September, we continued our secure by default push by improving storage account security in the Azure Optimization Engine (AOE) and also improved troubleshooting documentation and deprecated the legacy Log Analytics agent in the process.
AOE runbooks have all been updated to replace key-based authentication against Azure storage with Entra ID authentication. Deployment scripts were also updated to remove plain text Entra ID token responses for added security.
And in an effort to provide additional self-help guidance for troubleshooting common issues, AOE now includes a troubleshooting page with the most common deployment and runtime issues and their respective solutions. We hope this will save you time if you ever run into an issue. And if you find anything missing, let us know where you’re getting stuck and how we can help.
Finally, with the deprecation of the legacy Log Analytics agent in August 2024, we stopped maintaining the legacy agent-related AOE setup assets and now recommend everyone migrate to the Azure Monitor Agent and corresponding toolset. For additional details, refer to Migrate to Azure Monitor Agent from Log Analytics agent.
New mapping for FOCUS 1.1 ServiceSubcategory
As many of you already know, I’m a staunch believer and proponent of the FinOps Open Cost and Usage Specification (FOCUS). FOCUS has so much potential to streamline every corner of FinOps, from early education and enablement to advanced optimization and unit economics. And as both a FOCUS steering committee member and maintainer, I can say that, as proud as we were to ship FOCUS 1.0 in June, that didn’t slow us down. FOCUS members from all corners of the globe continue to dedicate their time to pushing FOCUS forward day after day. And with a goal of shipping 2 updates every year, we’re coming up on the FOCUS 1.1 release. Of specific interest to the FinOps toolkit is one of our open data files that facilitates mapping resources to services, service categories, and now – new as of FOCUS 1.1 – service subcategories.
For those who aren't familiar, ServiceName in FOCUS refers to the service the resource type falls into. This is distinctly different from MeterCategory or even the current ServiceName column in actual and amortized cost datasets, because those revolve around the usage and not the resource. Perhaps my favorite example is this: if you calculate the total cost from all rows where ResourceType is "Microsoft.Compute/virtualMachines" and compare that to the sum of cost where MeterCategory is "Virtual Machines", some of you may be surprised to learn these return different totals. The reason is that each resource emits different types of usage, like bandwidth, which is categorized as a networking charge rather than a VM charge. FOCUS ServiceName improves on this by helping you quantify the total cost of all resources within a specific service. This is what the Services mapping file in the toolkit provides.
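To see the difference yourself, here is a minimal PowerShell sketch over exported cost data; the file names and cost column names are assumptions based on typical FOCUS and actual cost export schemas, so adjust them to your own exports:

# Total cost of VM *resources* in a FOCUS export (column names assumed)
Import-Csv "./focus-costs.csv" |
    Where-Object ResourceType -eq "Microsoft.Compute/virtualMachines" |
    ForEach-Object { [double]$_.EffectiveCost } |
    Measure-Object -Sum

# Total cost of VM *usage meters* in an actual cost export (column names assumed)
Import-Csv "./actual-costs.csv" |
    Where-Object MeterCategory -eq "Virtual Machines" |
    ForEach-Object { [double]$_.CostInBillingCurrency } |
    Measure-Object -Sum

The two sums will generally differ because charges like bandwidth roll up under the VM resource but not under the "Virtual Machines" meter category.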
FOCUS also introduced a provider-agnostic categorization of services, which can be helpful when grouping and aggregating costs across providers. While each provider has their own columns to track the type of service (more accurately, the types of usage), the names of and values in those columns are currently inconsistent given there has never been a centralized standard to align to. With FOCUS, you’re able to group by or filter on ServiceCategory across providers for a single set of consistent groups for simpler reporting and quicker answers.
Coming soon, in FOCUS 1.1, you'll also see a new ServiceSubcategory that breaks ServiceCategory down to the next level. As an example, Compute is broken down into Virtual Machines, Containers, Serverless Compute, and more. Databases are broken down into Relational Databases, NoSQL Databases, Caching, and more. The list goes on. As the FOCUS ServiceSubcategory column was finalized, we updated the Services mapping file in the toolkit to include this additional detail, so you can now apply it to your own datasets, whether you're using an existing FOCUS version, actual or amortized costs, or even if you're interested in categorizing other resource datasets. There are many uses for a provider-agnostic categorization of services, and this dataset will help you achieve your goals, whatever they might be.
And for those interested in leveraging this data from PowerShell, you can also use the Get-FinOpsService command from the FinOps toolkit PowerShell module, which now includes a -ServiceSubcategory filter option.
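For example, a quick sketch (the -ResourceType filter shown here is my assumption; -ServiceSubcategory is the new option called out above):

# Requires the FinOpsToolkit module from the PowerShell Gallery
Install-Module -Name FinOpsToolkit -Scope CurrentUser

# Look up FOCUS service details for a given resource type
Get-FinOpsService -ResourceType "Microsoft.Compute/virtualMachines"

# Filter mappings by the new FOCUS 1.1 service subcategory
Get-FinOpsService -ServiceSubcategory "Virtual Machines"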
If this sounds interesting, please do check out the other open data files available in the toolkit and let us know what you’d like to see next.
Other new and noteworthy updates
Many small improvements and bug fixes go into each release, so covering everything in detail can be a lot to take in. But I do want to call out a few other small things that you may be interested in.
In FinOps hubs:
Renamed the following pipelines to be clearer about their intent:
config_BackfillData to config_StartBackfillProcess.
config_ExportData to config_StartExportProcess.
config_RunBackfill to config_RunBackfillJob.
config_RunExports to config_RunExportJobs.
Changed the storage ingestion path from “{scope}/{yyyyMM}/{dataset}” to “{dataset}/{yyyy}/{MM}/{dataset}”
Improved error handling in the config_RunBackfillJob and config_StartExportProcess pipelines, which were failing in some situations.
Removed the temporary Event Grid resource from the template which was attempting to streamline first-time setup, but inadvertently caused unexpected costs in scenarios where the deployment cleanup script failed.
In Power BI reports:
Documented how to use storage account SAS tokens to set up the reports.
Documented how to preview reports with sample data using Power BI Desktop.
In the Prices query, we renamed ChargePeriodStart/End to x_EffectivePeriodStart/End and updated x_SkuId when not set correctly.
In open data:
48 new resource types were added and 14 were updated.
4 new service mappings were added.
What’s next
Here are a few of the things we’re looking at in the coming months:
FinOps hubs will enable large scale analytics in Azure Data Explorer and add support for private endpoints.
FinOps workbooks will continue to get recurring updates, expand to more FinOps capabilities, and add cost from FinOps hubs.
Azure Optimization Engine will continue to receive small updates as we plan out the next major release of the tool.
Each release, we’ll try to pick at least one of the highest voted issues (based on 👍 votes) to continue to evolve based on your feedback, so keep the feedback coming!
To learn more, check out the FinOps toolkit roadmap, and please let us know if there’s anything you’d like to see in a future release. Whether you’re using native products, automating and extending those products, or using custom solutions, we’re here to help make FinOps easier to adopt and implement.
Not a valid installation folder for Replicator SSMS
Hi,
I'm trying to configure replication in SSMS, but I have to install it first. The problem arises when selecting a folder to proceed with the installation, as it says "it is not a valid installation folder".
I have tried disabling the Windows Firewall, but that didn't work.
Any help?
Thanks.
Using the Members of a Dynamic Microsoft 365 Group to Populate an Adaptive Scope
Adaptive searches are a nice way to target users, sites, and groups for Purview retention processing. But a user adaptive scope can’t select members of a group and target them. That is, unless you use the same attribute to identify users for both a dynamic group and an adaptive scope, which is what’s explained here.
https://office365itpros.com/2024/10/15/adaptive-scope-group-members/
What is the Airtel Payments Bank limit?
You can reach the Meesho customer support team (08102↑611↑817} using any of these channels and register your complaint as soon as possible.
Display Issue opening files via Sharepoint “Open in App”
Hello MS Tech Community,
We frequently work with CAD files/programs on SharePoint. Using SharePoint’s “Open in App” feature, we can open and edit CAD files directly in the desired CAD application.
However, we have encountered the following issue: When we open CAD files via the “Open in App” feature, the user interface in several CAD programs is misaligned. Buttons appear disproportionately large, the drawing window is too small, and the entire display is distorted. However, when we open the same file through File Explorer (e.g., via OneDrive synchronization), the file is displayed correctly without any issues.
Affected programs: Archicad, Cadwork, Rhino
This issue has been occurring since we first started working with CAD files on SharePoint, approximately 3–4 months ago. It still persists, and we tested it again last week with the same result.
My colleagues are experiencing the exact same issue. We’ve tried different devices and users.
The support teams of the affected programs didn't have a solution, so we assume it's a SharePoint/OneDrive issue. We also contacted Microsoft Support, and they told us to open a ticket in this community.
We have some console logs of the file opening process. There are some differences between opening the file from the local file system and via "Open in App" in SharePoint.
Example misalignment (buttons are way too big; the drawing window (black) should fill the entire white display area):
Does anyone have any ideas on how to solve this problem or has experienced something similar? We would be grateful for any help or suggestions.
Thank you in advance and best regards,
What is the Airtel Payments Bank limit?
You can reach the customer support team (08102↑611↑817} and register your complaint as soon as possible.
macOS Onboarding – Profile installation failed
Hi all 🙂
Hope you can help me with my onboarding problem with macOS 🙂
I have a user with an existing macOS device. In the Microsoft Learn docs I found that it is possible to add existing devices without completely resetting them. The device is company owned and registered in Apple School Manager.
– device successfully transferred to Intune
– device added to enrollment profile
– enrollment restrictions: BYOD not allowed
For onboarding, I downloaded and started the Company Portal app and logged in. Then I downloaded the profile and pressed the install button.
Then the following message appears:
How can I fix this? I can't find any solution 😞
What is the Airtel Payments Bank limit?
You can reach the customer support team (08102↑611↑817} and register your complaint as soon as possible.
Need help: 4 XRumer programs won't launch on Windows 11 24H2 build 26100.2152
Developers, please help.
I need help: 4 programs from XRumer won't launch on Windows 11 24H2 build 26100.2152. On version 23H2 the programs worked without problems; on the new build they won't launch and throw errors.
Please help me solve this problem.
Please recommend a good and easy Windows 10 backup software for my PC?
I’m searching for a reliable and easy-to-use backup software for my Windows 10 PC. I need something that can handle full system backups as well as file/folder backups with minimal hassle. Ideally, it should have a simple interface, offer scheduling options, and be able to restore data quickly if needed.
Any recommendations for Windows 10 backup software you’ve found user-friendly and effective? Free options are a bonus, but I’m open to paid solutions as well.
Cheers!
VIVA Learning own content from subsites
Hello
I'd like to know whether I can use subsites of the SharePoint site collection (landing page) for our own content.
We have created a structure based on the BAs & BUs of the company and are using subsites for the BUs in Viva Learning. The question now is: will the content in the subsite libraries be indexed?
Or can I use only folders in that site collection?
STRUCTURE
BA
  BU-A: Library-Finance, Library-HRD
  BU-B: Library-Finance
  BU-C: Library-Finance
The idea is to give each BU the ability to manage its own content, with content for ALL managed globally at the BA level.
The index for own content in a SharePoint site is created within 24 hours by the Microsoft cloud.
I'd like to know whether the subsites in that site collection will be indexed too…
Any help is appreciated and thanks in advance.
Kind regards
Michael
Introducing Azure Product Retirement Livestreams
The Azure Retirements team, in collaboration with key partner groups, is excited to provide a transparent and interactive platform to share essential updates about Azure product retirements.
We want our customers and partners to be well-informed about the latest retirement announcements, the process guiding each announcement, the official communication channels, and what to expect throughout the migration experience.
Register to attend one of our upcoming livestreams on Microsoft Reactor:
Option 1: Thursday, October 24, 12:00 PM – 1:00 PM (UTC-4:00)
Option 2: Wednesday, October 30, 7:30 PM – 8:30 PM (UTC-4:00)
Featuring opening remarks by:
Mark Russinovich, Azure CTO and Deputy CISO
John Sheehan, Azure Quality CVP
As proud sponsors of the Azure Product Retirement program, Mark and John will kick off the events, offering valuable insights into their roles and priorities for the customer experience. Participants will be able to ask questions and share feedback via a moderated chat, helping us prioritize improvements to our processes.
Don’t miss out on these opportunities to engage directly with the Azure team. Register today. We look forward to seeing you there!
Azure Cosmos DB for MongoDB
I am Suniti (LinkedIn), a Microsoft Learn Student Ambassador. In this blog, we will delve into the key features and benefits of Azure Cosmos DB for MongoDB, guiding developers on choosing the right architecture for scaling modern applications.
Let’s say your app is a social media platform for artists, where users can upload and share artwork, write posts, and interact through comments and likes. The app quickly gains popularity, especially after a few high-profile artists start using it and talking about it online. Suddenly, thousands of new users are signing up every hour, and your existing MongoDB infrastructure can’t keep up with the massive increase in traffic. By using Azure Cosmos DB for MongoDB, you can keep using your familiar MongoDB tools while benefiting from a highly scalable, globally available, and cost-efficient solution—allowing you to focus on improving your app, rather than worrying about infrastructure limitations.
Now, you have two architecture options for scaling your MongoDB app using Azure Cosmos DB for MongoDB: vCore Architecture and Request Unit (RU) Architecture.
1. vCore Architecture:
This option provides dedicated instances for your MongoDB app, offering familiar scaling and seamless integration with Azure services for users with existing MongoDB knowledge.
Seamless AI Integration: Let’s say your platform is introducing an AI feature that recommends artwork based on user preferences. With Integrated Vector Database, you can store both transactional data (user uploads, comments) and vector data (AI-generated recommendations) in the same database. This removes the complexity and cost of sending data between services, streamlining AI integration.
Efficient Text Search: Artists often search for specific keywords, like “abstract,” to find similar work. Using Text Indexes, you can enhance these searches across your MongoDB collections, ensuring quick, relevant results without extra complexity.
Vertical Scaling with No Shard Key: If your app grows and hits massive data sizes, you can scale vertically without worrying about defining a shard key, simplifying development and saving time. For example, as users upload larger and more high-resolution artwork, your database can scale effortlessly without requiring additional management.
Cost-Efficiency with Familiar Pricing: You pay based on compute (vCores & RAM) and storage (disks). So, if you’ve already been using MongoDB, you don’t have to adjust to a new pricing model. The flat pricing makes it easier to manage your costs as your platform grows.
Backup and Restore: With 35 days of backups and point-in-time restore (PITR), you’re protected from data loss, even if something goes wrong. For example, if a user accidentally deletes their profile, you can easily restore their data from a backup.
2. Request Unit (RU) Architecture:
This architecture is ideal for cloud-native apps and offers a flexible and dynamic scaling approach.
Instant Scalability: As influencers drive traffic spikes on your platform, the Autoscale feature dynamically adjusts capacity to handle the influx. Unlike other services that may take hours to scale up, Azure Cosmos DB can handle your increased traffic instantly, ensuring users don’t experience delays.
Automatic Sharding: As your platform grows and your database expands, automatic and transparent sharding ensures that scaling happens behind the scenes. This means you won’t have to manually manage or configure shards as your app grows to handle millions of users horizontally.
High Availability and Global Access: With Five 9’s (99.999%) availability and an active-active database, your app will continue to run even if part of the infrastructure fails. For example, if users from different regions are uploading artwork simultaneously, Azure Cosmos DB ensures there’s no single point of failure, and data is always accessible. MongoDB global clusters only support active-passive deployments for writes for the same data.
Real-Time Analytics (HTAP): Your app's dashboard tracks user engagement, and you want to run real-time analytics to see which types of artwork are gaining the most traction. With Azure Synapse Link, you can do this without affecting the performance of your platform, allowing you to generate instant reports on user behaviour.
Serverless Deployments: When traffic is low, such as during off-peak hours, the serverless capacity mode means you only pay for the operations performed. If no users are uploading art at 3 a.m., you’re not paying for unused resources, making it highly cost-efficient.
Which Architecture Should You Choose? 🤔
Choose vCore-based if:
You’re migrating an existing MongoDB workload or building a new MongoDB application.
Your workload involves long-running queries, complex aggregation pipelines, distributed transactions, or joins.
Your application requires 99.995% availability.
You need native support for storing and searching vector embeddings.
Choose RU-based if:
You’re developing new cloud-native MongoDB apps or refactoring existing apps for cloud-native environments.
Your workload focuses on point reads (fetching a single item by its _id and shard key) and has fewer long-running queries or complex aggregation operations.
You need unlimited horizontal scalability, instant scale-up, and precise throughput control.
Your application demands industry-leading 99.999% availability.
Both options ensure your app can handle growing traffic, provide high availability, and optimize costs based on your specific needs.
Why Cosmos DB?
Azure Cosmos DB and MongoDB are both NoSQL databases that are highly available, scalable, and globally distributed. However, they have different strengths and weaknesses. Here, let's discuss why you might move to Azure Cosmos DB for MongoDB:
Highest uptime SLA for MongoDB offered by Cosmos DB
Instant up/down scaling with zero warm up period
Native integration with Azure Synapse Link for analytics
Cosmos DB supports multiple data models within a single database
Scalability in Azure Cosmos DB for MongoDB –
Horizontal Scaling – data and operations are split across a group of machines; the process of splitting the data is called sharding. There are no hard limits: you can distribute data across multiple servers and data centres, and Azure Cosmos DB for MongoDB manages it for you automatically.
Vertical Scaling – achieved by upgrading to larger and larger machines, which will eventually hit limits.
Integration with Azure: Azure Cosmos DB for MongoDB integrates with the Azure ecosystem, allowing developers to use their existing MongoDB tools and applications.
Conclusion
Azure Cosmos DB is a fully managed database solution designed for modern app development, supporting NoSQL, relational, and vector data models. It delivers single-digit millisecond response times, automatic instant scalability, and guarantees performance at any scale. Notably, it powers ChatGPT’s dynamic scalability with high reliability and minimal maintenance.
One key advantage of Azure Cosmos DB for MongoDB is its comprehensive SLA, which covers the entire stack, including the database and underlying infrastructure – unlike third-party MongoDB services such as MongoDB Atlas, whose SLAs cover only the database and exclude the services, hardware, and software provided by the cloud platform.
Resources –
Introduction/Overview – Azure Cosmos DB for MongoDB | Microsoft Learn
Your MongoDB app reimagined | Microsoft Learn
Introducing Azure Cosmos DB for MongoDB vCore: Now Generally Available! – Azure Cosmos DB Blog
Introduction/Overview – Azure Cosmos DB for MongoDB (vCore) | Microsoft Learn
Choose between RU-based and vCore-based models – Azure Cosmos DB for MongoDB | Microsoft Learn
Create Your Azure Free Account Or Pay As You Go | Microsoft Azure
Use Azure Cosmos DB for Free
Analytics with Azure Synapse Link – Azure Cosmos DB | Microsoft Learn
New embedding model version compatibility
Hello,
I noticed that the embedding models on Azure OpenAI are versioned, and I’m currently using version 1. When the model is updated to version 2, will the embeddings calculated with version 1 still be compatible? In my application, I store embeddings, so I’d like to know if I’ll be able to perform vector searches using embeddings from both version 1 and version 2.
Thank you!
Automate Markdown and Image Translations Using Co-op Translator: Phi-3 Cookbook Case Study
Overview
In today’s global environment, it’s important to make technical content accessible to a wider audience. This is especially true for open source projects and technical documentation. In this blog post, I’ll introduce you to Co-op Translator, an open source tool designed to automate multilingual translations for your projects.
I'll show you how I used the translation tool to effortlessly translate the Phi-3 Cookbook, the official open-source guide for the Phi-3 / Phi-3.5 small language models. You'll be able to follow the same process for your own projects, streamlining the translation of even large technical resources.
What is Co-op Translator?
Co-op Translator is an open source tool designed to automate the translation of Markdown files and images containing embedded text into multiple languages. Powered by Azure AI Services, it streamlines the traditionally time-consuming translation process, allowing you to make your projects globally accessible with minimal manual effort.
Below is the architecture of this project:
The process begins with markdown and image files from your project folder, which are then processed by Azure AI Services. Azure OpenAI handles text translations from the markdown files, while Azure Computer Vision extracts text from the images. Once the text is extracted, Azure OpenAI translates the text from the images. The final translated files – both markdown and images – are saved in the designated translation folder, ready for use in multiple languages.
Key features
Translates both Markdown files and text within images.
Supports multiple languages simultaneously.
Leverages Azure Computer Vision and Azure OpenAI for high-quality translations.
Can be easily integrated into your existing workflows.
Table of contents
Phi-3 Cookbook translation: A case study
Set up Azure resources
Create the .env file in the root directory
Install the Co-op Translator package
Translate your project using Co-op Translator
Conclusion: Making your projects global
Phi-3 Cookbook translation: A case study
The Phi-3 Cookbook is an official open-source guide that provides detailed instructions on Phi-3 and Phi-3.5 small language models. Given its technical nature and importance to the global AI community, translating it into multiple languages was a crucial step toward making this valuable resource accessible to non-English-speaking developers and researchers.
By using Co-op Translator, I was able to streamline the translation process, automating the conversion of both Markdown files and images containing embedded text into several languages. This case study explains how the tool was applied to the Phi-3 Cookbook, the challenges encountered, and the solutions implemented.
The translation process
Preparation: The first step was to organize the markdown files and image assets from the Phi-3 Cookbook. I removed the previously manually translated files to ensure a clean slate for the automated process, as leaving them would result in repeated translations. I also created an .env file in the root directory to securely store the necessary Azure API keys and configuration settings.
Azure Setup: I configured Azure OpenAI to handle the markdown text translations and Azure Computer Vision for extracting and translating text from images. This setup allowed the tool to automatically detect and process both types of content seamlessly.
Installing Co-op Translator: I installed the Co-op Translator package using Poetry to manage the dependencies. After installing, I ran the translate command with the appropriate language codes to initiate the translation process (see the illustrative command after this list).
Execution with Co-op Translator: Using Co-op Translator, I initiated the translation process:
The markdown files were processed through Azure OpenAI for translation.
Azure Computer Vision was used to extract text from images, followed by Azure OpenAI for translating the extracted text.
The translated markdown and image files were saved in dedicated language-specific folders.
Review: After the translations were completed, I reviewed the output for accuracy. The automated process produced high-quality translations, significantly reducing the need for manual adjustments.
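For reference, the translate invocation looked roughly like the following; the flag names here are hypothetical, so check the Co-op Translator README for the actual options:

# Hypothetical invocation – replace the flags with the documented ones
translate -l "ko ja"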
Once the translations are done, you’ll find the translations and translated_images folders in the root directory. You can see an example of the folder structure in the Phi-3 Cookbook:
Challenges and Solutions
While the process was mostly smooth, a few challenges arose:
Handling files with many code blocks: Some markdown files contained many code blocks. During translation, the tool splits the content into chunks, and if a split occurred within a code block, the translated output would sometimes break. To resolve this, we implemented a solution where code blocks are temporarily replaced with placeholders and skipped during translation, ensuring the integrity of both the code and the translation (a sketch of this idea follows this list). In future versions, we plan to enhance this feature by translating comments within code blocks. This will involve separating code blocks from the rest of the markdown, accurately translating each part, and then reintegrating them into the final translated file.
Text from images: Extracting and translating text from complex images was challenging, especially when the images had a lot of text in small areas. This often led to the translated text in the image being either overly stretched or compressed, affecting readability. While we managed to mitigate some of these issues, further improvements are needed in this area to ensure higher accuracy.
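Below is a minimal PowerShell sketch of the placeholder idea described above. It is not the tool's actual implementation (Co-op Translator is a Python package); it only illustrates masking fenced code blocks before translation and restoring them afterward:

# Find fenced code blocks and swap each one for a unique placeholder
$md     = Get-Content ./README.md -Raw
$blocks = @([regex]::Matches($md, '(?s)```.*?```') | ForEach-Object { $_.Value })
for ($i = 0; $i -lt $blocks.Count; $i++) {
    $md = $md.Replace($blocks[$i], "@@CODE_BLOCK_$i@@")
}

# ...send $md through the translation service here...

# Restore the original code blocks into the translated text
for ($i = 0; $i -lt $blocks.Count; $i++) {
    $md = $md.Replace("@@CODE_BLOCK_$i@@", $blocks[$i])
}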
Results
The translation of the Phi-3 Cookbook has been successful. You can now view the translated version in multiple languages. If you’re interested in viewing the translated Phi-3 Cookbook, you can visit the Phi-3CookBook. This link will navigate you to the multilingual versions of the Phi-3 Cookbook.
Below is a markdown example from Phi-3 Cookbook, translated into Korean using the Co-op Translator:
Set up Azure resources
To translate markdown files and extract text from images, we need to set up two key Azure services: Azure OpenAI and Azure Computer Vision.
Create an Azure account
If you don’t already have an Azure account, you’ll need to create one.
Navigate to the Azure Sign Up page.
Select Try Azure for free or Pay as you go.
Follow the on-screen instructions to create your account.
Provide your personal details and contact information.
Verification: You’ll need to verify your identity using a credit card or phone number.
Create an Azure Computer Vision resource
Sign in to the Azure Portal.
Type computer vision in the search bar at the top of the portal page and select Computer vision from the options that appear.
Select + Create from the navigation menu.
Perform the following tasks:
Select your Azure Subscription.
Select the Resource group to use (create a new one if needed).
Select the Region you’d like to use.
Enter Name. It must be a unique value.
Select the Pricing tier you’d like to use.
Select Review + Create.
Select Create.
Create an Azure OpenAI resource
Type azure openai in the search bar at the top of the portal page and select Azure OpenAI from the options that appear.
Select + Create from the navigation menu.
Perform the following tasks:
Select your Azure Subscription.
Select the Resource group to use (create a new one if needed).
Select the Region you’d like to use.
Enter Name. It must be a unique value.
Select the Pricing tier you’d like to use.
Select Next to move to the Network page.
Select a network security Type you’d like to use.
Select Next to move to the Tags page.
Select Next to move to the Review + submit page.
Select Create.
Deploy Azure OpenAI models
Navigate to the Azure OpenAI resource that you created.
Select Go to Azure OpenAI Studio from the navigation menu.
Inside Azure OpenAI Studio, select Deployments from the left side tab.
Select + Deploy model from the navigation menu.
Select Deploy base model from the navigation menu to create a new gpt-4o deployment.
Perform the following tasks:
Inside Select a model page, select gpt-4o.
Select Confirm to navigate to the Deploy model gpt-4o page.
Inside Deploy model gpt-4o page, enter Deployment name. It must be a unique value. For example, gpt-4o.
Inside Deploy model gpt-4o page, select the Deployment type you’d like to use.
Select Deploy.
Create an .env file in the root directory
In this tutorial, we will guide you through setting up your environment variables for Azure services using an .env file. Environment variables allow you to securely manage sensitive credentials, such as API keys, without hard-coding them into your codebase.
Create the .env File
In the root directory of your project, create a file named .env. This file will store all your environment variables in a simple format.
Do not commit your .env file to version control systems like Git. Add .env to your .gitignore file to prevent accidental commits.
Navigate to the root directory of your project.
Create an .env file in the root directory of your project.
Open the .env file and paste the following template:
# Azure Credentials
AZURE_SUBSCRIPTION_KEY="your_azure_subscription_key"
AZURE_AI_SERVICE_ENDPOINT="https://your_azure_ai_service_endpoint"

# Azure OpenAI Credentials
AZURE_OPENAI_API_KEY="your_azure_openai_api_key"
AZURE_OPENAI_ENDPOINT="https://your_azure_openai_endpoint"
AZURE_OPENAI_MODEL_NAME="your_model_name"
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="your_deployment_name"
AZURE_OPENAI_API_VERSION="your_api_version"
Gather your Azure credentials
You will need the following Azure credentials on hand to configure the environment:
For Azure AI Service:
Azure Subscription Key: Your Azure subscription key, which allows you to access the Azure AI services.
Azure AI Service Endpoint: The endpoint URL for your specific Azure AI service.
For Azure OpenAI Service:
Azure OpenAI API Key: The API key for accessing Azure OpenAI services.
Azure OpenAI Endpoint: The endpoint URL for your Azure OpenAI service.
Azure OpenAI Model Name: The name of the model you will be interacting with.
Azure OpenAI Deployment Name: The name of your deployment for Azure OpenAI models.
Azure OpenAI API Version: The version of the Azure OpenAI API you are using.
Add Azure environment variables
Perform the following tasks to add the Azure Subscription key and Azure AI Services Endpoint:
Type computer vision in the search bar at the top of the portal page and select Computer vision from the options that appear.
Navigate to the Azure Computer Vision resource that you are using.
Select Keys and Endpoint from the left side tab, then copy and paste your Subscription key and Endpoint into the .env file.
Perform the following tasks to add the Azure OpenAI API Key and Endpoint:
Type azure openai in the search bar at the top of the portal page and select Azure OpenAI from the options that appear.
Navigate to the Azure OpenAI resource that you are using.
Select Keys and Endpoint from the left side tab.
Copy and paste your Azure OpenAI API Key and Endpoint into the .env file.
Perform the following tasks to add the Azure OpenAI Deployment Name and Version:
Navigate to the Azure OpenAI resource that you created.
Select Go to Azure OpenAI Studio from the navigation menu.
Inside Azure OpenAI Studio, select Deployments from the left side tab.
Copy and paste your Azure OpenAI Deployment name and model version into the .env file.
Save the .env file.
Now, you can access these environment variables to use Co-op Translator with your Azure services.
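As a quick sanity check that the file is being picked up, here is a minimal sketch assuming the python-dotenv package (Co-op Translator reads these variables itself; this snippet is only for verification):
import os
from dotenv import load_dotenv  # pip install python-dotenv

# Load variables from the .env file in the current working directory.
load_dotenv()

# Print a non-secret value to confirm the file was found.
print(os.getenv("AZURE_OPENAI_ENDPOINT"))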
Install the Co-op translator package
The Co-op Translator is a command-line interface (CLI) tool designed to help you translate all the markdown files and images in your project into multiple languages. This tutorial will guide you through configuring the translator and running it for various use cases.
Create a virtual environment
You can create a virtual environment using either pip (via the built-in venv module) or Poetry. Type one of the following commands inside your terminal.
Using pip
python -m venv .venv
Using Poetry
poetry init
Activate the virtual environment
After creating the virtual environment, you’ll need to activate it. The steps differ based on your operating system. Type the following command inside your terminal.
For both pip and Poetry
Windows:
.venv\Scripts\activate.bat
Mac/Linux:
source .venv/bin/activate
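The activate.bat script applies to the Windows Command Prompt. If you're using PowerShell instead, the standard venv layout provides an equivalent activation script:
.venv\Scripts\Activate.ps1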
Using Poetry
If you created the environment with Poetry, type the following command inside your terminal to activate it.
poetry shell
Install the package and required dependencies
Once your virtual environment is set up and activated, the next step is to install the necessary dependencies.
Using Poetry (from pyproject.toml)
If you’re using Poetry, type the following command inside your terminal. It will automatically install the required packages specified in the pyproject.toml file:
poetry install
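If you set up your environment with pip instead of Poetry, you can install the CLI from PyPI (assuming the package is published under the name co-op-translator, as in the project's repository):
pip install co-op-translator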
Translate your project using Co-op Translator
The Co-op Translator is a command-line interface (CLI) tool that helps you translate markdown and image files in your project into multiple languages. This section explains how to use the tool, covers the various CLI options, and provides examples for different use cases.
CLI options overview
The Co-op Translator CLI offers several options to customize the translation process:
-l (or --language-codes): Space-separated list of language codes for translation (e.g., "es fr de" for Spanish, French, and German). Use "all" to translate into all supported languages.
-r (or --root-dir): Specifies the root directory of the project (default is the current directory).
-a (or --add): Adds new translations without deleting existing ones (default behavior).
-u (or --update): Updates translations by deleting existing ones and re-creating them. Warning: This will delete all current translations.
-img (or --images): Translates only image files.
-md (or --markdown): Translates only markdown files.
-chk (or --check): Checks translated files for errors and retries translation if needed.
-d (or --debug): Enables debug mode for detailed logging.
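These options can be combined in a single invocation. As an illustrative combination (an assumption based on the individual flags above, not a command from the original examples), the following would translate only the markdown files of a project at ./my_project into Korean with debug logging enabled:
translate -l "ko" -r "./my_project" -md -d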
Example scenarios and commands
Here are a few common use cases for the Co-op Translator, along with the appropriate commands to run.
1. Basic translation (Single language)
To translate your entire project (markdown files and images) into a single language, like Korean, use the following command:
translate -l "ko"
This command will translate all markdown and image files into Korean, adding new translations without deleting any existing ones.
Want to see which language codes are available in Co-op Translator? Visit the Supported Languages section in the repository for more details.
Example on Phi-3 CookBook
In the Phi-3 CookBook, I used the following method to add the Korean translation for the existing markdown files and images.
(.venv) C:\Users\sms79\dev\Phi-3CookBook>translate -l "ko"
Translating images: 100%|████████████████████████████████████████████| 150/150 [45:53<00:00, 15.55s/it]
Translating markdown files: 100%|███████████████████████████████████| 95/95 [1:39:27<00:00, 125.62s/it]
2. Translating multiple languages
To translate your project into multiple languages (e.g., Spanish, French, and German), use this command:
translate -l "es fr de"
This command will translate the project into Spanish, French, and German, adding new translations without overwriting existing ones.
Example on Phi-3 CookBook
In the Phi-3 CookBook, after pulling the latest changes to reflect the most recent commits, I used the following method to translate newly added markdown files and images.
(.venv) C:\Users\sms79\dev\Phi-3CookBook>translate -l "ko ja zh tw es fr" -a
Translating images: 100%|███████████████████████████████████████████████████| 273/273 [1:09:56<00:00, 15.37s/it]
Translating markdown files: 100%|████████████████████████████████████████████████| 6/6 [24:07<00:00, 241.31s/it]
While it’s generally recommended to translate one language at a time, in situations like this where specific changes need to be added, translating multiple languages at once can be efficient.
3. Specifying the root directory
By default, the translator uses the current working directory. If your project is located elsewhere, specify the root directory with the -r option:
translate -l "es fr de" -r "./my_project"
This command translates the files in ./my_project into Spanish, French, and German.
4. Add new translations without deleting existing ones
The default behavior is to add new translations without deleting existing ones. You can explicitly specify this by using the -a option:
translate -l "ko" -a
This command will add new translations in Korean without affecting the existing translations.
Example on Phi-3 CookBook
In the Phi-3 CookBook, to update the README.md translations, I first deleted the existing README.md translations and then used the following method to translate the updated content.
(.venv) C:\Users\sms79\dev\Phi-3CookBook>translate -l "ko ja zh tw es fr" -a
Translating markdown files: 100%|████████████████████████████████████████████████| 6/6 [24:07<00:00, 241.31s/it]
5. Updating translations (Deletes existing translations)
To update existing translations (i.e., delete the current translations and replace them with new ones), use the -u option. This will delete all existing translations for the specified languages and re-translate them.
translate -l "ko" -u
Warning: This command will prompt you for confirmation before proceeding with deleting the existing translations.
Example on Phi-3 CookBook
In the Phi-3 CookBook, I used the following method to update all translated files in Spanish. I recommend using this method when there are significant changes to the original content across multiple markdown documents. If there are only a few translated markdown files to update, it’s more efficient to manually delete those specific files and then use the -a method to add the updated translations.
(.venv) C:\Users\sms79\dev\Phi-3CookBook>translate -l "es" -u
Warning: The update command will delete all existing translations for 'es' and re-translate everything.
Do you want to continue? Type 'yes' to proceed: yes
Proceeding with update…
Translating images: 100%|████████████████████████████████████████████| 150/150 [43:46<00:00, 15.55s/it]
Translating markdown files: 100%|███████████████████████████████████| 95/95 [1:40:27<00:00, 125.62s/it]
6. Translating only images
To translate only the image files in your project, use the -img option:
translate -l "ko" -img
This command will translate only the images into Korean, without affecting any markdown files.
7. Translating only markdown files
To translate only the markdown files in your project, use the -md option:
translate -l "ko" -md
8. Checking for errors in translated files
If you want to check translated files for errors and retry the translation if necessary, use the -chk option:
translate -l "ko" -chk
This command will scan the translated markdown files and retry translation for any files with errors.
Example on Phi-3 CookBook
In the Phi-3 CookBook, I used the following method to check for translation errors in the Korean files and automatically retry translation for any files with detected issues.
(.venv) C:\Users\sms79\dev\Phi-3CookBook>translate -l "ko" -chk
Checking translated files for errors in ko…
Checking files for ko: 100%|██████████████████████████████████████████████████| 95/95 [00:01<00:00, 65.47file/s]
Retrying vsc-extension-quickstart.md for ko: 0%| | 0/17 [00:00<?, ?file/s]
This option checks for translation errors. Currently, if the difference in line breaks between the original and translated files is more than six, the file is flagged as having a translation error. I plan to improve this criterion for greater flexibility in the future.
For example, this method is useful for detecting missing chunks or corrupted translations, and it will automatically retry the translation for those files.
However, if you already know which files are problematic, it’s more efficient to manually delete those files and use the -a option to re-translate them.
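For illustration, here is a rough Python sketch of that line-break heuristic as described above; this is my paraphrase of the criterion, not the tool's actual implementation:
from pathlib import Path

def looks_broken(original: Path, translated: Path, threshold: int = 6) -> bool:
    # Flag the translated file when its number of line breaks differs
    # from the original by more than the allowed threshold (six, per the text).
    orig_breaks = original.read_text(encoding="utf-8").count("\n")
    trans_breaks = translated.read_text(encoding="utf-8").count("\n")
    return abs(orig_breaks - trans_breaks) > threshold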
9. Debug mode
To enable detailed logging for troubleshooting, use the -d option:
translate -l "ko" -d
This command will run the translation in debug mode, providing additional logging information that can help you identify issues during the translation process.
Example on Phi-3 CookBook
In the Phi-3 CookBook, I encountered an issue where translations with many links in markdown files caused formatting errors, such as broken translations and ignored line breaks. To diagnose this problem, I used the -d option to see how the translation process works.
(.venv) C:\Users\sms79\dev\Phi-3CookBook>translate -l "ko" -d
DEBUG:openai._base_client:Request options: {'method': 'post', 'url': '/chat/completions', 'headers': {'api-key': '***REDACTED***'}, 'files': None, 'json_data': {'messages': [{'role': 'user', 'content': "Translate the following text to ko. NEVER ADD ANY EXTRA CONTENT OUTSIDE THE TRANSLATION. TRANSLATE ONLY WHAT IS GIVEN TO YOU.. MAINTAIN MARKDOWN FORMAT\n\n# Phi-3 Cookbook: Hands-On Examples with Microsoft's Phi-3 Models [![Open and use the samples in GitHub Codespaces](https://github.com/codespaces/badge.svg)](https://codespaces.new/microsoft/phi-3cookbook) [![Open in Dev Containers](https://img.shields.io/static/v1?style=for-the-badge&label=Dev%
…
10. Translating all languages
If you want to translate the project into all supported languages, use the all keyword.
Translating all languages at once can take a significant amount of time depending on the project size. For example, translating the Phi-3 CookBook into Spanish took about 2 hours. Given the scale, it's not practical for one person to handle 20 languages. It's recommended to split the work among multiple contributors, each managing one or two languages, and update translations gradually.
translate -l "all"
This command will translate the project into all available languages. If you proceed, the translation may take a significant amount of time depending on the size of the project.
Conclusion: Making your projects global
How can large-scale projects efficiently reach a global audience without getting bogged down in the complexities of translation? The Co-op Translator provides a compelling answer.
Based on my experience, here are a few key takeaways:
Time Management: Translating a large project can take considerable time. For example, translating the Phi-3 Cookbook into Spanish took about 2 hours. It’s important to account for this when working on large-scale translations to avoid being caught off guard by the time required. However, since this translation process only adds translated markdown or images once the translation is complete, you can pause the process at any time. Simply press Ctrl + C (or Cmd + C on macOS) at any time to stop the process, and when you’re ready to resume, use the -a option to continue the translation from where you left off.
Team Collaboration: If you're part of a team, dividing the workload by assigning each member a specific language can help streamline the process. Co-op Translator's ability to easily add and update translations ensures smooth collaboration, allowing everyone to work without overwriting each other's progress. Or, if you're managing an open source project, I recommend encouraging contributors to use this tool to translate content into their respective languages, making the project more accessible to a global audience.
The Phi-3 Cookbook translation stands as a great example of how Co-op Translator can automate the localization process, making your project accessible to a global audience with minimal effort. Whether you’re translating Markdown files or images, Co-op Translator simplifies the task, handling the complexities of translation seamlessly.
Clean Up Azure Resources
Clean up your Azure resources to avoid additional charges to your account. Go to the Azure portal and delete the following resources:
The Azure Computer Vision resource.
The Azure OpenAI resource.
If you’re interested in trying out Co-op Translator or contributing to its development, check out the Co-op Translator’s GitHub repository. I’d love to hear your thoughts and experiences – feel free to contact me with any questions or feedback!
The Co-op Translator started as a proof of concept but has grown into a fully functional open source tool, thanks to community collaboration. Now, you can be part of this journey too!
🌍 Global Impact: Help us make this tool accessible worldwide
🚀 Contribute: Add new features, improve docs, or translate
🤝 Join the Team: Become a collaborator in our next release
Contribute to Co-op Translator
Your support will help bring translations to developers across the globe.
Microsoft Tech Community – Latest Blogs – Read More