Wednesday, June 5, 2019

Containerize legacy ASP.NET applications (Epicor E4SE) - take 3

After I figured out the MSDTC issue, all seemed well until I got hit by the MSMQ problem. Even though I had diverted the queue to a remote machine, E4SE refused to work until it was satisfied that MSMQ was installed in the container. It turns out that MSMQ support in containers is only available from build 1803 onwards, which isn't supported under Windows Server 2016.

Since I haven't got a 2019 server box on hand, I opted to use my Windows 10 machine, pulled the 1803 image and rebuilt everything. The good news is that after all this, the MSMQ issue is gone too.
I will do more testing later, but hopefully there are no more major issues.
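For the record, the image change boils down to rebasing onto an 1803 tag and enabling the MSMQ feature. A rough sketch (the base image tag is an assumption; match it to your framework version):

```dockerfile
# base image must be build 1803 or later for MSMQ support (tag is an assumption)
FROM mcr.microsoft.com/dotnet/framework/aspnet:4.7.2-windowsservercore-1803

# enable the MSMQ feature inside the image
RUN powershell -Command "Add-WindowsFeature MSMQ-Server"
```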

Tuesday, June 4, 2019

Containerize legacy ASP.NET applications (Epicor E4SE) - take 2

In my previous post, I almost thought everything was working fine until I got hit by this infamous MSDTC problem. I came across a few posts online, and although they are quite inspiring and some of them got it working, their situations are not quite the same as mine.

Here is a summary of what people have got working for MSDTC:
> both app and SQL are containerised
> under AWS, using an ELB and port mapping
> under Azure, using CNI

What I want to achieve:

> a containerized ASP.NET web app
> SQL Server running on a VM
> the web app needs Windows authentication under the domain; SQL Server is also running under the domain

In my case, the Windows Server 2016 container host is running under VMware. Initially I wanted to use a transparent network to make things easier, but enabling promiscuous mode in the VMware environment doesn't seem to be an option, so that was out. In the end I settled on running the container on a NAT network and exposing custom ports to the host.

It turns out that the AWS scenario mentioned above is the closest to mine; I ended up using a KEMP load balancer in place of the ELB.

What is really important:

> fix the MSDTC port in the container and expose it to the host
> expose RPC port 135 to the host using a custom port number
> expose port 80 to the host via a custom port number

The KEMP load balancer then maps the ordinary port numbers to the custom port numbers on the container host. We also need to create a DNS hostname pointing to the KEMP load balancer IP.
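To make this concrete, here is a sketch of the container side. The MSDTC port (5000), the custom host ports and the image name are all hypothetical; the registry values are the standard way to restrict the RPC dynamic range that MSDTC uses:

```
REM at image build time: pin the RPC dynamic range MSDTC uses to a single port (5000 here)
reg add "HKLM\SOFTWARE\Microsoft\Rpc\Internet" /v Ports /t REG_MULTI_SZ /d "5000" /f
reg add "HKLM\SOFTWARE\Microsoft\Rpc\Internet" /v PortsInternetAvailable /t REG_SZ /d "Y" /f
reg add "HKLM\SOFTWARE\Microsoft\Rpc\Internet" /v UseInternetPorts /t REG_SZ /d "Y" /f

REM at run time: NAT network, exposing RPC (135), MSDTC (5000) and HTTP (80) on custom host ports
docker run -d --name e4se-web -p 8135:135 -p 8500:5000 -p 8080:80 e4se-web:latest
```

The KEMP virtual services then forward the well-known ports to these custom host ports.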

The above scenario only involves one container instance. If we need to run multiple replicas, we will most likely need to multiply the setup, but I assume it is all straightforward; no drama there.

And finally MSDTC is working fine; hopefully the PoC is a success.


Sunday, June 2, 2019

Machine learning

These days machine learning is an inescapable part of the landscape if you are working in the data field, so I think it is necessary to learn something new and refresh what was taught at school over a decade ago... I took the Coursera course "Machine Learning" by Andrew Ng and finished it today. I would recommend it to anyone who is interested in getting to know a bit of machine learning. The course falls a bit short on the theory and mathematics involved, though, but that is understandable for an intro course.

Wednesday, May 29, 2019

Containerize legacy ASP.NET applications (Epicor E4SE) - take one

A while back I was looking at containerising Epicor E4SE, an ASP.NET application running on the .NET 2/4 Framework. Windows containers weren't mature enough and I had problems getting some basic stuff done at that time, so I gave up.
With the initiative to move things to the cloud, I have once again taken up this challenge and am trying to sort it out this time.
So far I have got the following in place:

1. Only containerize the web front end. The BPM workflow service will run on a VM instead; that VM will also serve as the batch processing server.
2. Within the containerized web app's web.config, set up the queue to be the remote queue (MSMQ) hosted on the VM.
3. The site build needs to be scripted in PowerShell, since interactive setup is not possible inside a container.
4. There was an issue installing the Crystal Reports runtime in the container, but Google gave me the answer.
5. SQLXML is required for installing ICE 1.5.
6. ICE won't install with IIS 10 ("This Setup Requires Internet Information Server 5.1 or higher"); the trick is to manually update the registry value and revert it after the installation.
7. Use the PIDKEY parameter with the ICE MSI to install in silent mode.
8. I had problems with the ICE EP1/EP2 installation in silent mode, since the MSI has a custom action that requires a user response. I tried to crack it using installation arguments but eventually gave up; instead I manually installed the new version of the assembly and used an assembly binding redirect to resolve the error caused by multiple versions of the same assembly.
9. Use a gMSA (group managed service account) with the container to support Windows authentication and access domain-protected resources from within the container.
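For point 9, the gMSA is attached to the container via a credential spec file at run time. A sketch, with hypothetical file and image names (the spec itself would be generated beforehand, e.g. with the CredentialSpec PowerShell module):

```
REM the JSON credential spec lives under the Docker data root's CredentialSpecs folder
docker run -d --security-opt "credentialspec=file://e4se_gmsa.json" -p 8080:80 e4se-web:latest
```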



I don't think containerizing the Crystal Reports server is such a good idea, so it will be left running in a VM too. In the end, the web front end can all be containerized; BPM and the report server will still need their own VMs, but I reckon this is good enough for now.


Update: 


There is still a major hurdle before this whole thing can work; this time it is MSDTC that plays up. Since I am using a VM for SQL, making MSDTC work requires tweaking quite a few things. I hope this is not a show stopper.

4/6: figured out that since I am running the container host in a VMware environment, I would need to turn on promiscuous mode on the vSwitch. I doubt that will happen, so I might have to consider an Azure VM, where CNI comes into play.

Thursday, May 16, 2019

Microsoft Gains DevOps Momentum.

DevOps.com: Microsoft Gains DevOps Momentum. https://devops.com/microsoft-gains-devops-momentum/

The Register: Tangled in .NET: Will 5.0 really unify Microsoft's development stack?

The Register: Tangled in .NET: Will 5.0 really unify Microsoft's development stack?. https://www.theregister.co.uk/2019/05/16/will_net_5_really_unify_microsoft_development_stack/

Monday, May 6, 2019

Windows 10 will soon ship with a full, open source, GPLed Linux kernel

Ars Technica: Windows 10 will soon ship with a full, open source, GPLed Linux kernel. https://arstechnica.com/gadgets/2019/05/windows-10-will-soon-ship-with-a-full-open-source-gpled-linux-kernel/

Tuesday, April 23, 2019

OpenWrt strongSwan config for Android and iOS using the native VPN clients


/etc/ipsec.conf

conn ios
    keyexchange=ikev1
    authby=xauthrsasig
    xauth=server
    left=%any
    leftsubnet=0.0.0.0/0
    leftfirewall=yes
    leftcert=serverCert.pem
    right=%any
    rightsubnet=192.168.1.0/24
    rightsourceip=%dhcp
    rightcert=clientCert.pem
    forceencaps=yes
    auto=add

conn android
    keyexchange=ikev2
    left=%any
    leftauth=pubkey
    leftcert=serverCert.pem
    leftid=yourdomain.dyndns.org
    leftsubnet=0.0.0.0/0,::/0
    right=%any
    rightsubnet=192.168.1.0/24
    rightsourceip=%dhcp
    rightauth=pubkey
    rightcert=androidCert.pem
    auto=add
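After editing the file, reload strongSwan on the router and check that both conns are loaded (standard ipsec commands shipped with the strongswan package):

```
ipsec reload     # re-read the connection definitions
ipsec statusall  # confirm the 'ios' and 'android' conns are listed
```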

Tuesday, April 9, 2019

Extending the Angular and React single page applications with Graph API to a custom web API

The original articles are published here:

https://docs.microsoft.com/en-us/graph/tutorials/angular
https://docs.microsoft.com/en-us/graph/tutorials/react

To make it work with my own web API, I had to make considerable adjustments.

In the React version, there is a need to give the tenant-specific authority URL:

class App extends Component {
  constructor(props) {
    super(props);

    // pass the tenant-specific authority URL as the second argument
    this.userAgentApplication = new UserAgentApplication(config.appId,
      "https://login.microsoftonline.com/{tenant}", null);
  }
}

This is how we call the web API:

async componentDidMount() {
  try {
    // Get the user's access token
    var accessToken = await window.msal.acquireTokenSilent(config.scopes2);
    var timesheets = await api.get(accessToken);
    // Update the array of timesheets in state
    this.setState({timesheets: timesheets});
  } catch (err) {
    this.setState({error: err});
  }
}


The Angular version:

export class TimesheetService {

  accessToken: string;

  constructor(private authService: AuthService,
    private alertsService: AlertsService,
    private http: HttpClient
  ) {
    // the token acquisition below was moved to the login method within
    // authService, which saves the token in the session cache:
    // this.authService.getAccessToken(OAuthSettings.scopes2).then(data => {
    //   this.accessToken = data;
    // });
    this.accessToken = sessionStorage.getItem('access_token');
  }

  getTimeSheets(): Observable<TimeSheet[]> {
    var header = {headers: new HttpHeaders()
      .set('Authorization', 'Bearer ' + this.accessToken)};
    return this.http.get<Array<TimeSheet>>('https://localhost:44301/api/TimeSheets?WeekEnding=2018-10-01', header);
  }
}


export class TimesheetComponent implements OnInit {

  private timesheets: TimeSheet[];

  constructor(private timesheetService: TimesheetService) { }

  ngOnInit() {
    this.getTimeSheets();
  }

  getTimeSheets() {
    this.timesheetService.getTimeSheets().subscribe(data => {
      this.timesheets = data;
    });
  }
}


Most importantly, you need to specify the scope for the web API when acquiring the token; a sample config is below:

export const OAuthSettings = {
  appId: 'xxxxx-xxxx-4edxxxx9-xxxx-xxxxxx',
  scopes: [
    "user.read",
    "calendars.read"
  ],
  scopes2: [
    "https://{tenant}.onmicrosoft.com/webapi/access_as_user"
  ]
};

Finally, you also need to sort out the CORS issue.

Tuesday, March 26, 2019

"Something is wrong with the numpy installation. While importing we detected an older version of numpy" error on azure databricks


I was doing the lab below and experienced the issue today.

https://cloudworkshop.blob.core.windows.net/cognitive-deep-learning/Hands-on%20lab/HOL%20step-by%20step%20-%20Cognitive%20services%20and%20deep%20learning.html

The problem seems to be related to azureml-sdk[databricks], because if I uninstall it some parts of the code can still run.
After spending quite some time scratching my head, I looked up the azureml-sdk release page and found that a new version, 1.0.21, had been released just that day (26 March 2019). I uninstalled the default version and installed the prior version using the format azureml-sdk[databricks]==1.0.18.1.

Alternatively, if you choose runtime version 5.2 when creating the cluster, the problem doesn't occur.
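On Databricks, the pin can also be done straight from a notebook cell (this only runs inside a Databricks notebook, where dbutils is available):

```
# install the prior release instead of the latest (1.0.21), then restart Python
dbutils.library.installPyPI("azureml-sdk", version="1.0.18.1", extras="databricks")
dbutils.library.restartPython()
```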

Monday, February 18, 2019

Dax: currency conversion with date range

If you have a daily exchange rate, the post at https://www.kasperonbi.com/currency-conversion-in-dax-for-power-bi-and-ssas/ shows a nice solution. However, if you are like me, where exchange rates are only updated periodically and hence come with a start and end date, you need a different solution.

With a small change to the above, I came up with an alternative solution as below:




Sales (in reporting currency) :=
IF(HASONEVALUE(ReportCurrency[ReportCurrency]),
    SUMX(FactSales,
        FactSales[Sales] * CALCULATE(MIN(FactExchangeRate[Factor]),
            FILTER(FactExchangeRate,
                AND(FactExchangeRate[FromDate] <= FactSales[Date],
                    FactExchangeRate[ToDate] > FactSales[Date])
            )
        )
    )
)

YTD Sales (year to date) :=
SUMX(
    CALCULATETABLE(FactSales, ALL(FactSales[Date]), DATESYTD(DATE[Date])),
    FactSales[Sales] * CALCULATE(MIN(FactExchangeRate[Factor]),
        FILTER(FactExchangeRate,
            AND(FactExchangeRate[FromDate] <= FactSales[Date],
                FactExchangeRate[ToDate] > FactSales[Date])
        )
    )
)


LTD Sales (life to date) :=
SUMX(
    CALCULATETABLE(FactSales, ALL(FactSales[Date]),
        FILTER(ALL(DATE), DATE[Date] <= MAX(DATE[Date]))
    ),
    FactSales[Sales] * CALCULATE(MIN(FactExchangeRate[Factor]),
        FILTER(FactExchangeRate,
            AND(FactExchangeRate[FromDate] <= FactSales[Date],
                FactExchangeRate[ToDate] > FactSales[Date])
        )
    )
)

The above formulas are quite resource intensive, though, so if you are working on a very large dataset it is probably better to pre-calculate them.
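To make the lookup logic explicit, here is a plain-Python sketch of the same per-row calculation (the rates and sales are made-up sample data):

```python
from datetime import date

# hypothetical rate table: (from_date, to_date, factor); intervals are [from, to)
rates = [
    (date(2019, 1, 1), date(2019, 2, 1), 1.40),
    (date(2019, 2, 1), date(2019, 3, 1), 1.45),
]
# hypothetical sales rows: (date, amount in local currency)
sales = [(date(2019, 1, 15), 100.0), (date(2019, 2, 10), 200.0)]

def factor_for(d):
    # mirrors the DAX filter: FromDate <= Date AND ToDate > Date;
    # MIN in the DAX just collapses the (single) matching row to a scalar
    return min(f for frm, to, f in rates if frm <= d and to > d)

# mirrors SUMX: sum of Sales * matching Factor over every sales row
reporting_sales = sum(amount * factor_for(d) for d, amount in sales)
print(round(reporting_sales, 2))  # 100*1.40 + 200*1.45 -> 430.0
```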