My experiences and thoughts on [mostly] technology.

Category: Jhott Engineering Archive

This is a copy of my old blog site (jhottengineering.blogspot.com). A lot of these posts are very outdated and even slightly embarrassing, but I like to keep them as a reminder of my thoughts over time. (And maybe to keep me humble ;)

Linking to a JavaScript File from an ASP.NET Master Page (Web Forms)

So we have these big JavaScript functions in one of our master pages. I wanted to move them to a separate file, but it proved more difficult than I thought!

I could not simply add a “<link>” or “<script>” tag to the page head and use an application-relative path such as “~/Scripts/MyScript.js”.

I ended up creating the control dynamically, but it was interesting how I had to arrive at the right path. Here it is:

// Retrieve Header
HtmlHead header = (this.Page.Header as HtmlHead);

// If we have access to the page header:
if (header != null)
{
    // Add script block for Master functions:
    HtmlGenericControl scrptBlck = new HtmlGenericControl("script");
    scrptBlck.Attributes.Add("src",
        VirtualPathUtility.MakeRelative(
                Request.AppRelativeCurrentExecutionFilePath,
                "~/Scripts/MasterFunctions.js"));
    scrptBlck.Attributes.Add("type", "text/javascript");
    header.Controls.Add(scrptBlck);
}
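
For context, here is a minimal sketch of where that snippet might live in the master page's code-behind so the script reference is added for every content page (the class name is illustrative):

using System;
using System.Web;
using System.Web.UI.HtmlControls;

public partial class SiteMaster : System.Web.UI.MasterPage
{
    // Page_Load runs for the master page on every content page that uses it,
    // so the script reference is injected into <head> site-wide.
    protected void Page_Load(object sender, EventArgs e)
    {
        HtmlHead header = this.Page.Header as HtmlHead;
        if (header == null)
        {
            return; // No <head runat="server">; nothing to attach to.
        }

        HtmlGenericControl scriptBlock = new HtmlGenericControl("script");
        // MakeRelative converts the app-relative path into a path relative to
        // the currently executing page, so it resolves correctly from any folder.
        scriptBlock.Attributes.Add("src",
            VirtualPathUtility.MakeRelative(
                Request.AppRelativeCurrentExecutionFilePath,
                "~/Scripts/MasterFunctions.js"));
        scriptBlock.Attributes.Add("type", "text/javascript");
        header.Controls.Add(scriptBlock);
    }
}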

IE Post and Redirect Errors…

UPDATE NOTE: I did find a much simpler solution to this; as mentioned in the comments at the bottom, I found that sending a compact P3P privacy header addressed the redirect issue just fine and was much simpler to implement.
The HTTP module is now modified to send the P3P header on redirect. You can do so as follows:

_context.Response.AddHeader("p3p", "[P3P Policy Here]");

You can find more information about P3P here.
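
The module code itself isn't included here, but a minimal sketch of the idea might look like this (the module name is illustrative, and the policy value is a placeholder for your site's actual compact policy):

using System;
using System.Web;

public class P3PRedirectModule : IHttpModule
{
    public void Init(HttpApplication application)
    {
        // Attach just before the headers go out so the final status code can be inspected.
        application.PreSendRequestHeaders += new EventHandler(OnPreSendRequestHeaders);
    }

    private void OnPreSendRequestHeaders(object sender, EventArgs e)
    {
        HttpResponse response = ((HttpApplication)sender).Context.Response;

        // Only add the P3P header when the response is a redirect (3xx).
        if (response.StatusCode >= 300 && response.StatusCode < 400)
        {
            response.AddHeader("p3p", "[P3P Policy Here]");
        }
    }

    public void Dispose()
    {
    }
}

The module would then be registered under httpModules in web.config like any other HTTP module.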

Many developers use redirection as a tool to present the user with another page once a process has been completed. One of the most common uses is a login: a user logs in, and if he is authenticated he is redirected to another page; if not, the same page is displayed with an error message informing the user why the authentication failed (such as an incorrect username/password combination).

I don’t know if I would consider this the best way to carry out these types of transactions, since in my understanding that isn’t exactly what redirects were meant for. However, it is a simple way to address these types of situations and, on strictly pragmatic terms, can often be the best approach.

I was working on a page that does exactly this when I ran into an error.

The Problem.

Apparently, if you do a POST to a page and that page sends back headers to both set a cookie AND redirect in the same response, the cookie does not get set. This seems to be a bug in IE, and the fact that the page was originally called from an iframe seemed to be key to the problem.

How did I run into this? I was working with another system that seamlessly integrated into the web application, and it happened to open the website in an iframe. The process was mostly simple:

    1. The client uses Web Application A, which makes a form post to Web Application B within an iframe. (Web Application A uses iframes.)
    2. Web Application B processes the post for authentication.
    3. If the authentication succeeds, Web Application B redirects the client to a “services” page, also on Web Application B.
    4. The client is now looking at a “services” page on Web Application B, which is within an iframe of Web Application A.

The problem was between steps 3 and 4, in that the 4th step would never happen. The client was certainly redirected to the “services” page, but the “services” page would redirect the client right back to a login page because the client was not authenticated. In step 3 the server would attempt to send back a cookie to identify the session and redirect in the same response, but the cookie would never be set and so never sent back.

It was an “intermittent” problem, and I still haven’t gotten it to replicate more than twice to figure out the exact conditions under which it does work; the rest of the time it failed.

However, a suspicion tells me it was not an intermittent error, but rather that it always happens. (After all, there are no intermittent errors; it’s all a matter of finding the exact conditions under which these anomalies occur.)
I noticed that when the browser did successfully set the cookie and return it, it was most likely because I had previously signed on to the site, effectively forcing the browser to send back my old authentication cookie and “seemingly” making it work.

Anyway, I found a blog post by Brian Donahue where it seems like he ran into the same bug, or at least a related one:
http://www.persistall.com/archive/2008/01/25/cookies–redirects–nightmares.aspx

It was the closest thing I could find to the problem I was having. After attempting the solutions provided with no success, it seems to me like there is a little more to it, and in my case the only obvious difference is that an iframe is involved. Something tells me that can’t be the whole story, though; for one, it doesn’t quite make sense, but I’m not completely ruling it out either, since there may be something about the way cookies are handled within an iframe that I don’t know about. Basically, I’m still wondering why the solutions Brian Donahue wrote about did nothing for me. I would be more than appreciative to anyone who could educate me on why this happens; for now I will continue to try to find the exact cause of this bug!

The Consequences.

Anyway, this cookie not being set was a rather large problem, because it meant the user would not be authenticated. This would break the communication between both systems, so I had to think of some way to fix it.

I started looking into the session object in ASP.NET and how it is handled, as well as the HTTP pipeline in general.

I thought about enabling cookieless sessions, and it might have worked out if I could use them on a per-page basis, but that was quite a port-over and not as secure. I hated the idea of a session ID being displayed in the address bar (especially on every page), both for security and, really, aesthetic reasons. All in all, I finally decided against enabling cookieless sessions.

Addressing The Problem.

How was this problem solved?

I wrote an HTTP module. The module essentially “interrupts” the response and checks whether the response is a redirect that also sets a cookie; if it is, it changes the redirect URL by appending a session ID (like the cookieless session feature) and a key that quickly expires.

The receiving page “interrupts” the request, checks for this session ID, generates a cookie with the ID value from the query string, and feeds this artificial cookie into the HTTP pipeline. The session is then loaded normally by ASP.NET.
There is also some logic to check the key that was passed in for validity, as well as to check whether it has expired; if so, the session is destroyed, effectively logging the user out before any of the code in the page object executes.
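
The actual module isn't reproduced here, but a rough sketch of the two halves described above might look something like this. The query string parameter names and the ExpiringKey helper (for issuing and validating the short-lived key) are illustrative, and the default ASP.NET_SessionId cookie name is assumed:

using System;
using System.Web;

public class RedirectCookieModule : IHttpModule
{
    private const string SessionCookieName = "ASP.NET_SessionId";
    private const string SessionParam = "sid";    // illustrative query string names
    private const string KeyParam = "sidkey";
    // ExpiringKey (not shown) is a hypothetical helper that issues and
    // validates the short-lived key mentioned above.

    public void Init(HttpApplication application)
    {
        application.BeginRequest += new EventHandler(OnBeginRequest);
        application.EndRequest += new EventHandler(OnEndRequest);
    }

    // Outgoing side: if the response redirects AND sets the session cookie,
    // carry the session ID and an expiring key on the redirect URL instead.
    private void OnEndRequest(object sender, EventArgs e)
    {
        HttpResponse response = ((HttpApplication)sender).Context.Response;

        bool isRedirect = response.StatusCode >= 300 && response.StatusCode < 400;
        bool setsSessionCookie =
            Array.IndexOf(response.Cookies.AllKeys, SessionCookieName) >= 0;

        if (isRedirect && setsSessionCookie && response.RedirectLocation != null)
        {
            string sessionId = response.Cookies[SessionCookieName].Value;
            string separator = response.RedirectLocation.IndexOf('?') >= 0 ? "&" : "?";

            response.RedirectLocation += separator +
                SessionParam + "=" + HttpUtility.UrlEncode(sessionId) + "&" +
                KeyParam + "=" + HttpUtility.UrlEncode(ExpiringKey.Issue(sessionId));
        }
    }

    // Incoming side: if the session ID arrived on the query string and the
    // browser sent no session cookie, feed an artificial cookie into the
    // request so ASP.NET loads the session as usual.
    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpRequest request = ((HttpApplication)sender).Context.Request;

        string sessionId = request.QueryString[SessionParam];
        string key = request.QueryString[KeyParam];

        if (!String.IsNullOrEmpty(sessionId) && request.Cookies[SessionCookieName] == null)
        {
            if (ExpiringKey.IsValid(sessionId, key))
            {
                request.Cookies.Add(new HttpCookie(SessionCookieName, sessionId));
            }
            // An invalid or expired key would instead abandon the session here.
        }
    }

    public void Dispose()
    {
    }
}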

I felt this solution was great for a few reasons:

    • The URL will virtually never display a session ID.
      For 98% of the application’s usage, the URL will never contain a session ID. When it does, it will be more secure on account of the expiring key, and it will only show on those specific URLs that redirect and attempt to set a cookie at the same time.
      Currently the only page that redirects this way is linked from an iframe, which means the user will never see the page address; now I can lean towards 99.99%.
    • More secure than enabling cookieless sessions.
      The security is much tighter than cookieless sessions because an expiring key must also be authenticated when the session data is passed on.
    • It only affects redirects that set cookies.
      This was great in that I didn’t want to add unnecessary complexity and overhead to the website.
    • It is completely transparent to other developers.

The last point was one of the greatest benefits. It means any other developer can come after me and write pages like he normally does, without ever even being aware of the IE redirect bug. Because the code works at a lower level in the HTTP pipeline, this is all done transparently.

This drives down the cost of ownership of the application, because future developers will not have to be versed in the changes, develop specifically to accommodate them, or even run into the error in the first place and attempt to address it. This drives development time down, which translates to more time for other projects and no money wasted.

XML, Web Services, SOAP, and Transmission Protocols

In the pursuit of integration, I’ve found myself needing to integrate with clients that have different integration needs and capabilities. In doing so I found some interesting things about .NET as well as web services in general.

Communication Protocols

Although most web services are consumed over HTTP as the transmission protocol, not all of them communicate through SOAP. While SOAP is generally the standard, some communicate over plain HTTP POST or GET.

I recently found that .NET also has the ability to use HTTP POST and GET as communication protocols, which is to be expected. However, as of .NET 1.1, SOAP is the primary communication protocol, and while HTTP POST and GET are supported, they are disabled by default. It turns out that it is very simple to enable them, requiring only the following change (under system.web) in your application’s web.config file:

<webServices>
  <protocols>
    <add name="HttpPost"/>
    <add name="HttpGet"/>
  </protocols>
</webServices>

This makes integration with many legacy systems much more feasible, especially with systems that are not on the .NET platform.

A page built to make a request to a web service using HTTP as both the transmission and communication protocol might look like this:

<form action="mywebservice.asmx/WebServiceMethod" method="post">
  <input name="parameterName" value="param1" />
  <input name="parameterName2" value="param2" />
  
  <input type="submit" value="Submit!" />
</form>

If you are developing your web service in .NET, all you have to do is develop as normal and modify your web.config as described. You can also consume a web service using GET and query string parameters; a quick example follows.
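
Here is a small sketch of consuming the same method over GET (the host name is illustrative; the method and parameter names are taken from the form example above):

using System.IO;
using System.Net;

public class WebServiceGetExample
{
    public static string CallViaGet()
    {
        // Parameters are passed on the query string instead of in a form post.
        string url = "http://localhost/mywebservice.asmx/WebServiceMethod"
            + "?parameterName=param1&parameterName2=param2";

        WebRequest request = WebRequest.Create(url);
        using (WebResponse response = request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            // The body comes back as plain XML describing the method's return value.
            return reader.ReadToEnd();
        }
    }
}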

One of the scenarios I’ve found myself in is a client wanting to consume the web service using HTTP as both the transmission and communication protocol, but passing XML as a “parameter”, effectively using XML as a communication protocol as well. I found this is relatively simple and easy to work with as long as the XML is URL-encoded as one of the form’s fields and is decoded once loaded. (In .NET you would simply use HttpUtility.UrlDecode().)

You can then load this string into an XmlDocument and manage it easily using the object structure in .NET. When returning a response, if the client is able to accept XML, you can simply return the XmlDocument object from your web method. (The XmlDocument class lives in the System.Xml namespace; do not return an XML-formatted string if you already have an XmlDocument.)
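
A rough sketch of what such a web method could look like (the service class, method, and response element names are illustrative):

using System.Web;
using System.Web.Services;
using System.Xml;

[WebService(Namespace = "http://tempuri.org/")]
public class IntegrationService : WebService
{
    [WebMethod]
    public XmlDocument ProcessRequest(string xmlRequest)
    {
        // The client sends the XML URL-encoded inside the form field (as
        // described above), so decode it before parsing.
        string decoded = HttpUtility.UrlDecode(xmlRequest);

        XmlDocument request = new XmlDocument();
        request.LoadXml(decoded);

        // ... work with the request document here ...

        // Return the XmlDocument itself rather than re-serializing it to a
        // string; the framework writes it out as XML in the response.
        XmlDocument response = new XmlDocument();
        response.LoadXml("<Response><Status>OK</Status></Response>");
        return response;
    }
}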

XML and SQL Server 2005

In the midst of all this, I found myself storing XML. I realized there was an XML data type in SQL Server, and so I did a bit of research on it. I decided to use it and I was pretty amazed.

By storing data as XML in SQL Server 2005 using the XML data type, you can actually use XQuery to write queries that take advantage of the XML data structure. It’s almost like running your main query and a subquery to filter the XML data. It makes it very practical and simple to retrieve and store XML data.

A SQL Query using XQuery might look like:

SELECT OrigXmlRequest.query('/Notes//child::*')
FROM YardiRequest

This would return all of the child nodes of ‘Notes’ along with their descendants.
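
For completeness, here is a small sketch of running that same query from .NET and reading the xml-typed results back (the connection string and class name are illustrative):

using System;
using System.Data.SqlClient;
using System.Data.SqlTypes;

public class XmlQueryExample
{
    public static void PrintNoteFragments(string connectionString)
    {
        string sql = "SELECT OrigXmlRequest.query('/Notes//child::*') FROM YardiRequest";

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(sql, connection))
        {
            connection.Open();

            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Each row holds the XML fragment the XQuery produced
                    // for that row's OrigXmlRequest column.
                    SqlXml fragment = reader.GetSqlXml(0);
                    Console.WriteLine(fragment.Value);
                }
            }
        }
    }
}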

XML data in SQL Server 2005 is stored in the same way a varchar(MAX) is, and has a maximum size of 2 GB. (If you’re hitting anywhere close to that 2 GB limit when storing an XML file, you should probably store the data normalized in tables.)

One thing I ran into that I would like to point out when storing the XML data from .NET into SQL was the encoding declared in the document. I kept receiving an encoding exception until I realized the declared encoding needed to be changed from UTF-8 to UTF-16 in order for the data to be stored in the database. If you have an XmlDocument object, an easy way to do this is to just edit the first child (the XML declaration), and then you will be able to store the data. e.g.:

xmlDcRequest.LoadXml(myXmlString);
// Change the declared encoding to UTF-16 for storage in SQL Server:
xmlDcRequest.FirstChild.InnerText = xmlDcRequest.FirstChild.InnerText.ToLower().Replace("utf-8", "UTF-16");

// Adding the XML to a stored procedure parameter to store the XML data:
sqlCmd.Parameters.Add("@XMLRequest", SqlDbType.Xml).Value = xmlDcRequest.InnerXml;

Final Notes on Web Service Development

I think one of the biggest lessons I am learning from putting together web services for other clients is how crucial it is to write code that is as bulletproof as possible. Of course, you should always be doing this anyway. Make sure your web service returns clear error codes and descriptions when errors do occur, to make development straightforward for the client. Clear communication and a detailed scope of work are essential. Even though these are simple things, it seems they are often missed.

SQL Server Reporting Services and Headers Requiring Dataset Fields

Any search for “fields in headers SSRS” in Google will come up with many results addressing workarounds for one of Microsoft SQL Server Reporting Services’ biggest weaknesses: the inability to add fields from a dataset to the header or footer.

This limitation makes some sense in that a dataset returns rows of fields, so the problem might lie in identifying which row should be displayed; however, one should be able to define exactly what to display: the first or last row, or maybe an aggregate of the rows.

All that to say, this is yet another workaround I haven’t seen elsewhere. It addresses the problem of maintaining data in the header regardless of what page the report is being viewed on.

Why another workaround? I tend to avoid storing fields in internal parameters because I’ve had problems with it before, and shared variables are a horrible solution in a production environment because of the possibility of concurrently running reports. Yet it seems like these are always the solutions given for this issue.

So here goes: the way I’ve solved it is by making a group within the table displaying my dataset, and then using the footer of the group to hold a list box. (If you already need the group footer for something else, simply extend it to add another footer row right below it.) In that list box you can place another table or fields, and you can even have this list box pull from another data source. When done, set the group footer’s “Hidden” property to “True” so that it doesn’t show on the rendered report, and you’re set. You can now reference these fields from the header or footer like so:

=First(ReportItems!NameOfTextboxInListItem.Value, "datasourceORgroupName")

And since it’s an expression, you can do whatever you want. You can also use aggregates and such; just make sure you modify the expression on the field/table pulling directly from the dataset, not the field in the header/footer that will actually be displaying the data.

The only major problem I’ve run into with this approach (and one I am still working on) is that the fields in the header export as blank when I export the report to PDF. It does fine in HTML, or even Excel. (Which is funny, since most rendering errors seem to be tied to Excel rather than the other way around.)

If any solution is found to this issue, or if someone can explain to me the PDF rendering process and why these fields show up as blank, it would be greatly appreciated!

Hopefully this post was of some use to those that do not need to export in PDF…

Time Logging and Cookie Data Encryption

One of the projects I’m working on is to log how long a visitor is on our website viewing a video and use that to enable a service. Yeah… who would have thought?

Anyway this post is both about the time logging approach utilized as well as cookie encryption. Let me give you a little background before we dive in.

The Conditions
The goal is a little tricky because of the setup. I’m writing it in Classic ASP (not exactly my favorite language). Also, normally one would think of storing time-logging data in a SQL server, but I didn’t have access to one. Another tricky part about logging the time a visitor spends viewing the video is that the video can be upwards of an hour long.

The Challenge
Using the session to track the time data isn’t a full solution here, since the session will time out unless I increase the timeout (which right now isn’t an option). If I am to use sessions as a way to keep track of the time spent on the site, I need to continually refresh/access the page to update the time-logging data as well as to keep the session alive.

The Chosen Approach
The idea was to make an AJAX call asynchronously “in the background” every 10 minutes while the user is watching the video. This would keep the session alive (which, I’m finding out, it does not for some reason), and would allow me to track with 10-minute accuracy whether the user was on our video page or not (hopefully watching the video). The advantage here is that if the user moves away from the page, the updates stop, effectively defining when the user began and finished viewing the video.

What Actually Happened…
As much as I hate to say it, I have to admit this system will at least partly depend on JavaScript, and I can’t think of any other practical way to avoid this. I find comfort in the fact that we can thank the major adoption of .NET-driven applications for JavaScript becoming enough of a necessity that a developer can count on a fairly large user base supporting it.

Anyway, not having access to a database, I chose to store the time-logging information in a cookie. I decided I would store the session ID and the time when the request was made. Of course, this could lead to easy tampering, so I chose to encrypt it.

Encryption
The encryption algorithm I used was RC4. I found it easy to implement, thanks to 4guysfromrolla.com. After reading up on it a little, I also felt it was rather secure for this purpose, provided a strong key is used.

Little did I know the headaches I would run into. I would encrypt the data and store it in the cookie, but when I read it back, some of the data would be missing. It was very bizarre. I tried all sorts of ways to make sure I was reading/writing the cookie data correctly.

Before long I realized it probably had to do with the way cookies are stored and exactly which characters are and are not legal in them. I tried URL-encoding the values. This bloated the data, and I still had data missing.

I threw the whole URL-encoding idea out the window (for the moment) and tried filtering for illegal characters myself. I did some research, but all I could find was that <>:= were the illegal characters and nothing else. So I wrote some filters for that. Still missing data. I tried URL encoding combined with the filtering. Still missing data.

Finally I decided I would just write a function to go through each character in a string of encrypted data, and find any characters outside of a safe US-ASCII code range and escape them through a custom method.

As I was finishing up, I ran into the Escape() and Unescape() functions in Classic ASP, so I gave them a shot before even trying my custom functions. They worked wonderfully. On the other hand, my idea of grouping time data by session ID and then using that to calculate the time spent in each session, in order to figure out a total of time spent on the site, didn’t work out so well, so I am choosing another method. I might write about that later.

Lessons Learned:
When writing binary[-like] data to cookies, stick to a known safe subset of US-ASCII characters. I still don’t know exactly which character caused the problem, but I’m thinking it was some odd escape character or something along those lines.

At the time of this post I’m still in the process of writing the application, so I’m open to any ideas or suggestions on how anyone else would do it.

-Jheatt
